A self-driving car might require particular kinds of maintenance, or might be permitted to operate only in certain zones. So it could be that the software was not responsible for a crash, but the owner is.
Or take this difficult scenario. Say a robotic car swerves to avoid a deer but, in doing so, crashes into another car. If the car did what a good human driver would have done, should Google be responsible for damages in this situation?
Full story at www.theatlantic.com.
In related news, Google has just announced they are building a “friendly” driverless car.