With a few thousand miles behind them, Google Inc (NASDAQ:GOOG) (NASDAQ:GOOGL)’s self-driving cars have a perfect driving record. But their mere presence on the road raises the question: who’s responsible if one of them makes a mistake? Google says it’s happy to pay any tickets its cars might get, but with automation becoming increasingly affordable, we can’t expect every company to be so benevolent (or as highly profitable and brand-conscious, as the case may be).
“A person, if it is defined as a human person and not a corporation, that’s what we’re really wondering about,” said Ron Medford, safety director for Google’s self-driving car program and a former deputy administrator of the National Highway Traffic Safety Administration, AllGov reports. “Even in this definition… does a person mean a human individual or can it mean something more?”
Google’s self-driving cars: California law requires a person in the driver’s seat, even if driving is impossible
California law requires a person to be in the driver’s seat at all times, which made perfect sense in the very recent past but seems odd when the person in that seat has little or no control over the vehicle. If law enforcement follows Google’s lead and decides that the operator doesn’t have to be an actual human, does that mean Google has to get a driver’s license? People have to prove that they are safe behind the wheel; it seems fair that a self-driving car should have to undergo a similar battery of tests. No doubt Google has spent a lot of time and money making its cars safe, but that doesn’t mean other companies will follow suit.
The law needs to catch up with ubiquitous automation
Manufacturers can be liable if they sell, or fail to recall, cars with safety defects that they are aware of (ask General Motors Company (NYSE:GM) for details), and the same should be true for self-driving cars. But that still doesn’t address the issue of who pays for running a red light, or who is responsible for vehicular manslaughter in the case of a grisly accident. The most straightforward option is to make the person on hand responsible for what’s going on and require manufacturers to give drivers the option of taking control if necessary, treating the self-driving algorithms like high-end cruise control. But there’s no telling which way the law will evolve.
If all this seems like navel-gazing, bear in mind that five years ago no one would have expected a search company to be building driverless cars, or a restaurant in India to deliver pizza via drone. Robots are here, and the law needs to catch up.