Google’s self-driving cars are getting into accidents, but the Internet firm claims those accidents are the fault of human drivers. Still, Google is making efforts to teach its cars to behave more like humans. A professor at the Massachusetts Institute of Technology (MIT) questions Google’s vision of self-driving cars, calling the idea of cars that are entirely self-driving a silly one.
Complete automation may not improve humanity
David Mindell, in an interview with MIT News, said that not a single shred of evidence exists that proves humanity’s lot will improve with complete automation. “We need to rethink the notion of progress, not as progress toward full autonomy, but as progress toward trusted, transparent, reliable, safe autonomy that is fully interactive: The car does what I want it to do, and only when I want it to do it,” the MIT professor said.
Some people have adopted a utopian view that if all cars stick to the speed limit and behave generously toward one another, then there will be perfect order. Mindell, the author of Our Robots, Ourselves: Robotics And The Myth Of Autonomy, which was published Tuesday, said it is a very peculiar dream. Self-driving cars may reduce the workload on overwrought humans, but the benefit will be limited, the professor said.
Google a little old-fashioned
Talking of Google, Mindell said the company is a little old-fashioned, and its idea of complete automation has a touch of the 20th century about it. “The notion of ceding control of something as fundamental to life as driving to a big, opaque corporation — people are not comfortable with that,” Mindell said.
Supporting his argument with an example, Mindell said that there was a time when unmanned, fully automated submersibles were thought to be perfect for underwater exploration. But it was later observed that they needed humans to guide and control them. When asked about NASA’s missions to the moon, Mindell said that it was initially thought automation would take care of everything, but astronauts still had to make vital inputs into the steering of the spacecraft.
Commercial air travel is the most common example. According to Mindell, many highly technical systems exist, but they are not perfect and need human help to hold them together.