Tesla appears to be rolling out its next Autopilot update, which includes a “local roads” Autosteer feature that reportedly controls the car at speeds up to 35 miles per hour. Some drivers are having such serious problems with the feature that they don’t feel it is safe to use. Others accept the firmware’s shortcomings because they understand that it is only a beta collecting data to learn.
The issues bring to mind the concerns raised by some around the time of the fatal Autopilot accident. Is Autopilot really ready for the average driver?
Complaints about “local roads” feature
In a post on the Tesla forum, someone who claims to be a “Cloud Architect” who is very familiar with technology and software compared the new Autopilot firmware to driving drunk.
“Was very excited today to get Firmware (17.5.36)…. until I tested it,” wrote user Chris.Skaling on Wednesday. “Imagine you go to the bar, you had 6 double shots, and threw back 5 or 6 beers. Then you decide to be an idiot and drive. That’s how the car drives with ‘Local road driving’ AP2. It’s basically not usable.”
He went on to list several problems he experienced with Tesla’s new Autopilot firmware, including trouble going through intersections, handling streets with exit lanes on the right, making 90-degree turns, driving next to bike lanes, and “overall just unpredictable” general lane keeping.
He also suggested that Tesla should add some warnings about the Autopilot update and that he would be fine with all the problems if warnings were added.
His main suggested warning was: “Generally Autosteer will only work going straight in a clearly marked center lane.”
Not all Tesla owners are upset
But Tesla fans are extremely loyal to the company, and several came to its defense, saying that Autopilot will gradually learn and improve over time as drivers use it.
“I too work in technology and have the same excitement about being part of the process as my soon-to-be-delivered Tesla S90S gradually becomes smarter,” wrote user dsopocy. “I too am concerned that Elon might have to have another sensor design refresh before he can truly reach Level 5 autonomy, but I don’t want to miss out on the journey.”
The user also stated that Skaling may be having “premature regrets.” Others debated whether Tesla’s cars have FSD (full self-driving) features, although Tesla announced the addition of FSD in an October blog post. Skaling himself never even claimed in the original post that his car has FSD. He said further down in the conversation that he was talking about the code in the firmware and how it should be used.
There seem to be more drivers supporting Tesla than Skaling in his forum thread, so it will be interesting to see how this new controversy shakes out.
Beta testing Autopilot
The key issue here seems to be whether Tesla owners understand that what they have is a car with artificial intelligence that is trying to learn how to drive on local roads, which is much more challenging than driving on highways.
Many early adopters no doubt understand this, but the company did have to button up Autopilot after a fatal accident that killed a Florida driver last year. Regulators cleared the company of any wrongdoing after their investigation placed the blame on the driver. Tesla did add more warnings and safety features forcing drivers to keep their hands on the steering wheel, but the new problems are sure to renew the debate about whether Tesla is using drivers as guinea pigs to build its Autopilot software.
Google used professionals to drive its autonomous vehicles rather than average consumers with no knowledge of how artificial intelligence works. However, Tesla should be able to gather data faster because it can canvass a wider variety of roads around the world.
The key is making sure that drivers understand what they’re really getting and whether they can handle testing a feature that’s in beta. This isn’t like beta testing iOS or Android updates. It’s beta testing a vehicle capable of doing real-world damage. Perhaps not every driver should be allowed to beta test Autopilot, no matter how cool and forward-thinking the tech will be when it’s fully developed.