Autopilot In Tesla Model X Blamed For A Crash

Image credit: Blomst / Pixabay

Tesla’s Autopilot system is now being directly blamed for a serious accident involving a Model X. We’ve just learned of a third accident in which Autopilot was said to have been engaged, and this time, the driver of the Model X is pointing the finger squarely at the system.


Tesla’s Autopilot system was already being investigated by the National Highway Traffic Safety Administration following a fatal accident in Florida which occurred while Autopilot was engaged. Tesla has also been doing its best to downplay another accident involving a Model S in which Autopilot may have been engaged, saying it has seen nothing to prove that Autopilot was to blame for either accident.


Tesla’s Autopilot blamed for Model X wreck

According to Electrek, this latest accident occurred on a state highway in Whitehall, Montana, when a Model X crashed through a guardrail and went off the road. Pictures appear to show heavy damage to the front passenger side, with part of the hood and the wheel torn off, although both the driver and passenger escaped without serious injury.

A friend of the driver posted pictures of the wrecked car on the Tesla Motors Club forum and described the accident. The person said the Model X was in Autopilot mode and traveling at a speed of 56 to 60 miles per hour when it drove off the road and smashed into the guardrail, which was made of wooden posts.

Is Tesla too early with its Autopilot system?

Based on the forum post, it sounds like the driver claims he was using Autopilot on a road that doesn’t have a center divider. This could be a problem for the driver because, according to Electrek, Tesla doesn’t recommend using Autopilot’s Autosteer feature on roads without a center divider. Additionally, Tesla’s 7.1 software update limits the speed at which Autosteer will drive on such roads to the road’s speed limit plus 5 miles per hour.
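For a rough sense of what that restriction amounts to, here is a minimal sketch in Python. This is not Tesla’s actual firmware; the function name, its arguments, and the divided-highway behavior are assumptions made purely for illustration.

```python
# Hypothetical sketch of the Autosteer speed restriction described above.
# NOT Tesla's code: the names and the divided-highway behavior are invented.

def autosteer_speed_cap(posted_limit_mph: float, has_center_divider: bool) -> float:
    """Return the maximum speed Autosteer would hold under the 7.1 update
    as described in the article: on roads without a center divider, the
    cap is the posted limit plus 5 mph."""
    if has_center_divider:
        # Divided highways: assume no restricted cap applies in this sketch.
        return float("inf")
    return posted_limit_mph + 5.0

# A 55 mph undivided state highway would cap Autosteer at 60 mph, which is
# consistent with the 56-60 mph range reported by the driver's friend.
print(autosteer_speed_cap(55, has_center_divider=False))  # 60.0
```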

Tesla advises drivers to keep their hands on the wheel at all times when using the Autopilot feature in their Model S or Model X, but it sounds like drivers are disregarding this advice. That comes as no surprise given all the YouTube videos drivers have shared of themselves doing things they shouldn’t be doing while Autopilot is on, such as shaving and eating. Clearly the world isn’t ready yet for self-driving cars.


Michelle Jones is editor-in-chief for ValueWalk.com and has been with the site since 2012. Previously, she was a television news producer for eight years. She produced the morning news programs for the NBC affiliates in Evansville, Indiana and Huntsville, Alabama and spent a short time at the CBS affiliate in Huntsville. She has experience as a writer and public relations expert for a wide variety of businesses. Email her at Mjones@valuewalk.com.


1 COMMENT

  1. It is becoming readily apparent that the average motorist, paired with a semi-autonomous vehicle, may be more dangerous than a drunk driver or a driver distracted by a mobile phone or electronic entertainment device. It is also irresponsible for Tesla to tout its steering-assist system as an “autopilot”. Autopilots are only relevant to aviation and marine navigation, where the separation between the craft and other craft or navigation hazards is large, and the time available for the human operator to take over from the autopilot is on the order of tens of seconds to minutes. On the road, the motorist needs to be capable of taking over in a fraction of a second, and that’s not possible if the motorist is distracted or asleep.

    Tesla’s autopilot system uses only optical cameras to help the autopilot software recognize the path the vehicle is to travel. If the painted lines on either side of the lane are faint or missing, the software can get confused in a way a human operator would not and fail to keep the car on the pavement. If there is insufficient contrast in the scene the cameras see ahead, the software may not discern the lane boundaries or large obstructions such as semitrailers, which may have contributed to the fatal crash of Joshua Brown in May 2016.
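To make the commenter’s point concrete, here is a minimal sketch of why low-contrast lane markings can defeat a purely vision-based lane keeper. This is not Tesla’s implementation; the contrast threshold and all helper names below are invented purely for illustration.

```python
import numpy as np

# Illustrative sketch only -- not Tesla's software. The idea: a vision-based
# lane keeper needs enough image contrast to find the painted lines; when the
# paint is faint or the scene is washed out (e.g., a white trailer against a
# bright sky), no confident lane estimate exists and control should go back
# to the driver.

CONTRAST_THRESHOLD = 30.0  # hypothetical minimum paint-vs-pavement contrast

def lane_confidence(road_image: np.ndarray) -> float:
    """Crude visibility proxy: contrast between the brightest pixels
    (likely painted lines) and the median pavement brightness."""
    paint = np.percentile(road_image, 99)  # brightest ~1% of pixels
    pavement = np.median(road_image)       # typical asphalt brightness
    return float(paint - pavement)

def steer_or_handoff(road_image: np.ndarray) -> str:
    """Decide whether the lane estimate is trustworthy enough to steer."""
    if lane_confidence(road_image) < CONTRAST_THRESHOLD:
        return "alert driver and disengage"
    return "track lane"

# Synthetic 8-bit grayscale frames (values 0-255):
faint = np.full((480, 640), 90.0)   # washed-out scene, no visible paint
painted = faint.copy()
painted[:, 300:310] = 200.0         # one bright painted line
print(steer_or_handoff(faint))      # alert driver and disengage
print(steer_or_handoff(painted))    # track lane
```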
