The public is taking unreasonable risks with Tesla’s Full Self-Driving beta test, and those risks could be deadly. For those who don’t know, the Full Self-Driving beta is a software rollout that lets select users engage Autopilot on non-highway streets.
FSD beta rollout happening tonight. Will be extremely slow & cautious, as it should.
— Elon Musk (@elonmusk) October 20, 2020
Real People Will Now Test Tesla's Self-Driving Beta Update
Customers in Tesla’s Early Access Program will receive the update, which effectively gives them access to the Autopilot system on city streets. As The Verge stated, “the early access program is used as a testing platform to help iron out software bugs.”
Iron out software bugs. On city streets. With real people who never consented to this experiment. And who is running it? Ordinary, untrained Tesla customers. If that doesn’t sound like a science project gone mad, watch the video below.
Take 41 seconds and watch this video.
— TC (@TESLAcharts) October 25, 2020
And if you still don’t think this is crazy, check out this video, in which a beta-test car stops in the middle of an intersection, causing the car behind it to honk.
"Full Self Driving" beta test car stops in the middle of an intersection, causing the car behind it to honk. Then the #Tesla cuts across a solid white line to make the turn.
(Video of Kristen Yamamoto) cc: @Tweetermeyer $TSLA $TSLAQ pic.twitter.com/jOB9O26nrC
— Greta Musk (@GretaMusk) October 25, 2020
Look, we are not haters of technology. To be frank, a self-driving car would be a wonderful technology and a complete game changer. But rolling out a beta software test on non-consenting drivers, with random untrained customers running the experiment, is just reckless.
As Zero Hedge reported in March, in an article titled “Attention NHTSA: Second Tesla In A Week Has Plowed Through Storefront In Coachella Valley,” Tesla’s Autopilot does not prevent accidents. It is only a matter of time before a Tesla in Autopilot mode crashes and kills someone. Show me high-tech software and I will show you a bug that will infect it.
Brian W. Kernighan put it perfectly: “Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”
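Kernighan’s point is easy to illustrate. Here is a toy stopping-distance check, purely hypothetical and in no way Tesla’s actual code (every name and number below is invented for illustration). The math is textbook-correct, yet a caller who passes speed in km/h instead of m/s gets phantom braking of exactly the kind shown in the video above:

```python
def should_stop(distance_m: float, speed_mps: float,
                reaction_s: float = 1.0, decel_mps2: float = 6.0) -> bool:
    """Toy example only -- NOT real autonomous-driving code.
    Brake if the obstacle is within stopping distance:
    stopping distance = reaction distance + braking distance v^2 / (2a)."""
    stopping_m = speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)
    return distance_m <= stopping_m

# Correct usage: 36 km/h is 10 m/s; an obstacle 40 m ahead is
# well outside the ~18 m stopping distance, so keep driving.
print(should_stop(40.0, 10.0))  # → False

# Unit bug: the same speed passed as 36 (km/h) instead of 10 (m/s).
# The formula is "right", the units are wrong, and the car slams
# the brakes for an obstacle 40 m away -- a phantom stop.
print(should_stop(40.0, 36.0))  # → True
```

Nothing in the function looks clever or broken in isolation; the bug lives in the seam between caller and callee, which is precisely the kind of defect that only surfaces in the field.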
Tesla’s full self-driving beta test? It’s a mad science experiment gone wrong.
This article first appeared on The Stonk Market