Autopilot “is not a self-driving technology and does not replace the driver,” Tesla said in response to a 2020 case filed in Florida. “The driver can and must still brake, accelerate and steer just as if the system is not engaged.”
Tesla’s terminology is so confusing. If “Autopilot” isn’t self-driving technology, does that mean it’s different from “Full Self Driving”? And if so, is “Full Self Driving” also not a self-driving technology?
I heard Elon Musk call it “assisted full self driving,” which doesn’t make any sense. LOL
“It’s called whatever will make the stock price go up.”
And avoid liability.
But you repeat yourself
it’s called “Full Self Driving (Supervised)” now
If it really was full self driving, it wouldn’t need to be supervised
The term autopilot comes from aviation, where the only kind of problem resolution an autopilot does is turning itself off.
Other than that, it just flies from checkpoint to checkpoint.
If only we could implement testing protocols similar to the aviation version to validate its safety!
A full NTSB investigation for every single crash? I’m all for it!
Depends on the autopilot. There are some that are as rudimentary as a “wing leveler.” They only have control of the ailerons and can level the wings and maybe make turns. Other systems have control of all three major control axes and are integrated with the navigation systems so they can do things like climb to an altitude and level off, turn to a heading, or even fly holds and approaches.
They do require training on the part of the pilot to use in flight.
Yeah, but even the best ones would happily crash into a mountain if the pilots don’t set their altimeters properly (and ignore the terrain warnings).
Are you sure that it is happy? Maybe the autopilot is really sad about its inability to not fly into a mountain.
Hard to say, it might depend on the plane model. I’ve heard that Boeing 777 autopilots are really snarky.
It’s marketing
You can’t call something Full Self Driving or Autopilot and then blame the driver. If you want to blame the driver, then call it driver assist.
Right! That’s why you have the FSD turn it over to the driver the moment a crash is unavoidable to make the driver liable.
“at the time of the crash, the driver was in full control”
(but not a couple seconds before)
I think Tesla should rename Auto Pilot to Darwin Award Mode.
And improve motorcycle detection as well as use LIDAR.
It’s not that Teslas are killing their owners. Teslas are killing first responders to road accidents, kids getting off buses and motorcyclists. We’re all exposed to the problems caused by Musk cutting out testing to save some money.
The customers pay extra in order to be beta testers. Best deal ever!
And pay more than my first two cars cost me combined at that
That’s just the price we have to pay for this wonderful capitalist system. Worth it!
I like calling it cruise control with extra fatalities.
Heck, even using the same sonar/radar/whatever normal cars use other than just cameras would be a huge improvement
This is the best summary I could come up with:
SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.
Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla.
Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly curb misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”
The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.
In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames.
Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal 2019 crash in Delray Beach, Fla., in which Jeremy Banner’s Tesla, operating in Autopilot, failed to register a semi truck crossing its path.
The original article contains 1,850 words, the summary contains 263 words. Saved 86%. I’m a bot and I’m open source!