A Florida judge found “reasonable evidence” that Tesla Chief Executive Elon Musk and other managers knew the automaker’s vehicles had a defective Autopilot system but still allowed the cars to be driven unsafely, according to a ruling.

    • logicbomb@lemmy.world · 16 points · 1 year ago

      I purchased an FSD Model 3 a long time ago. For the life of me, I can’t understand why it is still so bad today. How hard is it to stay in the left lane when you’re just about to take a left?

      However, that being said, it’s just a tool. The first time I drove (other) cars with cruise control or GPS, I knew I didn’t want to drive without those features anymore, even if it was years before they started working very well. I feel the same thing about FSD, even in its current poor state.

      I know that it’s been advertised to be something it’s not, and I think Tesla needs to lose a lot of lawsuits over it, but I need this technology now.

      Just as there were people who wrecked when cruise control and GPS came out because they relied on them too much, the same thing is happening with the new technology in a Tesla. There is no autonomous driving mode. The driver is supposed to be in control and paying attention the entire time.

      • Traister101@lemmy.today · 9 points · edited · 1 year ago

        Tesla advertised Full Self Driving (it's literally what FSD stands for), but in reality it was somewhat better driver assistance. Why "was"? Well, they now rely almost exclusively on normal fucking cameras, which is why their depth perception is so spotty. They don't have real depth perception anymore; the hardware for it has been removed.

        • logicbomb@lemmy.world · 2 points · 1 year ago

          It’s still somewhat better driver assistance. Like I said, it was advertised to be something it’s not. My car has the radar, and I can tell you from personal experience that it’s much better at driving assist now without the radar than it was with the radar. That may be due to other improvements, but the point stands.

          When you use the “FSD” mode, you free up a ton of your attention. You don’t have to concentrate on staying in the lane. Lots of cars have lane assist now, and it’s definitely the number one feature on the Tesla, as well. You don’t have to focus as much on speed. You rarely have to think about navigation. It does the lane changes for you, and it does the turning for you. You can glance at the display, and it very reliably shows you where all of the cars are near you.

          You get to reclaim all of that attention and with that, you can be better aware of what other cars are doing. I was already a very safe driver, the sort who focused on defensive driving, before I got FSD, and FSD has only made my drive even safer. My biggest complaint is that it makes some really stupid lane change decisions, which I can simply cancel. Of course, that’s after it has turned on the signal, which feels embarrassing, although probably nobody else cares. It also has some issues when there are multiple turn lanes. It likes to choose the stupidest turn lane every time.

          You use the car. You learn the quirks of the current software, and then you correct when it does something wrong. That’s it. It’s game changing. It’s not as game changing as true FSD, but it’s huge. It doesn’t matter if it only uses cameras. We drive with only our eyes. Could it be better if they also used lidar? Probably, although AI famously can have worse results sometimes with more inputs. But it’s the other things that are more important.

  • AutoTL;DR@lemmings.world (bot) · 14 points · 1 year ago

    This is the best summary I could come up with:


    Nov 21 (Reuters) - A Florida judge found “reasonable evidence” that Tesla Chief Executive Elon Musk and other managers knew the automaker’s vehicles had a defective Autopilot system but still allowed the cars to be driven unsafely, according to a ruling.

    Judge Reid Scott, in the Circuit Court for Palm Beach County, ruled last week that the plaintiff in a lawsuit over a fatal crash could proceed to trial and bring punitive damages claims against Tesla for intentional misconduct and gross negligence.

    Bryant Walker Smith, a University of South Carolina law professor, called the judge’s summary of the evidence significant because it suggests “alarming inconsistencies” between what Tesla knew internally, and what it was saying in its marketing.

    “This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Smith said.

    The judge said the accident is “eerily similar” to a 2016 fatal crash involving Joshua Brown in which the Autopilot system failed to detect crossing trucks, leading vehicles to go underneath a tractor trailer at high speeds.

    “It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.


    The original article contains 510 words, the summary contains 223 words. Saved 56%. I’m a bot and I’m open source!

  • orcrist@lemm.ee · 11 points · 1 year ago

    Yes yes, we all knew that. But it’s good that the judge decided that his court should match reality.

  • Siegfried@lemmy.world · 7 points · 1 year ago

    I recall Musk saying something along the lines of “we have to accept that self-driving cars will also cause deaths” like 10 years ago or so.

      • grayman@lemmy.world · 2 points · 1 year ago

      Context is important. If we focused on perfect safety like this for other forms of transportation, we’d never fly, take trains, drive cars, ride on boats, or anything else. Self-driving has a significantly better safety record than human drivers, and that context is critical. It doesn’t mean we stop striving to be safer; it just means it’s reasonable to use it in many circumstances.