New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • @hoodlem@hoodlem.me · 164 points · 11 months ago

    In fact, by the time the crash happens, it’s alerted the driver to pay more attention no less than 150 times over the course of about 45 minutes. Nevertheless, the system didn’t recognize a lack of engagement to the point that it shut down Autopilot

    I blame the driver, but if the above is true there was a problem with the Tesla as well. The Tesla is intended to disengage and disable autopilot for the remainder of the drive after a small number of ignored alerts. If the car didn’t do that, there’s a bug in the Tesla software.

    I think it’s more likely the driver used a trick to make the car think he was engaged when he was not. You can do things like put a water bottle wedged in the steering wheel to make the car think you have tugged on the steering wheel to prove you are engaged. (Don’t ask me how I know)

    • @RushingSquirrel@lemm.ee · 34 points · 11 months ago

      After 3 alerts, it’s off until you park. There are visual cues that precede the alert, though, and these don’t count. I don’t recall how many there are or how long they last, but it starts with a message asking you to put your hands on the wheel, then a blue line at the top, then the line starts pulsing, then you get an audio alert, which is the first strike. Three strikes during the same drive and you need to park before using Autopilot again.
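      The strike escalation described above can be sketched as a tiny state machine. This is purely illustrative: the class, method names, and the way the three-strike threshold is applied here are assumptions based on the comment, not Tesla’s actual code.

```python
class AttentionMonitor:
    """Illustrative sketch of the strike escalation described above.

    Names and behavior are assumptions based on the comment,
    not Tesla's actual implementation.
    """
    MAX_STRIKES = 3  # audible alerts allowed per drive

    def __init__(self):
        self.strikes = 0
        self.locked_out = False  # True -> Autopilot unavailable until parked

    def on_ignored_audio_alert(self):
        # Visual cues (message, blue line, pulsing line) precede the
        # audio alert and do not count; only the audio alert is a strike.
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            self.locked_out = True  # off until the car is parked

    def on_parked(self):
        # Parking resets the count and re-enables the system.
        self.strikes = 0
        self.locked_out = False
```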

  • daikiki · 93 points · 11 months ago

    I have a lot of trouble understanding how the NTSB (or whoever’s ostensibly in charge of vetting tech like this) is allowing these not-quite self driving cars on the road. The technology doesn’t seem mature enough to be safe yet, and as far as I can tell, nobody seems to have the authority or be willing to use that authority to make manufacturers step back until they can prove their systems can be integrated safely into traffic.

    • @SpaceNoodle@lemmy.world · 85 points · 11 months ago

      It’s just ADAS - essentially fancy cruise control. There are a number of autonomous vehicle companies who are carefully and successfully developing real self-driving technology, and Tesla should be censured and forbidden from labeling their assistance software as “full self-driving.” It’s damaging the real industry.

    • @RushingSquirrel@lemm.ee · 37 points · 11 months ago

      That’s similar to cruise control. Cruise control can be dangerous because someone could fall asleep (not having to manage your speed invites drowsiness) and the car wouldn’t slow down.

      In my opinion, with all of these options it’s the driver’s responsibility to know their own limits and to understand that the tool is just a tool: you are responsible for making sure your driving is safe for others. Tesla Autopilot adds a ton of safety features that avoid a lot of collisions caused by lapses in attention, by sleepiness, and by other drivers’ mistakes. But it’s still just a tool, and the driver is responsible for their own car and driving.

      • daikiki · 37 points · 11 months ago

        The difference is that cruise control will maintain your speed, but ‘autopilot’ may avoid or slow down for obstacles. Maybe it avoids obstacles 90% of the time or 99% of the time. It apparently avoids obstacles enough that people can get lulled into a false sense of security, but once in a while it slams into the back of a stationary vehicle at highway speed.

        It’s easy to say it’s the driver’s responsibility, and ultimately it is, of course. But in practice, a system that works almost all of the time yet occasionally kills somebody is very dangerous indeed, and saying it’s all the driver’s fault isn’t really realistic or fair.

        • @abhibeckert@lemmy.world · 20 points · edited · 11 months ago

          A lot of modern cruise control systems will match the speed of the car in front of you and stop if they stop. They’ll also keep the car in the current lane. And even without cruise control, most modern cars will stop if a pedestrian steps onto the road.

          It’s frustrating that Tesla’s system can’t detect a stationary police car in the middle of the road… but at the same time apparently that’s quite a difficult thing to do and it’s not unique to Tesla.

          It’s honestly not too much to ask a driver to step on the brakes if there’s a cop car stopped on the road.

          • @SpaceNoodle@lemmy.world · 3 points · edited · 11 months ago

            It’s actually not that hard to do, but Tesla is not willing to spend the necessary time and resources to solve the hard problems.

        • @ilickfrogs@lemmy.world · 12 points · edited · 11 months ago

          Actually it’s absolutely realistic and fair. I don’t like Musk, or Tesla for that matter. But they make it pretty damn clear that you’re 100% responsible for the vehicle when using that feature. Anyone who assumes they don’t need to pay attention is a moron and should be held responsible. If a 747 autopilot system starts telling the pilot to take control of the plane and they don’t… we wouldn’t blame the manufacturer, we’d blame the shitty pilot that didn’t do their job.

          • @ShittyBeatlesFCPres@lemmy.world · 24 points · 11 months ago

            I can’t wait to get smacked by a Tesla beta tester and have everyone debate whether the car or the driver is responsible for my innards being spread across 4 lanes. Progress!

          • daikiki · 9 points · 11 months ago

            If the driver gets lulled into a false sense of security by a convenience system like this and the automation fails, it’s one thing to blame the driver (and that may or may not be fair, depending on how much trust you place in the average driver’s competence), but the hypothetical victim is still dead, and whom we decide to blame won’t make one iota of difference to that.

  • @zerbey@lemmy.world · 79 points · 11 months ago

    That’s 150 more warnings than a regular car would give; ultimately it’s the driver’s fault.

      • @CmdrShepard@lemmy.one · 2 points · 11 months ago

        Do we have any evidence from the driver stating that he didn’t realize he was using a glorified cruise control similar to autopilot on an airplane?

        • @wearling0600@lemmy.world · 4 points · 11 months ago

          Where I live you can right now go to Tesla’s website and buy a car with “Full Self-Driving Capability” with a small print that includes the disclaimer that it doesn’t make the vehicle autonomous, for whatever that’s worth…

            • @wearling0600@lemmy.world · 4 points · 11 months ago

              Ah I see, now that you’ve been proven wrong you’re pretending you asked a different question.

              You admit that Tesla advertises a “Full Self-Driving Capability” feature, which is basically what the person you said “source or stfu” to was claiming.

              Whether or not the feature was used in this instance is not what we’re discussing here.

              We can have this discussion if you’re up for having it in good faith. I think both of these are true: people are overall terrible at the activity of driving, so more driver aids are better on the whole; but current driver aids are very limited, and drivers are not necessarily great at understanding and working within those limits.

              They’re not the only ones, but Tesla is really the worst offender at overstating their cars’ capabilities and setting people up for failure - like in this case.

      • @sugartits@lemmy.world · 14 points · 11 months ago

        The driver was responding. If he didn’t respond the car would have stopped.

        If this was a normal car he probably would have just crashed earlier.

  • @CaptainProton@lemmy.world · 74 points · edited · 11 months ago

    This is stupid. Teslas can park themselves; they’re not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

    That being said, the driver knew this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.

    It is, however, the regulator’s job to regulate car safety (in the US that’s NHTSA; the NTSB only investigates), so if the agency doesn’t already have the authority, Congress needs to grant it the power to define what AI behavior is acceptable and what safeguards are required against misbehaving AI.

    • @socsa@lemmy.ml · 10 points · 11 months ago

      There’s no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement and I’ve experienced it first hand.

      • @doggle@lemmy.world · 3 points · 11 months ago

        The headline doesn’t state that the warnings were consecutive.

        Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?

        I’ll grant you, though, 150 warnings is still a little tough to believe…

      • @lapommedeterre@lemmy.world · 3 points · 11 months ago

        Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that’s in the article). It shows a good bit of the footage, too.

        Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.

      • @limelight79@lemm.ee · 7 points · 11 months ago

        I turned off the “lane assist” in our Mazda because it kept steering me back toward obstacles I was trying to avoid, like cyclists, oversized loads, potholes, etc. I don’t know why anyone thought that was a good idea.

        But try buying a car without those features now…sigh.

          • @doggle@lemmy.world · 3 points · 11 months ago

            If you’re swerving to avoid a sudden obstacle you reasonably may not have the foresight or reaction to flip on a signal. The car still needs to not force you back on collision course.

            • @Grabthar@lemmy.world · 1 point · 11 months ago

              That’s a good point, and is probably why they designed it so that if you swerve hard, lane assist shuts off. It only nudges you back to the middle of the lane if you are gently drifting to a side, so it only works in situations where your turn signal can be used to avoid it. Or you can just disable it if you drive a BMW or otherwise can’t use turn signals.

          • @limelight79@lemm.ee · 1 point · 11 months ago

            Even moving over slightly in the lane to avoid a pothole triggers it; it doesn’t seem like a turn signal should be necessary in that situation. Instead the situation seems to be that I’m seeing the pothole and altering the car’s course gently to avoid it, and I get close to the line and it freaks out.

            I guess if I drove right up to the obstacle and then swerved, it wouldn’t do it… but I was always taught that swerving was a last resort and that it’s best to drive as smoothly as possible. (This was my dad’s argument, and I said, “Uh, SOMEONE taught me not to swerve unless it was necessary…”, meaning him. He laughed.)

    • @chris2112@lemmy.world · 6 points · 11 months ago

      The driver is responsible for this accident, but imo Tesla should still be liable for all the shady and outright misleading advertising around their so-called “self driving”. Compare Tesla’s marketing to GM’s or Hyundai’s, both of which have essentially reached feature parity with Tesla’s system, and you’ll see a big difference.

    • @dzire187@feddit.de · 5 points · 11 months ago

      It should be pulling over and putting the flashers on if a driver is unresponsive.

      Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.

    • @doggle@lemmy.world · 3 points · 11 months ago

      Sounds like the injured officers are suing. It’s a civil case not criminal, so I’m not sure how much the court would actually be asked to legislate. I’d be interested to hear their arguments, though I’m sure part of their reasoning for suing Tesla over the driver is they have more money.

  • Jeena · 57 points · 11 months ago

    So if the guy behind the wheel had died and couldn’t react to the alerts, the car couldn’t decide to just stop instead of crashing into a police car?

      • @Wats0ns@programming.dev · 8 points · 11 months ago

        Isn’t that on purpose, though? Like, “hey, if we’re not sure we can brake in time, just disengage so it’s not our responsibility anymore”?

        • iWidji · 3 points · edited · 11 months ago

          If we want to get really technical, NHTSA is requiring all new cars to have automatic emergency braking, so in this situation the car should slam on the brakes. Even if it can’t slow down fast enough to prevent a crash, it should slow down enough to minimize it.

          Is this particular Tesla covered by that rule? Probably not. But I think we can see why that approach is infinitely safer and more ethical than saying “good luck, control this car on your own or enjoy a 100 km/h crash otherwise”.

          • @tony@lemmy.hoyle.me.uk · 2 points · 11 months ago

            Tesla has AEB, but by the time something like that triggers, you’re reducing the severity of the crash, not eliminating it.

            It’s likely the car braked from 100 km/h but was still doing 50 when it hit… at those speeds a crash can be fatal whatever happens.
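            A back-of-envelope check of that scenario (all numbers here are assumptions for illustration, not data from this crash):

```python
# Rough kinematics for "braked from 100 km/h, still doing ~50 at impact".
# Every value below is an assumed round number, not crash data.
v0 = 100 / 3.6        # initial speed in m/s (100 km/h)
warning = 2.5         # seconds between detection and impact (from the article)
latency = 1.0         # seconds before full braking force builds (assumed)
decel = 0.9 * 9.81    # m/s^2, hard braking on dry asphalt (assumed)

braking_time = max(0.0, warning - latency)
v_impact = max(0.0, v0 - decel * braking_time)  # constant-deceleration model
print(f"impact speed ~ {v_impact * 3.6:.0f} km/h")  # ~52 km/h with these numbers
```

            Under these assumptions the car still hits at roughly 50 km/h, consistent with the comment’s estimate: with only a 2.5 s warning there simply isn’t enough distance to stop from highway speed.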

    • @pec@sh.itjust.works · 12 points · edited · 11 months ago

      He was reacting to the alerts, complying with them by simply touching the steering wheel. He did that 150 times during the 45-minute trip (not all of the trip was on Autopilot).

      So if the guy had died, the car would have disengaged Autopilot (I’m not sure exactly how this works).

      You can check the video in the article. It’s quite informative .

      Edit

      I saw another video, and it takes ~60 seconds after you take your hand off the steering wheel for the car to come safely to a full stop.

        • @Landmammals@lemmy.world · 2 points · 11 months ago

          Was he drunk? The article seems to use the fact that the car nagged him 150 times as evidence that he was impaired.

      • @tony@lemmy.hoyle.me.uk · 1 point · 11 months ago

        TBH if you’re not used to it, the steering wheel check can warn frequently. It checks for a small amount of torque on the wheel rather than for you actually holding it (there are no pressure sensors), and that catches people out because the prompt says to put your hands on the wheel… I could believe 150 times on a long journey.
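        The torque-based check described above boils down to a simple threshold test. A minimal sketch, assuming a made-up threshold value (Tesla’s actual tuning is not public):

```python
def hands_on_wheel(steering_torque_nm: float, threshold_nm: float = 0.3) -> bool:
    """Sketch of a torque-based hands-on check (threshold is a guess).

    The sensor measures steering torque, not grip, so a hand resting
    dead-centre applies ~0 Nm and reads as hands-off, while anything
    that tugs the wheel slightly (including a wedged water bottle)
    passes the check.
    """
    return abs(steering_torque_nm) >= threshold_nm
```

        This is why a relaxed but attentive driver can rack up warnings while a crude trick defeats the check entirely.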

  • @thatKamGuy@sh.itjust.works · 52 points · 11 months ago

    Driver is definitely the one ultimately at fault here, but how is it that Tesla doesn’t perform an emergency stop in this situation - but just barrels into an obstacle?

    Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!

    • @Obi@sopuli.xyz · 5 points · 11 months ago

      You’re completely right and I’ve never seen this for traffic stops in Europe, they’ll make you park somewhere safe, at the very worst, in the emergency lane, but even that is rare for traffic stops. The only times I see lanes blocked is when there’s been an accident/breakdown and then the first thing they do is bring massive light panels well ahead of the spot to make everyone clear the lane.

  • Jordan Lund · 46 points · 11 months ago

    Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.

          • Jordan Lund · 5 points · 11 months ago

            Autopilot doesn’t work that way, the drunk should have known that when he wasn’t drunk and not tried to use it that way.

            It’s like the old shaggy dog story about the guy driving a camper, setting the cruise control, then going into the back to make lunch.

            That’s not the fault of the cruise control.

  • @hark@lemmy.world · 39 points · 11 months ago

    Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?

  • @Md1501@lemmy.world · 35 points · 11 months ago

    You know what might work: program the car so that after the second unanswered alert, Autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, Autopilot is disabled for that car for a period of time.
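    The proposed policy maps cleanly onto a small escalation table. A sketch of the comment’s suggestion only, not any shipping system:

```python
def escalation_action(unanswered_alerts: int) -> str:
    """Sketch of the escalation policy proposed above (illustrative only)."""
    if unanswered_alerts <= 1:
        return "warn the driver again"
    if unanswered_alerts == 2:
        return "pull over, or reduce speed and turn on hazards"
    # third (or later) violation
    return "disable autopilot for this car for a period of time"
```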

    • @HalcyonReverb@midwest.social · 15 points · 11 months ago

      I drive a Ford Maverick that is equipped with adaptive cruise control, and if I were to get 3 “keep your hands on the wheel” notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn’t do something similar.

      • @tony@lemmy.hoyle.me.uk · 3 points · 11 months ago

        It does, and it did… he kept driving anyway. Drunk drivers FTW.

        I presume AEB kicked in, but all that can do is reduce the speed of impact… if you’re determined to kill yourself, there’s not much the car can do.

          • @CmdrShepard@lemmy.one · 1 point · 11 months ago

            The problem with this is what if the car thinks there’s a barrier in front of you but there isn’t? People are arguing that these systems are too intrusive while also arguing that they don’t go far enough to take control away from drivers.

            This situation happened because a drunk driver ran into police cars, something that has been happening for as long as cars have existed.

            • @Obi@sopuli.xyz · 1 point · 11 months ago

              That’s the issue with current “self driving” systems in a nutshell. We’re in this terrible middle ground right now where these features let careless drivers take their attention away, but not actually be able to control the vehicle safely. We should ban all that crap until actual self driving is viable.

              • @CmdrShepard@lemmy.one · 0 points · 11 months ago

                How does it become viable if you ban the technology? What we have now is advanced cruise control that protects drivers in some circumstances while having zero effect in others. Drivers were equally dumb and careless long before this technology existed. This new tech doesn’t make that aspect any worse. Banning it now just means more people will crash and more people will be injured.

                • @Obi@sopuli.xyz · 1 point · edited · 11 months ago

                  Here’s an article referencing a UK white paper that talks about the issues with level 2 and 3 autonomous vehicles.

                  https://www.tu-auto.com/adas-level-2-3-avs-are-hazards-experts-warn/

                  “With adaptive cruise control (ACC) for instance, it takes twice the amount of time to respond to a sudden braking event than it does when you are manually driving. Drivers may believe that ACC is safer but actually taking your foot off the accelerator pedal and letting the car make the decisions leads to lower workload and can mean drivers are unprepared for an unexpected event.”

                  University of Sussex object recognition researcher Dr Graham Hole was also questioned for the study and dubs Levels 2 and 3 “the worst of all worlds”. He says: “Human beings are rubbish at being vigilant – vigilance declines after about 20 minutes. With semi-autonomous you are reducing the driver to monitoring the system on the off-chance something goes wrong. Most of the time nothing goes wrong, leading the driver to have massive faith in the system in all conditions, which of course isn’t always the case.”

    • @Technoguyfication@lemmy.ml · 9 points · 11 months ago

      This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

        • @stealin@lemmy.world · 5 points · 11 months ago

          The whole idea with cars is that you don’t distract the driver from driving; a system that takes over driving is exactly that kind of distraction, so the idea of the system is flawed to begin with.

          • @Technoguyfication@lemmy.ml · 2 points · edited · 11 months ago

            I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

            Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

            Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

          • VinceUnderReview · 1 point · 11 months ago

            Screenshotting this because it’s so well put.

        • @Technoguyfication@lemmy.ml · 1 point · 11 months ago

          You’re misinterpreting what I said and conflating two separate scenarios in your 2nd statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

          The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

          They aren’t subtle either, after failing to touch the wheel for about 5-10 seconds it starts beeping loudly and flashing an icon on the screen.

          This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.

            • @Iheardyoubutsowhat@lemmy.world · 1 point · 11 months ago

              The driver was on Autopilot, and Autopilot is cruise control plus lane assist; it’s not FSD, so Tesla didn’t bring that “to the road”. The driver was drunk, and as with most Autopilot or FSD accidents… it’s user error.

              I’m still unaware of a proven FSD accident.

  • N3Cr0 · 31 points · 11 months ago

    Poor drunk impaired driver falling victim to autonomous driving… Hopefully that driver lost their license.

    • Cyber Yuki · 7 points · 11 months ago

      That doesn’t solve the problem of Autopilot not making the right choices. What if the driver wasn’t drunk but had a heart attack? What if someone had put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a sudden drop in blood glucose? What if they had a stroke?

      Furthermore, what if the driver got drunk BECAUSE the car’s AI was advertised as being able to drive for you? Think of the false publicity.

      If your AI can’t handle the simple case of a driver being unresponsive, that’s negligence on the company’s part.

      • @CmdrShepard@lemmy.one · 1 point · 11 months ago

        How could the company be negligent if someone gets drunk or has a heart attack and crashes their car? No company has a Level 5 autonomous vehicle where no human intervention is needed. Tesla is only Level 2. Mercedes has a Level 3 option (in extremely limited conditions). Waymo claims Level 4 but is geofenced.

  • @Snapz@lemmy.world · 28 points · 11 months ago

    This source keeps pushing Tesla propaganda. There’s always an angle trying to sell the idea that it wasn’t the Tesla’s fault.

  • @EndOfLine@lemmy.world · 25 points · edited · 11 months ago

    Officers injured at the scene are blaming and suing Tesla over the incident.

    And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

    I hope those officers got one of those “you don’t pay if we don’t win” lawyers. The responsibility ultimately resides with the driver and I’m not seeing them getting any money from Tesla.

    • @friendlymessage@feddit.de · 4 points · 11 months ago

      Well, in the end it’s up to whether Tesla’s ADAS is compliant with laws and regulations. If there really were 150 warnings by the ADAS without it disengaging, this might be an indicator of faulty software and therefore Tesla being at least partially at fault. It goes without saying that the driver is mostly to blame but an ADAS shouldn’t just keep on driving when it senses that the driver is incapacitated.

      • @EndOfLine@lemmy.world · 1 point · 11 months ago

        Also from the article:

        Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash. Autopilot also slows the car down before disengaging altogether.

  • @redcalcium@lemmy.institute · 25 points · edited · 11 months ago

    Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.

    Is the video slowed down? If you pause the video 2.5 s before the crash, the stopped police car already seems very close. An awake human driver would have recognized the stopped police car from much farther away than that.
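    Dividing the article’s two figures gives the implied average closing speed at detection. This is just arithmetic on the reported numbers; it assumes no braking over that interval:

```python
# Implied average closing speed from the article's figures.
distance_m = 37 * 0.9144   # 37 yards ~ 33.8 m
time_s = 2.5               # seconds between detection and the crash
closing_mps = distance_m / time_s
print(f"closing speed ~ {closing_mps * 3.6:.0f} km/h "
      f"({closing_mps / 0.44704:.0f} mph)")
```

    That works out to roughly 49 km/h (about 30 mph) of closing speed, which would indeed look very close on video; it suggests either the car had already slowed before detection or the camera framing makes the distance look shorter than it was.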

    • @Thetimefarm@lemm.ee · 10 points · 11 months ago

      I find that it can be hard to tell when a car ahead is stopped; maybe the vision system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I’m not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.
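      A radar can indeed tell something is stationary from far away; the catch, commonly cited for older ACC designs, is that radar sees stationary objects everywhere (signs, bridges, parked cars), so classic target filters drop returns whose ground speed is near zero. A sketch of that filtering logic, with the threshold and names as assumptions:

```python
def keep_radar_target(closing_rate_mps: float, ego_speed_mps: float,
                      min_target_speed_mps: float = 2.0) -> bool:
    """Sketch of classic radar ACC target filtering (assumed numbers).

    For a target directly ahead, closing rate ~ ego speed - target speed,
    so a stationary object closes at roughly the ego speed. Dropping such
    returns avoids phantom braking for signs and overpasses, but it also
    drops a stopped car in your lane.
    """
    target_speed = ego_speed_mps - closing_rate_mps
    return target_speed > min_target_speed_mps
```

      With this kind of filter, a car stopped in-lane is indistinguishable from roadside clutter until other cues (camera classification, lane position) are fused in.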