• redcalcium@lemmy.institute
    64 points · 11 months ago

    Tesla fucking up traditional driving controls only makes sense if their self-driving system works so well that the driver has no need to touch the steering wheel except in rare cases. How good is Tesla’s full self driving these days?

    • Voroxpete@sh.itjust.works
      24 points · 11 months ago

      It regularly kills people. It can’t be used on a lot of road types (but people still use it on them, because Tesla makes no effort to prevent it). It’s still marketed as Full Self Driving despite the fact that Tesla has stated on the record that it is, and I quote, “Not capable of driving itself.”

      They’re trying to have their cake and eat it too. Any time it benefits them, they claim that their cars are completely autonomous vehicles powered by the most advanced AI. Any time they get their wrists slapped, they claim that it’s an assistive feature like cruise control that cannot and will not ever replace the human behind the wheel.

          • psud@lemmy.world
            3 points · edited · 11 months ago

            That’s all the people who were asleep on the highway or driving at very high speed in town.

            The recent versions don’t allow either of those behaviours, so those crashes aren’t happening anymore.

            Full self driving doesn’t do that.

            And the deaths I’m interested in are the ones caused by FSD, not by lane keeping and cruise control. Loads of brands do lane keeping and cruise control, and implement them no better than Tesla does.

            • NotMyOldRedditName@lemmy.world
              2 points · 11 months ago

              Just keep in mind that FSD is only as safe as they claim because it’s supervised.

              I would hope that even a reasonably working system would be better with a human vigilantly watching it than a human driving regularly.

              The system would have to be really bad to be worse than that.

            • Zink@programming.dev
              2 points · 11 months ago

              But does FSD change the logic for the lane keeping and the speed and distance control?

              Isn’t one of the features “Navigate on Autopilot”?

              • psud@lemmy.world
                2 points · 11 months ago

                It is quite different. Navigate on Autopilot is lane keeping, cruise control, and automatic highway exits. FSD tries to do all driving tasks: turns at stop signs and lights, keeping to the correct side on roads with no centre line, negotiating with oncoming traffic on narrow roads…

                • Zink@programming.dev
                  1 point · 11 months ago

                  Yeah, it adds more capabilities for sure. But if you are on a moderate- to high-speed road where Autopilot works fine, is the control logic any different?

                  Obviously there are various sorts of accidents that Autopilot would never get the chance to cause, like maybe turning right at an intersection and hitting a pedestrian. But do they act differently on a main road, where Teslas have done things like run into tractor trailers?

                  • psud@lemmy.world
                    2 points · edited · 11 months ago

                    The one that hit a tractor trailer was years ago. They are far better now; specifically, they see low-contrast stuff now, and that’s on Autopilot. The biggest difference to the user will be the ability to have hands off the controls.

                    It isn’t the same, though. FSD is written completely differently from Autopilot. It’s a different program.

                    Other accidents it won’t have on those roads include falling asleep and running off the road, or being surprised by someone braking ahead and running into them.

                    I’m sure it will be worse than humans around animals on the road. I wonder if it will see a wombat before it hits it.

        • Voroxpete@sh.itjust.works
          1 point · 11 months ago

          I don’t need to provide you with evidence that FSD has caused crashes. There’s plenty; if you can’t find it you’re not looking.

          As to your point about accident statistics, that’s responding to a different point than the one I was making. I didn’t say that it kills people more often than they kill themselves (through dangerous, inattentive or reckless driving). I just said that it regularly kills people. There’s potentially some hyperbole there; you can quibble over definitions of “regularly” if you want to be a pedant, I really don’t care.

          The point is that when it does go wrong, it often goes spectacularly wrong, such as this case where a Tesla plowed into a truck, or this thankfully low-speed example of a very confused Tesla driving into oncoming traffic.

          Could a human make these errors? Absolutely. But would you, as a human, want to trust yourself to a vehicle that is capable of making these kinds of errors? Are you happy with the idea of possibly dying because the machine you’re in made one critical error? Perhaps an error that you yourself would not have made under the same circumstances?

          A lot of people will answer “yes” to that, but for me personally any autopilot that requires constant supervision to make sure it doesn’t kill me is more of a negative than a positive. Even if you try to pay attention, automation blindness will inevitably kick in. And really what is even the point of self driving if you have to be paying attention? If it’s not freeing you up to focus on other things then it might as well not be there at all.

      • redcalcium@lemmy.institute
        4 points · 11 months ago

        No, I’m actually interested to know. Do most Tesla owners activate self-driving during their daily commute? Tesla doesn’t sell their vehicles here, so the only times I actually see a Tesla are at car shows.

        • corsicanguppy@lemmy.ca
          4 points · 11 months ago

          We’ve had news stories - and a friend’s coworker too - of people sleeping on the highway portion of their commute. The friend’s coworker did it daily for months, setting an alarm when it was probably going to be ‘street’ driving time so he’d wake up and be ready.

          • Critical_Insight@feddit.uk
            7 points · 11 months ago

            The friend’s coworker did it daily for months

            That’s both extremely stupid and irresponsible but also quite impressive on Tesla’s part.

          • redcalcium@lemmy.institute
            1 point · 11 months ago

            Being able to sleep (or not paying any attention to the road) is the entire reason I would get a self-driving car (assuming it’s safe to do so). But aren’t you required to keep your hands on the steering wheel when engaging full self driving? And I think the car has a camera to monitor driver attentiveness too. Can you really fall asleep during a commute like that?

      • Aasikki@sopuli.xyz
        4 points · 11 months ago

        They say it’s a beta, but “beta” would imply that it’s at least somewhat close to ready, which it clearly isn’t, even after being in “beta” for a long ass time.

    • AA5B@lemmy.world
      7 points · edited · 11 months ago

      Even if it were ready, what proportion of buyers spend the extra $12k to get self-driving?

      • NotMyOldRedditName@lemmy.world
        3 points · edited · 11 months ago

        If FSD was truly autonomous, or an excellent level 2 system?

        Truly autonomous, at $12k, it would have unlimited demand. Production would be the only constraint.

        Edit: Tesla might even prioritize sales with FSD or only make FSD cars at that point and rake in the profits.

      • psud@lemmy.world
        2 points · 11 months ago

        About 1 in 5, though recent changes to the price and the widening of the full self driving beta will have changed that since the stats were released in 2022.

    • psud@lemmy.world
      3 points · 11 months ago

      Tesla say it crashes hard enough to deploy an airbag about one fifth as often as human drivers do (once per 3,200,000 miles versus once per 600,000).

      So: safer than the average driver, but presumably less safe than a safe driver.
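      Those two figures do work out to roughly “one fifth” (a quick sanity check; the mileage numbers are just Tesla’s own claims from above):

      ```python
      # Sanity-check the "one fifth" claim from Tesla's quoted figures.
      miles_per_deployment_fsd = 3_200_000    # Tesla's claimed rate
      miles_per_deployment_human = 600_000    # claimed human baseline

      # Deployments per mile for each, expressed as the FSD/human ratio.
      ratio = miles_per_deployment_human / miles_per_deployment_fsd
      print(f"FSD deploys airbags about {ratio:.3f}x as often as humans")  # ~0.188, i.e. about one fifth
      ```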

      • jj4211@lemmy.world
        6 points · 11 months ago

        Be wary of cherry picked data.

        The average human driver has a car that’s five years older than the oldest Model 3. That means five more years of age on various safety equipment, collision avoidance systems that are five years more primitive, cars without stability control, etc.

        The Autopilot system only engages in ideal circumstances. Poor visibility, poorly marked roads, bad weather: all high-risk scenarios that Autopilot won’t touch, and that also cause a lot of human accidents.

        • psud@lemmy.world
          3 points · edited · 11 months ago

          I’m talking about the full self driving beta, not Autopilot. FSD works on bad roads, in car parks, and in any weather it can see in, including moderately heavy rain. It won’t work in heavy fog, but I won’t drive in that either. Autopilot has a long history of only working on highways, which upped its safety, but also a history of working hands-off and at any speed.

          Also note that the initial beta was only open to the safest, most responsible drivers according to Tesla data (Tesla have a lot of data on their drivers; many opt in to sharing everything in the hope of hurrying along better automation), so the cars were very well supervised.

          I’m really hanging out for insurance data once this system is out of beta.

          • jj4211@lemmy.world
            2 points · 11 months ago

            Even with FSD, I don’t think we have anywhere close to a comparable cohort.

            To expand on the safety equipment point, I wager the average driver with their 12.5-year-old car also doesn’t have regen braking. So while 99% of Teslas likely have near-pristine brake systems thanks to their age and regen braking, the average driver is more likely to experience “surprise, your brakes are out!”

            Also, particularly based on my time with rural folk who take cars into the woods, I doubt that FSD, however aggressive it may be, is as daring as some of the dubious human operators in that “average” cohort.

            Also, I’d wonder how Tesla would treat an FSD deactivation by driver intervention. If a crash is unavoidable and imminent, I’d imagine an aware driver might manage to yank the wheel in time to deactivate, but still get in an airbag deploying crash.

            There’s also some potential slush around “accidents that activate airbags”. Different models have different sensitivities.

            But all this falls second to a primary concern: never trust what amounts to marketing data from any company compared to something like NHTSA data.

            Would be interesting if someone could do the legwork to manage a “like for like” safety comparison, controlling for:
            - general age of the car
            - regenerative braking versus standard brakes
            - stability control, collision avoidance, automatic braking and so forth
            - like-for-like driving conditions
            - data for Teslas covering human operation, Autopilot and FSD, particularly where a human was operating but FSD had been on less than 10 seconds before impact
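            As a rough sketch of what that like-for-like legwork might look like (every record, number, and field name below is hypothetical, purely to illustrate the cohort-matching idea):

            ```python
            # Hypothetical sketch: compare crash rates only within matched cohorts.
            # All records here are invented for illustration, not real data.
            # Fields: (car_age_years, has_stability_control, control_mode, miles, crashes)
            records = [
                (2,  True,  "fsd",   900_000, 1),
                (3,  True,  "human", 800_000, 2),
                (12, False, "human", 700_000, 5),  # old car: excluded by the filter below
            ]

            def crashes_per_million_miles(mode, max_age=4, require_esc=True):
                """Crash rate for one control mode, restricted to comparably
                new cars with stability control (the 'like for like' filter)."""
                subset = [r for r in records
                          if r[2] == mode and r[0] <= max_age
                          and (r[1] or not require_esc)]
                miles = sum(r[3] for r in subset)
                crashes = sum(r[4] for r in subset)
                return 1e6 * crashes / miles if miles else float("nan")

            # This compares new, similarly equipped cars against each other,
            # instead of pitting new Teslas against the whole aging fleet.
            print(crashes_per_million_miles("fsd"), crashes_per_million_miles("human"))
            ```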

            • psud@lemmy.world
              1 point · 11 months ago

              “surprise, your brakes are out!”

              That really doesn’t happen from wear. Brakes only surprise-fail on long descents where the driver doesn’t use engine braking, and if they fail like that you still have the hand brake/e-brake.

              EVs of course use regen braking almost all the time in that situation, though they can’t when their battery is full. For example, my car expects to arrive at the coast at 20% battery; at the top of the coastal mountain range it’s at 15%, but by the beach it has regenerated back up to 20%.

              The rest I generally agree. We need better data, especially better data from someone other than Tesla.