• magnetosphere@fedia.io · 164 points · 5 months ago

    I’ve often wondered why the FTC allows it to be marketed as “Full Self-Driving”. That’s blatant false advertising.

    • reddig33@lemmy.world · 80 points · 5 months ago

      As is “autopilot”. There’s no automatic pilot. You’re still expected to keep your hands on the wheel and your eyes on the road.

      • halcyoncmdr@lemmy.world · 55 points · 5 months ago

        I am so sick and tired of this belief, because it’s clear people have no idea what autopilot on a plane actually does. They seem to assume it flies the plane and the pilot doesn’t do anything. Autopilot alone does not fly the damned plane by itself.

        “Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s essentially cruise control with lane-centering, since altitude isn’t an issue on a road.
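
        In code terms, a hold mode is just a feedback loop nudging the current value toward a setpoint. A toy sketch (not real avionics; the gains and limits here are made up):

```python
def hold_step(target, current, gain=0.1, max_step=5.0):
    """One proportional-control step toward a setpoint (heading,
    altitude, or speed), clamped to a maximum correction per step."""
    correction = gain * (target - current)
    correction = max(-max_step, min(max_step, correction))
    return current + correction

# Holding 5,000 ft starting from 4,900 ft converges over repeated steps:
alt = 4900.0
for _ in range(200):
    alt = hold_step(5000.0, alt)
```

        The same loop shape covers cruise control (speed) and lane-centering (lateral offset); only the sensor and the control surface differ.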

        There are more advanced systems on the market, available for smaller planes and in use on larger jets, that can do things like auto-takeoff, auto-land, and waypoint following without pilot input, but basic plain old autopilot doesn’t do any of that.

        That expanded capability is similar to how “Enhanced Autopilot” on a Tesla can do extra things like change lanes or follow highway exits on a navigated route, or how “Full Self-Driving” is supposed to follow road signs and lights. Those are additional functions, not part of “Autopilot,” and are differentiated with their own names.

        Autopilot, either on a plane or a Tesla, alone doesn’t do any of that extra shit. It is a very basic system.

        The average person misunderstanding what a word means doesn’t make it an incorrect name or description.

        • machinin@lemmy.world · 33 points · 5 months ago

          I say let Tesla market it as Autopilot if they pass similar regulatory safety frameworks as aviation autopilot functions.

        • Captain Aggravated@sh.itjust.works · 23 points · 5 months ago

          Flight instructor here.

          I’ve seen autopilot systems that have basically every level of complexity you can imagine. A lot of Cessna 172s were equipped with a single axis autopilot that can only control the ailerons and can only maintain wings level. Others have control of the elevators and can do things like altitude hold, or ascend/descend at a given rate. More modern ones have control of all three axes and integration with the attitude instruments, and can do things like climb to an altitude and level off, turn to a heading and stop, or even something like fly a holding pattern over a fix. They still often don’t have any control over the power plant, and small aircraft typically cannot land themselves, but there are autopilots installed in piston singles that can fly an approach to minimums.
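
          The tiers described above could be sketched as plain data (a hypothetical taxonomy; the field names are my own, not avionics terms):

```python
from dataclasses import dataclass

@dataclass
class AutopilotTier:
    """Rough capability levels for small-plane autopilots."""
    axes: int              # 1 = ailerons only, 2 = + elevator, 3 = all three
    wing_leveling: bool
    altitude_hold: bool
    flies_courses: bool    # headings, holding patterns, programmed routes
    controls_power: bool   # most small-plane units: no

single_axis = AutopilotTier(1, True, False, False, False)   # old Cessna 172 unit
three_axis = AutopilotTier(3, True, True, True, False)      # modern integrated unit
```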

          And that’s what’s available on piston singles; airline pilots seldom fly the aircraft by hand anymore.

        • reddig33@lemmy.world · 18 points · 5 months ago

          “But one reason that pilots will opt to turn the system on much sooner after taking off is if it’s stormy out or there is bad weather. During storms and heavy fog, pilots will often turn autopilot on as soon as possible.

          This is because the autopilot system can take over much of the flying while allowing the pilot to concentrate on other things, such as avoiding the storms as much as possible. Autopilot can also be extremely helpful when there is heavy fog and it’s difficult to see, since the system does not require eyesight like humans do.”

          Does that sound like something Tesla’s autopilot can do?

          https://www.skytough.com/post/when-do-pilots-turn-on-autopilot

          • FiskFisk33@startrek.website · 11 points · 5 months ago

            At SkyTough, we pride ourselves on ensuring our readers get the best, most helpful content that they’ll find anywhere on the web. To make sure we do this, our own experience and expertise is combined with the input from others in the industry. This way, we can provide as accurate of information as possible. With input from experts and pilots from all over, you’ll get the complete picture on when pilots turn autopilot on while flying!

            This is GPT.

            After that intro I don’t trust a single word of what that site has to say.

            If the writer didn’t bother to write the text, I hope they don’t expect me to bother to read it.

            • tyler@programming.dev · 1 point · 5 months ago

              Why in the world would you think that’s GPT? That’s not the normal style of GPT, and it’s definitely the style of normal corporate sites.

          • Captain Aggravated@sh.itjust.works · 7 points · 5 months ago

            Flight instructor here. The flying and driving environments are quite different, and what you need an “autodriver” to do is a bit different from an “autopilot.”

            In a plane, you have to worry a lot more about your attitude, aka which way is up. This is the first thing we practice in flight school with 0-hour students: just flying straight ahead and keeping the airplane upright. This can be a challenge in low-visibility environments such as fog or clouds, or even at night in some circumstances, and your inner ears are compulsive liars the second you leave the ground, so you rely on your instruments when you can’t see, especially gyroscopic instruments such as the attitude indicator. This is largely what an autopilot takes over from the human pilot, relieving them of that constant low-level task so they can concentrate on other things.

            Cars don’t have to worry about this so much; for normal highway driving any situation other than “all four wheels in contact with the road” is likely an unrecoverable emergency.

            Navigation in a plane means keeping track of your position in 3D space relative to features on the Earth’s surface. What airspace are you in, what features on the ground are you flying over, where is the airport, where’s that really tall TV tower that’s around here? Important for finding your way back to the airport, preventing flight into terrain or obstacles, and keeping out of legal trouble. This can be accomplished in a variety of ways, many of which can integrate with an autopilot. Modern glass cockpit systems with fully integrated avionics can automate the navigation process as well: you can program in a course, and an appropriately equipped airplane can fly that course by itself.

            Navigation for cars is two separate problems; there’s the big picture question of “which road am I on? Do I take the next right? Where’s my exit?” which is a task that requires varying levels of precision from “you’re within this two mile stretch of road” to “you’re ten feet from the intersection.” And there’s the small picture question of “are we centered in the traffic lane?” which can have a required precision of inches. These are two different processes.

            Anticollision, aka not crashing into other planes, is largely a procedural thing. We have certain best practices such as “eastbound traffic under IFR rules flies on the odd thousands, westbound traffic flies on the even thousands,” so that oncoming traffic should be a thousand feet above or below you, plus established traffic patterns and other standard or published routes of flight for high-traffic areas.

            Under VFR conditions, pilots are expected to see and avoid each other. Under IFR conditions, that’s what air traffic control is for, who use a variety of techniques to sequence traffic and make sure no one is in the same place at the same altitude at the same time, anything from carefully keeping track of who is where, to radar systems, and increasingly a thing called ADS-B. There are also systems such as TCAS, which are aircraft-carried traffic detection electronics. Airplanes are kept fairly far apart via careful sequencing.

            There’s also not all that much else up there; not many pedestrians or cyclists thousands of feet in the air. Wildlife and such can be a hazard, but mostly during the departure and arrival phases of flight, while relatively low. This is largely a human task; autopilots don’t respond to air traffic control, and many don’t integrate with TCAS or ADS-B. This is the pilot’s job.
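
            The hemispheric altitude rule above can be sketched as a tiny function (a simplified US-style IFR version, ignoring minimum en-route altitudes, flight levels, and non-US rules):

```python
def ifr_cruise_altitudes(magnetic_course_deg, ceiling_ft=17000):
    """Simplified hemispheric rule for IFR cruise below 18,000 ft:
    courses 000-179 get odd thousands, 180-359 get even thousands."""
    eastbound = (magnetic_course_deg % 360) < 180
    start = 3000 if eastbound else 4000  # first odd vs. even thousand used here
    return list(range(start, ceiling_ft + 1, 2000))
```

            So opposite-direction traffic at, say, 5,000 and 6,000 ft is separated by a thousand feet by construction.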

            Cars are expected to whiz along mere inches apart via see-and-avoid. There is no equivalent to ATC on the roads, cars aren’t generally equipped with communication equipment beyond a couple of blinking lights, and any kind of automated beacon for electronic detection is absolutely not the standard. Where roads cross at the same level, some traffic control method such as traffic lights is used for some semblance of sequencing, but in all conditions it requires visual see-and-avoid. Pedestrians, cyclists, wildlife, and debris are constant collision threats during all phases of driving; deer bound across interstates all the time. This is very much a visual job; hell, I’m not sure it could be done entirely with radar, it likely requires optical sensors/cameras. It’s also a lot more of the second-to-second workload of the driver. I honestly don’t see this task being fully automated with roads the way they are.

        • Turun@feddit.de · 14 points · 5 months ago

          I’d wager most people, when talking about a plane’s autopilot, mean the waypoint-following or autoland capability.

          Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

          • halcyoncmdr@lemmy.world · 7 points · 5 months ago

            I’d wager most people, when talking about a plane’s autopilot, mean the waypoint-following or autoland capability.

            Many people are also pretty stupid when it comes to any sort of technology more complicated than a calculator. That doesn’t mean the world revolves around a complete lack of knowledge.

            My issue is just with people expecting basic Autopilot to do more than it’s designed or intended to do, and refusing to acknowledge their expectation might actually be wrong.

            Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

            Definitely won’t get an argument from me there. FSD certainly isn’t in a state to really be called that yet. Although, to be fair, when signing up for it, and when activating it there are a lot of notices that it is in testing and will not operate as expected.

            At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

            • machinin@lemmy.world · 8 points · 5 months ago

              At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

              Volvo seeks to have zero human deaths in their cars. Some places seek zero fatality driving environments. These are cultures where safety is front and center. Most FSD enthusiasts (see comments in the other threads below) cite safety as the main impetus for these systems. Hopefully we would see similar cultural values in Tesla.

              Unfortunately, Musk tweets out jokes when responding to a video of people having sex on autopilot. That is Tesla culture. Musk is responsible for putting these dangerous things in consumers hands and has created a culture where irresponsible and possibly fatal abuse of those things is something funny for everyone to laugh at. Of course, punish the individual users who go against the rules and abuse the systems. You also have to punish the company, and the idiot at the top, who holds those same rules in contempt.

            • Turun@feddit.de · 7 points · 5 months ago

              Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

              Definitely won’t get an argument from me there. FSD certainly isn’t in a state to really be called that yet. Although, to be fair, when signing up for it, and when activating it there are a lot of notices that it is in testing and will not operate as expected.

              At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

              Then the issue is simply what we perceive as the predominant marketing message. I know that in all legally binding material Tesla states exactly what the system is capable of and how alert the driver needs to be. But in my opinion that is vastly overshadowed by the advertising Tesla runs for their FSD capability. They show a 5-second message about how they are required by law to warn you to stay alert at all times, before showing the car driving itself for 3 minutes with the demo driver’s hands completely off the wheel.

        • Saik0@lemmy.saik0.com · 5 points · 5 months ago

          “Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s essentially cruise control with lane-centering, since altitude isn’t an issue on a road.

          Factually incorrect. There are autopilot systems on planes now that can take off, fly, and land on their own. So yes, “autopilot” is EXACTLY what people are assuming it to mean in many cases, especially on the planes they would typically be accustomed to… which are the big airliners.

          Now where you’re missing the point… there are varying degrees of autopilot. And that would be fine and dandy for Tesla’s case if you wish to invoke it. But the company has touted it as the “most advanced,” as “Full Self-Driving,” and as something that “will be able to drive you from California to New York on its own.” They’ve set the expectation that it is the most advanced autopilot, akin to the plane that doesn’t actually need a pilot (although one is always present) for all three major parts of the flight. No Tesla product comes even close to that claim, and I’m willing to bet none ever will.

          • halcyoncmdr@lemmy.world · 1 point · 5 months ago

            Now where you’re missing the point… there are varying degrees of autopilot. And that would be fine and dandy for Tesla’s case if you wish to invoke it. But the company has touted it as the “most advanced,” as “Full Self-Driving,” and as something that “will be able to drive you from California to New York on its own.”

            I have said from the beginning that there are varying levels of autopilot on planes, and that needs to be taken into account when talking about the name and capabilities… that’s my entire argument, you illiterate fool.

            You are, at best, failing to acknowledge, or more likely, willfully ignoring the fact that Tesla does differentiate these capabilities with differently named products. All while claiming that a plane Autopilot must inherently be the most advanced version on the market to be compared to Tesla’s most basic offering.

            You are adding in capabilities from the more advanced offerings that Tesla has, like Enhanced Autopilot, and Full Self Driving and saying those are part of “Autopilot”. If you want to compare basic Tesla Autopilot, then compare it to a basic plane Autopilot. Tesla doesn’t claim that basic “Autopilot” can do all the extra stuff, that’s why they have the other options.

            That’s the issue I have with these conversations, people are always comparing apples and oranges, and trying to claim that they’re not to try and justify their position.

            Tesla’s website does indicate these differences between the versions, and has ever since each added capability was introduced to the overall offerings.

            • Saik0@lemmy.saik0.com · 2 points · 5 months ago

              You are, at best, failing to acknowledge

              No. That whole statement INCLUDING what you quoted was me allowing you to invoke it.

              Literally : “And that would be fine and dandy for Tesla’s case if you wish to invoke it.” Then I stated why that’s bad to invoke.

              You can claim I’m willfully ignorant. But you’re just a moron Elon shill.

              Tesla doesn’t claim that basic “Autopilot” can do all the extra stuff, that’s why they have the other options.

              And there’s why I’m just going to call you a moron Elon shill and move on. You’re full of shit. All they do is claim that it’s amazing/perfect. Then you buy the car and you expect the function and it doesn’t do it, not even close.

              • halcyoncmdr@lemmy.world · 1 point · 5 months ago

                But you’re just a moron Elon shill.

                Ah yes, the classic internet response of calling anyone you disagree with a shill. Because clearly someone disagreeing with you and pointing out issues with claims means they must inherently be defending a company without any valid claims. Easy to ignore when you don’t consider them a real person having a discussion.

                No point in arguing with someone unwilling to have an actual discussion and just resorting to calling someone a shill because they refuse to accept a different point of view can even exist.

                “You’re a shill, so nothing you say matters”.

                • Saik0@lemmy.saik0.com · 2 points · 5 months ago

                  When you outright lie about the facts, it’s hard to have any other opinion about you. So yes, you’re a shill.

    • Catoblepas · 32 points · 5 months ago

      It’s not even the closest thing to self-driving on the market; Mercedes has started selling a car that doesn’t require you to look at the road.

        • machinin@lemmy.world · 28 points · 5 months ago

          But it works and it’s hands off. Tesla can’t even legally do that under any condition.

          And fuck you if you ask Tesla to pay for any mistakes their software might make. It is ALWAYS your fault.

        • Catoblepas · 27 points · 5 months ago

          So, greater than any speed on a Tesla and available in more states?

        • conciselyverbose@sh.itjust.works · 13 points · 5 months ago

          Because they’re doing shit responsibly.

          For the target audience they chose that thing is a fucking bargain. Do you know how many people making damn good money sit in hours of 4 lane bumper to bumper traffic every day? “You don’t have to drive and we assume liability if our system fucks up” is a massive value add.

          (Not enough that I’d ever consider dealing with that kind of commute no matter what you paid me. But still.)

        • spamspeicher@feddit.de · 10 points · 5 months ago

          Level 3 in the S-Class and EQS has been available since May 2022. And the speed limit is there because it is part of a UN regulation that the Mercedes is certified for. The regulation has been updated since the release of Mercedes Drive Pilot to allow speeds up to 140 km/h, but Mercedes needs to recertify for that.

        • suction@lemmy.world · 7 points · 5 months ago

          Still the most advanced system that is legal to use on public roads, worldwide. Tesla’s most advanced system is many leagues below that, so not sure why it’s so hard to believe for some people that Tesla is nothing but an also-ran.

  • tearsintherain@leminal.space · 78 points · 5 months ago

    Move fast, break shit. Fake it till you sell it, then move the goal posts down. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it’s too late, money already in the bank.

  • TypicalHog@lemm.ee · 46 points · 5 months ago

    It only matters if autopilot causes more deaths than an average human driver over the same distance traveled.

    • NIB@lemmy.world · 58 points · 5 months ago

      If the cars run over people while going 30 km/h because they use cameras, and a bug splattered on the camera caused the car to go crazy, that is not acceptable, even if the cars crash “less than humans”.

      Self-driving needs to be highly regulated by law, with a mandated bare minimum of sensors, including radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can’t see in snow, or in the dark, or whatever. Anyone who has a phone knows how fucky a camera can get under specific light exposures.

      No one but Tesla is doing camera-only “self-driving,” and they are only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his BioShock uber-capitalist dream. Who cares if a few people die in the process of developing vision-based self-driving.

      https://www.youtube.com/watch?v=Gm2x6CVIXiE

            • AdrianTheFrog@lemmy.world · 11 points · 5 months ago

              Because that’s expensive and can be done with a camera.

              Expensive, as in probably less than $600? Compared to the $35,000 cost of a Tesla?

              (comparing the cost of the iPhone 12 (without lidar) and iPhone 12 pro (with lidar), we can guess that the sensor probably costs less than $200, so 3 of them (for left, right, and front) would cost probably less than $600)
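
              Spelling that back-of-envelope estimate out (all figures are the comment’s own guesses, not real part prices):

```python
sensor_cost_usd = 200   # assumed upper bound per lidar unit
sensors = 3             # left, right, front
car_price_usd = 35_000  # cited base Tesla price

lidar_total = sensor_cost_usd * sensors        # 600
share_of_price = lidar_total / car_price_usd   # about 1.7% of the car
```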

              Lidar can actually be very cheap and small. Unfortunately, Apple bought the only company that seems to make sensors like that (besides some other super-high-end models).

              There have been a lot of promising research papers on the technology lately, though, so I expect more, higher-resolution, and cheaper lidar sensors to be available relatively soon (within the next couple of years, probably).

              • Grippler@feddit.dk · 2 points · 5 months ago

                Yeah, that’s not even remotely the same type of sensor used in robotics and autonomous cars. Yes, lidar is getting cheaper, but for high-detail, long-range detection the sensors are much more expensive than in your iPhone example. The iPhone “lidar” is less than useless in an automotive context.

                • BURN@lemmy.world · 3 points · 5 months ago

                  To get the same resolution and quality of image in all lighting scenarios, cameras are actually going to be more expensive than LiDAR. Cameras suffer in low-light, low-contrast situations due to the physical limitations of bending light. More light = bigger lenses = higher cost, while LiDAR works better and is cheaper.

            • Zink@programming.dev · 6 points · 5 months ago

              My eyes are decent, but if I had a sixth sense that gave me full accurate 3D 360 spatial awareness regardless of visibility, I would probably not turn it off just to use my eyes. I’d use both.

        • mojofrododojo@lemmy.world · 7 points · 5 months ago

          Camera only should obviously be the endgame goal for all robots.

          I can’t tell if you’re a moron or attempting sarcasm but this is the least informed opinion I’ve seen in ages.

        • howrar@lemmy.ca · 6 points · 5 months ago

          I’ve heard Elon Musk (or was it Karpathy?) talking about how cameras should be sufficient for all scenarios because humans drive on vision alone, but that’s poor reasoning IMO. Cars are not humans, so there’s no reason to confine them to the same limitations. If we want them to be safer and more capable than human drivers, one way to do that is by providing them with more information.

          • kingthrillgore@lemmy.ml · 5 points · 5 months ago

            We built things like lidar and ultrasound because we wanted something better than our eyes at judging depth and seeing.

    • Geobloke@lemm.ee · 28 points · 5 months ago

      No it doesn’t. Every life lost matters, and if it can be shown that Tesla could have followed industry best practice and saved more lives, but chose not to so they could sell more cars, then that is on them.

    • PresidentCamacho@lemm.ee · 26 points · 5 months ago

      This is the actual logical way to think about self-driving cars. Stop downvoting him because “Tesla bad,” you fuckin goons.

      • gallopingsnail@lemmy.sdf.org · 18 points · 5 months ago

        Tesla’s self driving appears to be less safe and causes more accidents than their competitors.

        “NHTSA’s Office of Defects Investigation said in documents released Friday that it completed ‘an extensive body of work’ which turned up evidence that ‘Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.’”

        Tesla bad.

        • TypicalHog@lemm.ee · 8 points · 5 months ago

          Can you link me the data that says Tesla’s competitors’ self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.

        • nxdefiant@startrek.website · 4 points · 5 months ago

          No one else has the same capability in as wide a geographic range. Waymo, Cruise, Blue Cruise, Mercedes, etc are all geolocked to certain areas or certain stretches of road.

          • GiveMemes@jlai.lu · 5 points · 5 months ago

            Ok? Nobody else is being as wildly irresponsible, therefore tesla should be… rewarded?

            • nxdefiant@startrek.website · 2 points · 5 months ago

              I’m saying larger sample size == larger numbers.

              Tesla announced 300 million miles on FSD v12 in just the last month.

              https://www.notateslaapp.com/news/2001/tesla-on-fsd-close-to-license-deal-with-major-automaker-announces-miles-driven-on-fsd-v12

              Geographically, that’s all over the U.S., not just in hyper-specific metro areas or stretches of road.

              The sample size is orders of magnitude bigger than everyone else, by almost every metric.

              If you include the most basic autopilot, Tesla surpassed 1 billion miles in 2018.

              These are not opinions, just facts. Take them into account when you decide to interpret the opinion of others.

              • GiveMemes@jlai.lu · 1 point · 5 months ago

                That’s not how rates work tho. A larger sample size doesn’t imply a higher rate of accidents, and rates, not just raw numbers, are what any such study implies. Your bullshit rationalization is funny. In fact, a larger sample size tends to give a more reliable estimate of the rate, as there is less chance that a single error/fault makes an outsized impact on the data.

                • nxdefiant@startrek.website · 1 point · 5 months ago

                  No one’s talking about rates. The article itself, all the articles linked in these comments are talking about counts. Numbers of incidents. I’m not justifying anything because I’m not injecting my opinion here. I’m only pointing out that without context, counts don’t give you enough information to draw a conclusion, that’s just math. You can’t even derive a rate without that context!
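
            A minimal example of the point: raw counts only become comparable once you normalize by exposure (the numbers here are hypothetical):

```python
def per_million_miles(incidents, miles):
    """Normalize a raw incident count by miles of exposure."""
    return incidents / (miles / 1_000_000)

# Fleet A logs more total incidents but vastly more miles:
rate_a = per_million_miles(300, 300_000_000)  # 1.0 per million miles
rate_b = per_million_miles(50, 20_000_000)    # 2.5 per million miles
# Fleet B looks better by count (50 < 300) but is worse per mile.
```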

        • Socsa@sh.itjust.works · 3 points · 5 months ago

          I don’t quite understand what they mean by this. It tracks drivers with a camera and the steering wheel sensor and literally turns itself off if you stop paying attention. What more can they do?

          • nxdefiant@startrek.website
            link
            fedilink
            English
            arrow-up
            2
            ·
            edit-2
            5 months ago

            The NHTSA hasn’t issued rules for these things either.

            The U.S. government has issued general guidelines for the technology/industry here:

            https://www.transportation.gov/av/4

            They have an article on it discussing levels of automation here:

            https://www.nhtsa.gov/vehicle-safety/automated-vehicles-safety

            By all definitions laid out in that article:

            BlueCruise, Super Cruise, and Mercedes’ system are lvl 3 systems (you must be alert to re-engage when the conditions for their operation no longer apply)

            Tesla’s FSD is a lvl 3 system (the system will warn you when you must reengage for any reason)

            Waymo and Cruise are lvl 4 systems (geofenced)

            Lvl 5 systems don’t exist.

            What we don’t have is any kind of federal laws:

            https://www.ncsl.org/transportation/autonomous-vehicles

            Separated into two sections – voluntary guidance and technical assistance to states – the new guidance focuses on SAE international levels of automation 3-5, clarifies that entities do not need to wait to test or deploy their ADS, revises design elements from the safety self-assessment, aligns federal guidance with the latest developments and terminology, and clarifies the role of federal and state governments.

            The guidance reinforces the voluntary nature of the guidelines and does not come with a compliance requirement or enforcement mechanism.

            (emphasis mine)

            The U.S. has operated on a “states are laboratories for laws” principle since its founding. The current situation is in line with that principle.

            These are not my opinions, these are all facts.

        • PresidentCamacho@lemm.ee
          link
          fedilink
          English
          arrow-up
          4
          ·
          edit-2
          5 months ago

          My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If self-driving cars kill 500 people a year, but humans kill 1000 people a year, which one is better? Logic clearly isn’t your strong suit, maybe sit this one out…

      • doubtingtammy@lemmy.ml
        link
        fedilink
        English
        arrow-up
        12
        ·
        5 months ago

        It’s not logical, it’s ideological. It’s the ideology that allows corporations to run a dangerous experiment on the public without their consent.

        And where’s the LIDAR again?

    • mojofrododojo@lemmy.world
      link
      fedilink
      English
      arrow-up
      23
      ·
      5 months ago

      this is bullshit.

      A human can be held accountable for their failure, bet you a fucking emerald mine Musk won’t be held accountable for these and all the other fool self drive fuckups.

      • sabin@lemmy.world
        link
        fedilink
        English
        arrow-up
        8
        ·
        5 months ago

        So you’d rather live in a world where people die more often, just so you can punish the people who do the killing?

        • mojofrododojo@lemmy.world
          link
          fedilink
          English
          arrow-up
          12
          ·
          5 months ago

          That’s a terrifically misguided interpretation of what I said, wow.

          LISTEN UP BRIGHT LIGHTS, ACCOUNTABILITY ISN’T A LUXURY. It’s not some ‘nice to have add-on’.

          Musk’s gonna find out. Gonna break all his fanboys’ hearts too.

          • sabin@lemmy.world
            link
            fedilink
            English
            arrow-up
            7
            ·
            5 months ago

            Nothing was misguided and if anything your tone deaf attempt to double down only proves the point I’m making.

            This stopped being about human deaths for you a long time ago.

            Let’s not even bother to ask the question of whether or not this guy could ultimately be saving lives. All that matters to you is that you have a target to take your anger out on the event that a loved one dies in an accident or something.

            You are shallow beyond belief.

            • mojofrododojo@lemmy.world
              link
              fedilink
              English
              arrow-up
              7
              ·
              5 months ago

              This stopped being about human deaths for you a long time ago.

              Nope, it’s about accountability. The fact that you can’t see how important accountability is just says you’re a musk fan boy. If Musk would shut the fuck up and do the work, he’d be better off - instead he’s cheaping out left and right on literal life dependent tech, so tesla’s stock gets a bump. It’s ridiculous, like your entire argument.

              • sabin@lemmy.world
                link
                fedilink
                English
                arrow-up
                3
                ·
                5 months ago

                I don’t give a fuck about Musk. I think his Hyperloop is beyond idiotic and nothing he makes fucking works. In fact, I never even said I necessarily think the state of Tesla autopilot is acceptable. All I said was that categorically rejecting autopilot (even for future generations where the tech can be much better) for the express purpose of being able to prosecute people is beyond empty and shallow.

                If you need to make up lies about me and strawman me to disagree you only prove my point. You stopped being a rational agent who weighs the good and bad of things a long time ago. You don’t care about how good the autopilot is or can be. All you care about is your mental fixation against the CEO of the company in question.

                Your political opinions should be based on principles, not whatever feels convenient in the moment.

                • mojofrododojo@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  5 months ago

                  You stopped being a rational agent who weighs the good and bad of things a long time ago.

                  sure thing, you stan musk for no reason, and call me irrational. pfft. gonna block you now, tired of your bullshit

      • TypicalHog@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        ·
        5 months ago

        Where did I say that a human shouldn’t be held accountable for what their car does?

  • kava@lemmy.world
    link
    fedilink
    English
    arrow-up
    42
    ·
    5 months ago

    Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small amount

    I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.

    Because while it’s clear by now Teslas aren’t the perfect self driving machines we were promised, there is no doubt at all that humans are bad drivers.

    We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by like 100 to get the total number of car accidents.

    • Blackmist@feddit.uk
      link
      fedilink
      English
      arrow-up
      46
      ·
      5 months ago

      The question isn’t “are they safer than the average human driver?”

      The question is “who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?”

      Because if the answer is “nobody”, they shouldn’t be on the road. There’s zero accountability, and because it’s all wibbly-wobbly AI bullshit, there’s no way to prove that the issues are actually fixed.

        • Blackmist@feddit.uk
          link
          fedilink
          English
          arrow-up
          22
          ·
          5 months ago

          Accountability is important. If a human driver is dangerous, they get taken off the roads and/or sent to jail. If a self driving car kills somebody, it’s just “oops, oh well, these things happen, but shareholder make a lot of money so never mind”.

          I do not want “these things happen” on my headstone.

          • Tja@programming.dev
            link
            fedilink
            English
            arrow-up
            9
            ·
            5 months ago

            So you would prefer to have higher chances of dying, just to write “Joe Smith did it” on it?

          • ipkpjersi@lemmy.ml
            link
            fedilink
            English
            arrow-up
            8
            ·
            edit-2
            5 months ago

            But if a human driver is dangerous, and gets put in jail or get taken off the roads, there are likely already more dangerous human drivers taking their place. Not to mention, genuine accidents, even horrific ones, do happen with human drivers. If the rate of accidents and rate of fatal accidents with self-driving vehicles is way down versus human drivers, you are actually risking your life more by trusting in human drivers and taking way more risks that way. Having someone be accountable for your death doesn’t matter if you’ve already died because of them.

            Is it any better if you have “Killed by Bill Johnson’s SUV” on your headstone?

      • ipkpjersi@lemmy.ml
        link
        fedilink
        English
        arrow-up
        10
        ·
        edit-2
        5 months ago

        The question isn’t “are they safer than the average human driver?”

        How is that not the question? That absolutely is the question. Just because someone is accountable for your death doesn’t mean you aren’t already dead, it doesn’t bring you back to life. If a human driver is actively dangerous and get taken off the road or put in jail, there are very likely already plenty more taking that human drivers place. Plus genuine accidents, even horrific ones, do happen with human drivers. If the death rate for self-driving vehicles is really that much lower, you are risking your life that much more by trusting in human drivers.

        • ShepherdPie@midwest.social
          link
          fedilink
          English
          arrow-up
          7
          ·
          5 months ago

          Yeah that person’s take seems a little unhinged as throwing people in prison after a car accident only happens if they’re intoxicated or driving recklessly. These systems don’t have to be perfect to save lives. They just have to be better than the average driver.

          • Tja@programming.dev
            link
            fedilink
            English
            arrow-up
            3
            ·
            5 months ago

            Hell, let’s put the threshold at “better than 99% of drivers”, because every driver I know thinks they are better than average.

        • sugar_in_your_tea@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          5
          ·
          5 months ago

          Exactly.

          We should solve the accountability problem, but the metric should be lives and accidents. If the self-driving system proves it causes fewer accidents and kills fewer people, it should be preferred. Full stop.

          Throwing someone in jail may be cathartic, but the goal is fewer issues on the road, not more people in jail.

      • kava@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        ·
        5 months ago

        Because if the answer is “nobody”, they shouldn’t be on the road

        Do you understand how absurd this is? Let’s say AI driving results in 50% fewer deaths. That’s 20,000 people every year who aren’t going to die.

        And you reject that for what? Accountability? You said in another comment that you don’t want “shit happens sometimes” on your headstone.

        You do realize that’s exactly what’s going on the headstones of those 40,000 people that die annually right now? Car accidents happen. We all know they happen and we accept them as a necessary evil. “Shit happens”

        By not changing it, ironically, you’re advocating for exactly what you claim you’re against.

      • Maddier1993@programming.dev
        link
        fedilink
        English
        arrow-up
        5
        ·
        5 months ago

        I don’t agree with your argument.

        Making a human go to prison for wiping out a family of 4 isn’t going to bring back the family of 4. So you’re just using deterrence to hopefully make drivers more cautious.

        Yet, year after year… humans cause more deaths by negligence than tools can cause by failing.

        The question is definitely “How much safer are they compared to human drivers”

        It’s also much easier to prove that the system has those issues fixed than to train a human and hope their critical faculties are intact. Rigorous software testing and mechanical testing are within legislative reach and can be made strict requirements.

      • slumberlust@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        5 months ago

        The question for me is not what margins the feature is performing on, as they will likely be better than human error rates, but how irresponsibly they market the product.

    • machinin@lemmy.world
      link
      fedilink
      English
      arrow-up
      20
      ·
      edit-2
      5 months ago

      I was looking up info for another comment and found this site. It’s from 2021, but the information seems solid.

      https://www.flyingpenguin.com/?p=35819

      This table was probably most interesting, unfortunately the formatting doesn’t work on mobile, but I think you can make sense of it.

      Car                   2021 Sales So Far   Total Deaths

      Tesla Model S         5,155               40
      Porsche Taycan        5,367               ZERO
      Tesla Model X         6,206               14
      Volkswagen ID         6,230               ZERO
      Audi e-tron           6,884               ZERO
      Nissan Leaf           7,729               2
      Ford Mustang Mach-e   12,975              ZERO
      Chevrolet Bolt        20,288              1
      Tesla Model 3         51,510              87

      So many cars with zero deaths compared to Tesla.

      It isn’t a question of whether Tesla’s FSD is safer than humans; it’s whether it’s keeping up with the automotive industry in terms of safety features. It seems like they are falling behind (despite what their marketing team claims).
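      One way to read that table is to normalize deaths by units sold. It’s only a rough signal, since the deaths are cumulative totals while the sales are 2021-only (as the source acknowledges), but the gap is still stark. A quick sketch using a few rows from the table:

```python
# (2021 sales so far, total deaths) for a few rows of the table above;
# the other ZERO-death entries all behave like the Taycan.
cars = {
    "Tesla Model S": (5_155, 40),
    "Porsche Taycan": (5_367, 0),
    "Tesla Model X": (6_206, 14),
    "Nissan Leaf": (7_729, 2),
    "Chevrolet Bolt": (20_288, 1),
    "Tesla Model 3": (51_510, 87),
}

# Deaths per 1,000 vehicles sold, worst first.
per_thousand = {
    name: deaths / sales * 1000 for name, (sales, deaths) in cars.items()
}
for name in sorted(per_thousand, key=per_thousand.get, reverse=True):
    print(f"{name:16s} {per_thousand[name]:5.2f} deaths per 1,000 sold")
```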

        • petrol_sniff_king
          link
          fedilink
          English
          arrow-up
          3
          ·
          5 months ago

          If not, that would indicate that this newfangled self-driving is more dangerous than a little ol’ “caught in the stone-age” Nissan Leaf, wouldn’t it?

    • ipkpjersi@lemmy.ml
      link
      fedilink
      English
      arrow-up
      20
      ·
      edit-2
      5 months ago

      I know this is going to sound bad, but bear with me and read my entire post. I think in this case people may be hating on Tesla because of Elon (and fair enough) rather than on self-driving itself. Self-driving vehicles are already very likely safer than human-driven ones, with lower rates of accidents, but people expect zero accidents whatsoever from self-driving, which is why I think it may never actually take off and become mainstream. Then again, there’s the lack of accountability: people prefer being able to place the blame and liability on something concrete, like an actual human. It’s possible I’m wrong, but I don’t think I am wrong about this.

      edit: I looked further into this, and it seems I am partially wrong. It seems that Tesla is not keeping up with the average statistics in the automotive industry in terms of safety statistics, the self-driving in their vehicles seem less safe than their competitors.

      • NιƙƙιDιɱҽʂ@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        5 months ago

        For example, I don’t really trust mine and mostly use it in slow bumper-to-bumper traffic, or so I can adjust my AC on the touchscreen without swerving around in my lane.

    • suction@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      ·
      edit-2
      5 months ago

      Only Elon calls his level 2 automation “FSD” or even “Autopilot”. That alone proves that Tesla is more guilty of these deaths than other makers are who choose less evil marketing terms. The dummies who buy Elon’s crap take those terms at face value and the Nazi CEO knows that, he doesn’t care though because just like Trump he thinks of his fans as little more than maggots. Can’t say I blame him.

  • axo@feddit.de
    link
    fedilink
    English
    arrow-up
    41
    ·
    5 months ago

    According to the math in this video:

    • 150,000,000 miles have been driven with Tesla’s “FSD”, which equals
    • 375 miles per Tesla purchased with FSD capabilities
    • 736 known FSD crashes with 17 fatalities
    • which equals 11.3 deaths per 100M miles of Tesla’s FSD

    Doesn’t sound too bad, until you hear that human drivers produce 1.35 deaths per 100M miles driven…

    It’s rough math, but holy moly, that is already a completely different class of deadly than a non-FSD car
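    The arithmetic itself is easy to check (taking the video’s figures at face value; the 150M-mile and 17-fatality inputs are the comment’s, not independently verified):

```python
fsd_miles = 150_000_000    # FSD miles claimed in the video
fsd_deaths = 17            # known FSD-involved fatalities cited above
human_rate = 1.35          # human baseline: deaths per 100M vehicle miles

# Deaths per 100 million miles, the same unit as the human baseline.
fsd_rate = fsd_deaths / fsd_miles * 100_000_000
ratio = fsd_rate / human_rate

print(f"FSD: {fsd_rate:.1f} deaths per 100M miles")  # 11.3
print(f"That is ~{ratio:.1f}x the human baseline")   # ~8.4x
```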

    • dufkm@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      5 months ago

      a human produces 1.35 deaths per 100M miles driven

      My car has been driven around 100k miles by a human, i.e. it has produced 0.00135 deaths. Is that like a third of a pinky toe?

    • NotMyOldRedditName@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      edit-2
      5 months ago

      That number is like 1.5 billion now and rising exponentially fast.

      Also those deaths weren’t all FSD they were AP.

      The report says 1 FSD related (not caused by but related) death. For whatever reason the full details on that one weren’t released.

      Edit: There are billions of miles on AP. In 2020 it was 3 billion

      Edit: Got home and tried finding AP numbers through 2024, but I haven’t seen anything recent. Given 3 billion miles in 2020, 2 billion in 2019, and an accelerating rate of usage with increased car sales, 2023 is probably closer to 8 billion miles. I imagine we’d hear when they reach 10 billion.

      So 8 billion miles and 16 AP fatalities (because that 1 FSD case isn’t the same system) is 1 fatality per 500,000,000 miles, or, put into the terms above, 0.2 fatalities per 100 million miles: 6.75 times fewer than humans produce. And nearly all of these fatal accidents involved blatant misuse of the system, like driving drunk (at least a few) or using a phone and playing games.
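      Under this comment’s own assumptions (the 8 billion AP miles is the commenter’s extrapolation, not an official figure), the arithmetic works out as claimed:

```python
ap_miles = 8_000_000_000   # commenter's extrapolated Autopilot miles for 2023
ap_deaths = 16             # AP fatalities, excluding the single FSD case
human_rate = 1.35          # human baseline: deaths per 100M vehicle miles

ap_rate = ap_deaths / ap_miles * 100_000_000  # deaths per 100M miles
print(f"AP: {ap_rate:.1f} deaths per 100M miles")              # 0.2
print(f"{human_rate / ap_rate:.2f}x below the human baseline")  # 6.75x
```

      Note that this comment and the video math above reach opposite conclusions from the same family of numbers purely by choosing different death counts and mileage denominators, which is the rates-versus-counts problem from earlier in the thread in miniature.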

  • curiousPJ@lemmy.world
    link
    fedilink
    English
    arrow-up
    40
    ·
    5 months ago

    If Red Bull can be successfully sued for false advertising from their slogan “It gives you wings”, I think it stands that Tesla should too.

  • set_secret@lemmy.world
    link
    fedilink
    English
    arrow-up
    37
    ·
    5 months ago

    Verge articles seem to be getting worse over the years; they’ve almost reached Forbes level. Yes, this does raise some valid safety concerns, but no, Tesla isn’t bad just because it’s Tesla.

    It doesn’t really give us the full picture. For starters, there’s no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla’s tech actually measures up.

    Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla’s systems are compared to good old human driving.

    We’re left in the dark about how Tesla compares in scenarios like drunk, distracted, or tired driving—common issues that automation aims to mitigate. (probably on purpose).

    It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.

    I feel like any opportunity to jump on the Elon hate wagon is getting tiresome. (and yes i hate Elon too).

    • WormFood@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      ·
      5 months ago

      a more genuine take would have included a series of scenarios (e.g. drunk/distracted/tired driving)

      I agree. they did tesla dirty. a more fair comparison would’ve been between autopilot and a driver who was fully asleep. or maybe a driver who was dead?

      and why didn’t this news article contain a full scientific meta analysis of all self driving cars??? personally, when someone tells me that my car has an obvious fault, I ask them to produce detailed statistics on the failure rates of every comparable car model

      • mojofrododojo@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        5 months ago

        a driver who was fully asleep. or maybe a driver who was dead?

        why does it need to become a specious comparison for it to be valid in your expert opinion? because those comparisons are worthless.

    • PersnickityPenguin@lemm.ee
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      5 months ago

      A couple of my criticisms with the article, which is about “autopilot” and not fsd:

      -conflating Autopilot and FSD numbers; they are not interchangeable systems. They are separate code bases with different functionality.

      -the definition of “autopilot” seems to have been lifted from the aviation industry. The term is used to describe a system that controls the vector of a vehicle, i.e. its speed and direction. That’s all. That does seem like a correct description of what the Autopilot system does, while “FSD” does not live up to expectations, not being a true level 5 driving system.

      Merriam Webster defines autopilot thusly:

      “A device for automatically steering ships, aircraft, and spacecraft also : the automatic control provided by such a device”

  • nek0d3r@lemmy.world
    link
    fedilink
    English
    arrow-up
    31
    ·
    5 months ago

    I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.

    • machinin@lemmy.world
      link
      fedilink
      English
      arrow-up
      19
      ·
      edit-2
      5 months ago

      Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.

      The better question - is Tesla’s FSD causing drivers to have more accidents than other driving assist technologies? It seems like a yes from this article and other data I’ve linked elsewhere in this thread.

      • nek0d3r@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        5 months ago

        I appreciate this response amongst all the malding! My understanding of the difference in assistive technologies across different companies is lacking, so I’ll definitely look more into this.

    • JackbyDev@programming.dev
      link
      fedilink
      English
      arrow-up
      15
      ·
      5 months ago

      CGP Grey also seems to believe self driving cars with the absence of traffic lights is the solution to traffic as opposed to something like trains.

  • froh42@lemmy.world
    link
    fedilink
    English
    arrow-up
    30
    ·
    edit-2
    5 months ago

    “If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.

    That’s a very problematic claim, and it might only be true if you compare completely unassisted vehicles to L2 Teslas.

    Other brands also have a plethora of L2 features, but they are marketed and designed in a different way. The L2 features are active but designed to keep the driver engaged in driving.

    So L2 features are for better safety, not for a “wow we live in the future” show effect.

    For example, lane keeping in my car: you don’t notice it when driving, it stays just below your level of attention. But when I’m unfocused for a moment, the car just stays in the lane, even on curving roads. It’s simply designed to steer a bit later than I would. (Also, the wheel turns slightly more easily toward the center of the lane than away from it; the effect stays below what you notice unless you concentrate on it.)

    Adaptive speed control is just sold as adaptive speed control. I did notice once that it uses radar AND the cameras: it considers my lane free as soon as the car in front of me clears the lane markings with its wheels (when changing lanes).

    It feels like the software in my car could do a lot more, but its features are undersold.

    The combination of a human driver and the driver assist systems in combination makes driving a lot safer than relying on the human or the machine alone.

    In fact, the braking assistant once stopped my car in tight traffic before I could even react, when the guy in front of me suddenly slammed his brakes. If the system had failed to detect the situation, it would have been my job to react in time. (I did react, but I can’t say whether I would have been fast enough on my own.)

    What Tesla does with technology is impressive, but I feel the system could be so much better if they didn’t compromise safety in the name of marketing and hyperbole.

    If Tesla’s Autopilot was designed from the ground up to keep the driver engaged, I believe it would really be the safest car on the road.

    I feel they are rather designed to be able to show off “cool stuff”.

    • ForgotAboutDre@lemmy.world
      link
      fedilink
      English
      arrow-up
      21
      ·
      5 months ago

      Tesla’s autopilot isn’t the best around. It’s just the most deployed and advertised. People creating autopilot responsibly don’t beta test them with the kind of idiots that think Tesla autopilot is the best approach.

    • dependencyinjection@discuss.tchncs.de
      link
      fedilink
      English
      arrow-up
      6
      ·
      5 months ago

      When I see this comment it makes me wonder, how do you feel when you see someone driving a car?

      Should I feel guilty for owning a car. I’m 41 and I got my first car when I was 40, because I changed careers and it was 50 miles away.

      I rarely used it outside of work and it was a means to get me there. I now work remote 3 days so only drive 2.

      I don’t have social media or shop with companies like Amazon. I have just been to my first pro-Palestine protest.

      Am I to be judged for using a car?

      • hydration9806@lemmy.ml
        link
        fedilink
        English
        arrow-up
        8
        ·
        5 months ago

        I believe what they mean is “fuck car centric societal design”. No reasonable person should be mad that someone is using the current system to live their life (i.e. driving to work). What the real goal is spreading awareness that a car centric society is inherently isolating and stressful, and that one more lane does absolutely nothing to lessen traffic (except for like a month ish)

      • machinin@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        5 months ago

        Probably not you personally, but the system, oil companies, and people like Musk and his followers that want to prioritize private driving over public transportation.

        I say fuck cars, and I have one too. I try to avoid using it, but it’s easy to be lazy. I’m also fortunate to live someplace with great public transportation.

        Don’t take it personally, just realize life can be better if we could learn to live without these huge power-hungry cargo containers taking us everywhere.

      • PlexSheep@infosec.pub
        link
        fedilink
        English
        arrow-up
        2
        ·
        5 months ago

        That’s a good question!

        The short answer is no. Cars suck for many reasons, but it’s a fact in many parts of the world that you cannot be a functioning member of a society without one, especially if your government doesn’t get that cars suck or you live somewhere remote.

        How do I feel when I see someone driving a car? Mostly my feelings don’t change, because it is so normalized. But I get somewhat angry when I see uselessly huge cars that are obviously just a waste of resources. I have fun ridiculing car centric road and city design, but it’s the bad kind of fun.

        I am also very careful around cars, both while I’m in and outside of them. Cars are very heavy and drivers are infamous for being bad at controlling them. This isn’t their fault, it’s super easy to make mistakes while driving, you just have to move your feet a little too fast or move your hand a little too far and boom, someone is dead.

        Think about driving on a highway. If the guy next to you accidentally moves the wheel a little more than usual, that car will crash into you, creating a horrendous scene. It’s just too prone to failure, and failure will probably mean personal injury. For this reason, cars legitimately scare me, even if I have to deal with them.

        Sorry if that does not make sense to you. I’m still trying to figure all this out for myself and I’m not always rational about these topics, because seeing the potential of our cities being wasted by car centric design makes me angry.

  • Numberone@startrek.website
    link
    fedilink
    English
    arrow-up
    21
    ·
    edit-2
    5 months ago

    Is it linked to excess deaths? Technically it could be saving lives at a population scale. I doubt that’s the case, but it could be. I’ll read the article now and find out.

    Edit: it doesn’t seem to say anything regarding “normal” auto-related deaths. They’re focusing on the bullshit designation of an unfinished product as “autopilot”, and a (small) subset of specific cases that are particularly egregious, where there were 5-10 seconds of lead time into an incident. In these cases a person who was paying attention wouldn’t have been in the accident.

    Also some clarity edits.

    • letsgo@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      ·
      5 months ago

      OK.

      Question: how do you propose I get to work? It’s 15 miles, there are no trains, the buses are far too convoluted and take about 2 hours each way (no I’m not kidding), and “move house” is obviously going to take too long (“hey boss, some rando on the internet said “stop using cars” so do you mind if I take indefinite leave to sell my house and buy a closer one?”).

        • letsgo@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          5 months ago

          Sure, but the challenge was “Don’t use cars”, not “Don’t use cars where there is viable mass transit in place”.

        • letsgo@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          5 months ago

          I already have one (Yamaha MT10), but presumably it has the same problem cars do (burning fossil fuels); also it’s no good in shit weather (yeah, I know that means I need better clothing).

  • Betide@lemmy.world
    link
    fedilink
    English
    arrow-up
    11
    ·
    5 months ago

    The same people who are upset over self driving cars are the ones who scream at the self checkout that they shouldn’t have to scan their own groceries because the store isn’t paying them.

    32% of all traffic crash fatalities in the United States involve drunk drivers.

    I can’t wait until the day this kind of technology is required by law. I’m tired of sharing the road with these idiots, and I absolutely trust self-driving vehicles more than I trust other humans.

    • kat_angstrom@lemmy.world
      link
      fedilink
      English
      arrow-up
      9
      ·
      5 months ago

      I’ve never heard of anyone screaming because they had to scan their own groceries at a self-checkout. Is this a common thing?

      • Fridgeratr@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        5
        ·
        5 months ago

        No. No idea what they’re talking about. If someone really feels that way, there are usually other aisles with staff who can scan the groceries.

    • Flying Squid@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      5 months ago

      people who are upset over self driving cars

      If you are talking about Teslas, you can’t be upset about something a car doesn’t actually do unless you believe it’s actually capable of doing it.

      The only thing I don’t like is that Tesla is able to claim it has a “full self driving” mode which is not full self driving. Seems like false advertising to me.

    • EvolvedTurtle@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      5 months ago

      I recently learned that at least half of the drivers where I live think it’s fine to cut me off with no signal while we’re going 70 mph on the highway.