• deafboy@lemmy.world · 90 points · 7 months ago

    And they managed to do it without us obsessing about their CEO several times a day? I refuse to believe that!

  • cAUzapNEAGLb@lemmy.world · 82 points · edited · 7 months ago

    As of April 11, there were 65 Mercedes autonomous vehicles available for sale in California, Fortune has learned through an open records request submitted to the state’s DMV. One of those has since been sold, which marks the first sale of an autonomous Mercedes in California, according to the DMV. Mercedes would not confirm sales numbers. Select Mercedes dealerships in Nevada are also offering the cars with the new technology, known as “level 3” autonomous driving.

    Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.

    U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500.

    Mercedes is also working on developing level 4 capabilities. The automaker’s chief technology officer Markus Schäfer expects that level 4 autonomous technology will be available to consumers by 2030, Automotive News reported.
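
    For anyone wondering what “certain conditions” translates to in practice: the system is gated on an operational design domain and only stays engaged while every condition holds. Here is a minimal sketch of that kind of gate, using the thresholds quoted above but with made-up names; it is illustrative, not Mercedes’ actual logic.

```python
# Hypothetical sketch of a Level 3 activation gate based on the conditions
# quoted above (daytime, approved freeway, traffic jam, under 40 mph).
# Names and structure are invented for illustration.

from dataclasses import dataclass

MAX_SPEED_MPH = 40  # per the article

@dataclass
class VehicleState:
    speed_mph: float
    is_daytime: bool
    on_approved_freeway: bool    # pre-mapped CA/NV freeway segments only
    lead_vehicle_detected: bool  # stand-in for the heavy-traffic condition

def drive_pilot_available(s: VehicleState) -> bool:
    """True only while every condition of the operational design domain holds."""
    return (
        s.is_daytime
        and s.on_approved_freeway
        and s.lead_vehicle_detected
        and s.speed_mph < MAX_SPEED_MPH
    )

# Example: creeping along an approved freeway in a daytime traffic jam.
state = VehicleState(speed_mph=22, is_daytime=True,
                     on_approved_freeway=True, lead_vehicle_detected=True)
print(drive_pilot_available(state))  # True; the car hands control back
                                     # as soon as any condition stops holding
```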

    • Ilovethebomb@lemm.ee · 58 points · 7 months ago

      Hmm, so only on a very small number of predetermined routes, and at very slow speeds for those roads.

      Still impressive, but not as impressive as the headline makes out.

          • ours@lemmy.world · 14 points · 7 months ago

            Having known one of their customers, I can say some of them love their feature-loaded cars to brag about and feel extra special. Some will definitely pay the $2.5k gladly.

        • Veraxus@lemmy.world · 17 points · edited · 7 months ago

          If they assume full liability for any collisions while the feature is active (and it looks like they do), then I can see that being fair.

      • Turun@feddit.de · 8 points · 7 months ago

        Yes, but it’s actually level 3.

        Not the Tesla “full self driving - no wait we actually lied to you, you need to be alert at all times” bullshit.

    • jballs@sh.itjust.works · 32 points · 7 months ago

      I’ve seen this headline a few times and the details are laughably bad. The only reason this can be getting any press is because the headline is good clickbait. But 40 mph top speed on approved roads in 2 states only if a car is in front of you in the daytime is entirely useless. I guess it’s a good first step maybe? But writing headlines as if this is big news is sad.

      • lemmyvore@feddit.nl · 36 points · 7 months ago

        40 mph top speed on approved roads in 2 states only if a car is in front of you in the daytime is entirely useless.

        It’s specifically designed to navigate traffic congestion, which happens under 30 mph. It can keep up with the lane, deal with lane changes, honk if someone backs into you, let ambulances through, things like that. Not sure why the article presents it as generic driving.

      • Turun@feddit.de · 7 points · 7 months ago

        The reason this gets attention is because it’s the first level 3 sold to consumers.

        The tech is hard, of course it’s gonna start out with laughably limited capabilities. But it’s the first step towards more automation.

      • conciselyverbose@sh.itjust.works · 4 points · edited · 7 months ago

        It’s starting in California where there are a meaningful number of high earners who are spending hours per day in 4 lane bumper to bumper traffic.

        Having actual autonomy during those hours is still shit. But it’s a hell of a lot less shit than the tedium of the high attention requirements of sitting in traffic at a crawl.

  • eee@lemm.ee · 71 points · 7 months ago

    U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500

    yeah, fuck that.

      • MeatsOfRage@lemmy.world · 25 points · 7 months ago

        They’re also accepting full liability if anything happens while using this feature so it’s actually a type of insurance

        • Corkyskog@sh.itjust.works · 4 points · 7 months ago

          I wonder how much cheaper it will make auto insurance. I also wonder if this will open up transportation options for those who have lost their license.

          • conciselyverbose@sh.itjust.works · 6 points · 7 months ago

            Not this. It’s limited to specific scenarios on specific roads. So you’re going to need a licensed driver.

            Eventually with actually full self driving? I’d hope so, though it’s going to take legislation first.

        • jkrtn@lemmy.ml · 3 points · 7 months ago

          Ok, then I’ll do it if I don’t have to pay for other insurance on the car.

        • explodicle@sh.itjust.works · 2 points · 7 months ago

          I kinda like that system because eventually people will put their own OSes on the car, which the manufacturer obviously can’t cover. Having separate insurance/service eliminates having to pay for it if you’re accepting the liability yourself.

        • jj4211@lemmy.world · 1 point · 7 months ago

          The conditions for the system to work are such that if you could find a policy to cover only those conditions, it’d probably just be like a couple dollars a month. Even behaving “badly” you would be unlikely to have an accident and even if you caused an accident, it’s probably just going to be a couple thousand in property damage with no medical implication.
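
            Back-of-the-envelope, that sounds about right. A quick sketch with invented numbers (not real actuarial figures), just to show how a low-frequency, low-severity exposure prices out to a few dollars a month:

```python
# Hypothetical expected-loss pricing for a low-speed, traffic-jam-only exposure.
# Every number here is an illustrative guess, not real actuarial data.

annual_claim_probability = 0.01  # assume a 1% chance per year of an at-fault claim
average_claim_cost = 2500.0      # "a couple thousand in property damage", no injuries
expense_loading = 1.5            # insurer overhead and profit margin

expected_annual_loss = annual_claim_probability * average_claim_cost
monthly_premium = expected_annual_loss * expense_loading / 12

print(f"expected annual loss: ${expected_annual_loss:.2f}")  # $25.00
print(f"rough monthly premium: ${monthly_premium:.2f}")      # about $3
```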

    • AA5B@lemmy.world · 14 points · 7 months ago

      Have you seen Tesla’s price for full self driving? And they don’t take liability

  • merthyr1831@lemmy.world · 63 points · 7 months ago

    Love how companies can decide who has to supervise their car’s automated driving and not an actual safety authority. Absolutely nuts.

        • DreamlandLividity@lemmy.world · 6 points · 7 months ago

          You can’t have a babysitter following every human to make sure they don’t do something dangerous. Except for high risk areas, liability is the most practical option.

            • DreamlandLividity@lemmy.world · 2 points · edited · 7 months ago

              So you want to read a 50-page regulation about how to boil water in your home because boiling water can hurt people?

              And how do you regulate AI when you have no idea how it works or what could go wrong? It’s not as if politicians are AI experts. Driving itself is already heavily regulated; the AI has to follow traffic rules just like anyone else, if that is what you are thinking.

              • DrinkMonkey@lemmy.ca · 4 points · 7 months ago

                Why do you believe that judges (or even juries made of lay people) can make sense of the very things that you’re so confident legislators or regulators cannot?

                I’m not saying regulation is perfect, and as a result, certainly there is a role for judicial review. But come on, man…lots of non sequiturs and straw men in your argument.

                • DreamlandLividity@lemmy.world · 2 points · 7 months ago

                  Quite often, juries don’t have to rule on technical matters. Juries will have available internal communications of the company, testimonies of the engineers working on the project etc. If safety concerns were being ignored, you can usually find enough witnesses and documents proving so.

                  On the other hand, how do you even begin to regulate something that is only in the process of being invented? What would the regulation look like?

    • Trollception@lemmy.world · 15 points · 7 months ago

      Who said there was no safety authority involved? I thought it was part of the 4 level system the government decided on for assisted driving.

    • Cosmic Cleric@lemmy.world · 9 points · 7 months ago

      Paywalled.

      On a different subject, why would someone downvote a one-word comment that accurately describes what the content is behind?

      • stoly@lemmy.world · 5 points · 7 months ago

        There are people who are pathologically contrarian. I’ve had to end a friendship over it—the endless need to say something negative about literally everything that ever happens and an unwillingness to be charitable to others.

      • moonpiedumplings@programming.dev · 5 points · 7 months ago

        Because some of us have fat fingers and accidentally downvote when we scroll on mobile.

        One of the things I liked about reddit was that, since it saved downvoted posts, I could go through the list every once in a while and undownvote the accidents.

        Can’t do that here though, and I sometimes notice posts or comments I’ve accidentally downvoted.

        Anyway, people shouldn’t care so much, we don’t have a karma system or the like here anyways, so why does it matter?

        • Cosmic Cleric@lemmy.world · 1 point · edited · 7 months ago

          Anyway, people shouldn’t care so much, we don’t have a karma system or the like here anyways, so why does it matter?

          Well, only speaking for myself, I don’t care, it just seemed so weird since it was an accurate single word, so I was curious.

          I also wonder sometimes if it’s a bot system purposely trying to force engagement.

          Lol trust me, I get downvotes all the time for things I say here on Lemmy. If I let them bother me I’d be in the psychiatric system by now.

          Anti Commercial-AI license (CC BY-NC-SA 4.0)

        • Grippler@feddit.dk · 1 point · 7 months ago

          Can’t do that here though

          What client are you using? I can browse both upvoted and downvoted comments in Voyager

          • moonpiedumplings@programming.dev · 1 point · 7 months ago

            I’m using eternity, which hasn’t received any updates, on my phone, and the default lemmy web interface on my computer.

            Maybe I need to try some other options.

      • AWildMimicAppears@lemmy.dbzer0.com · 2 points · 7 months ago

        I have the theory that archive.is, waybackmachine and 12ft.io are no secret anymore, and that just posting “paywalled” comes across as too lazy to copy/paste or (a lot easier) to use this addon to reduce the work to a click. i dont mind, but i can understand why others might see it that way

        • Cosmic Cleric@lemmy.world · 1 point · edited · 7 months ago

          and that just posting “paywalled” comes across as too lazy to copy/paste

          Blaming the victim, and justifying paywalls.

          or (a lot easier) to use this addon to reduce the work to a click.

          My phone browser doesn’t use add-ons.

          i dont mind

          And yet, you took the time out to reply, to chastise me for saying it.

          Anti Commercial-AI license (CC BY-NC-SA 4.0)

          • AWildMimicAppears@lemmy.dbzer0.com · 1 point · edited · 7 months ago

            sheesh, you are quite aggressive, i did not want to offend. and as i said, i don’t mind it, i even posted the archivelink, for which you thanked me. check your target before firing, mate :-)

            (also, theres always firefox mobile. can apple users use it with addons/firefox browser engine now? i don’t follow apple development actively)

  • nucleative@lemmy.world · 30 points · 7 months ago

    Wonder how this works with car insurance. Is there a future where the driver doesn’t need to be insured? Can the vehicle software still be “at fault”, and how will the actuaries deal with assessing this new risk?

    • machinin@lemmy.world · 28 points · 7 months ago

      I believe Mercedes takes responsibility if there is an accident while driving autonomously.

      • HobbitFoot @thelemmy.club · 21 points · 7 months ago

        Which is how it should be. The company creating the software takes on the liability of faults with said software.

      • Rinox@feddit.it · 17 points · 7 months ago

        Will it pull a Tesla and switch off the autopilot seconds before an accident?

          • T156@lemmy.world · 3 points · edited · 7 months ago

            If memory serves, that’s not an intentional feature, but more a coincidence, since if the driver thinks the cruise control is about to crash the car, they’ll pop the brakes. Touching the brakes disengages the cruise control by design, so you end up with it shutting down before a crash happens.

            • nucleative@lemmy.world · 6 points · 7 months ago

              That makes perfect sense. If the driver looks up, notices he’s in a dangerous, unfixable situation, and slams the brakes, disconnecting the autopilot (which had been responsible for letting the situation develop), hopefully the automaker can’t simply say “not our fault, the system wasn’t even engaged at the time of the collision.”

      • Sizzler@slrpnk.net · 13 points · 7 months ago

        And this is how they will push everyone into driverless. Through insurance costs. Who would insure 1 human driver vs 100 bots (once the systems have a few billion miles on them)?

        • nucleative@lemmy.world · 5 points · edited · 7 months ago

          You’re probably right. Another decade or two and human driver controlled cars might be prohibitively expensive to insure for some or even not allowed in certain areas.

          I can imagine an awesome world where that’s a great thing but also imagine a dystopian world like wall-e as well. I guess we’ll know then which one we chose.

          • bradorsomething@ttrpg.network · 3 points · 7 months ago

            I feel you’re misapplying the advantage. Right now people hit other people in cars and insurance is what it is. It would be more appropriate to say that humans will pay normal rates, while autonomous car companies will charge you an insurance subscription, and work out much lower rates with the insurer.

            • Sizzler@slrpnk.net · 3 points · 7 months ago

              You would think that’s how it should be right? Not a chance. They’ll find another reason to stiff you.

              • bradorsomething@ttrpg.network · 2 points · 7 months ago

                As long as there is free competition, the cost will be around 10% over the operating cost. After that point it becomes worthwhile for another competitor to step in.

      • Hacksaw@lemmy.ca · 5 points · 7 months ago

        No. I don’t think this is a good solution. Companies will put a price on your life and focus on monetary damage reduction. If you’re about to cause more property damage than your life is worth (to Mercedes) they’ll be incentivized to crash the car and kill you rather than crash into the expensive structure.

        Your car should be your property, and you should be liable for the damage it causes. The car should prioritise your life over monetary damage. If there is some software problem causing the cars to crash, you need to be able to sue Mercedes through a class action lawsuit to recover your losses.

        • Adanisi@lemmy.zip · 2 points · 7 months ago

          You’ve been downvoted, but I don’t get why. Are people in denial that corpos will put money above all else?

          • Hacksaw@lemmy.ca · 2 points · 7 months ago

            Oh, there are a lot of Tesla/self driving cars fanboys out here. They’re caught up in the idea that these corporations will save us from traffic congestion/paying taxes for public transit/car accidents/climate change/car ownership/ you name it and self driving cars will solve it. They don’t tend to like it when you try to bring reality to their fantasy.

            Self driving cars are a really cool technology. Electric cars as well. However, they don’t solve the fundamental problem of transporting a 200lb person using a 3000lb vehicle. So they’re at best a partial solution. I also don’t really want a future where corporations own more of our stuff and force us into monthly payments for heated car seats and “prioritise human life” premium options.

            Fanboys gonna fanboy I guess!

    • Hugin@lemmy.world · 21 points · 7 months ago

      Berkshire Hathaway owns Geico the car insurance company. In one of his annual letters Buffett said that autonomous cars are going to be great for humanity and bad for insurance companies.

      “If [self-driving cars] prove successful and reduce accidents dramatically, it will be very good for society and very bad for auto insurers.”

      Actuaries are by definition bad at assessing new risk. But as data get collected they quickly adjust to it. There are a lot of cars so if driverless cars become even a few percent of cars on the road they will quickly be able to build good actuarial tables.
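
      The “quickly adjust” part is fairly mechanical. A rough sketch of the usual credibility-weighting idea, with invented numbers: blend the industry-wide prior with what the autonomous fleet is actually observed to do, and let the observed data dominate as exposure grows.

```python
# Rough sketch of credibility weighting (Buhlmann-style), with invented numbers.
# blended_rate = Z * observed_rate + (1 - Z) * prior_rate, where Z = n / (n + k)

def blended_claim_rate(observed_rate, exposure_years, prior_rate, k=50_000):
    """k controls how much exposure is needed before observed data dominates."""
    z = exposure_years / (exposure_years + k)  # credibility factor in [0, 1)
    return z * observed_rate + (1 - z) * prior_rate

prior = 0.045     # assumed industry-wide claim frequency per car-year
observed = 0.015  # assumed (better) frequency seen in the autonomous fleet

for n in (1_000, 50_000, 1_000_000):  # car-years of autonomous driving data
    print(n, round(blended_claim_rate(observed, n, prior), 4))
# As the fleet racks up exposure, the estimate moves from the prior toward
# the observed rate, which is the "quickly adjust" behaviour described above.
```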

        • Hugin@lemmy.world · 10 points · 7 months ago

          He understands there is enough competition in the market that as payouts and accidents go down, premiums will have to come down too. There is enough competition that they can’t just keep rates high; they would be undercut and lose customers.

          For BH it’s doubly bad as the large cash reserves GEICO has to maintain are used to borrow against at very low rates. If those reserves drop he has less to borrow against for investing.

          • bradorsomething@ttrpg.network · 1 point · 7 months ago

            I would agree it’s bad for insurance company employees. But the purpose of an insurance company is to collect premiums and deny claims.

            Get hurt in america, your insurer will hold a demo!

            • WalrusDragonOnABike [they/them]@reddthat.com · 2 points · 7 months ago

              When your clients are a handful of companies that will switch insurers more aggressively than consumers will to save a penny, and that have their own legal teams, it becomes harder to price gouge or illegally deny claims.

          • maynarkh@feddit.nl · 1 point · edited · 7 months ago

            If I wanted to be cynical, it’s also that it’s a bit different when it’s not Average Joe asking for a payout, but Mercedes, for example. It may shift the legal playing field with the insured parties not being consumers, but car manufacturers. Even worse for insurers, car manufacturers would be more successful in negotiating the initial deal as well.

  • daikiki@lemmy.world · 29 points · 7 months ago

    According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

    • maynarkh@feddit.nl · 29 points · 7 months ago

      According to who? Did the NTSB clear this?

      Yes.

      If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

      Yes, the judge will let the driver off the hook, because Mercedes told them it will assume the liability instead.

    • Trollception@lemmy.world · 19 points · edited · 7 months ago

      You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?

      • KredeSeraf@lemmy.world · 32 points · 7 months ago

        Sure. But no system is 100% effective, and all of their questions are legit and important to answer. If I got hit by one of these tomorrow, I want to know that the processes for fault, compensation, and improvement are all already settled, not something my accident is going to become the landmark case for.

        But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.

        I think this idea is sound, but that doesn’t mean there aren’t things to address around it.

        • Trollception@lemmy.world · 6 points · 7 months ago

          Honestly I’m sure there will be a lot of unfortunate mistakes until computers and self driving systems can be relied upon. However there needs to be an entry point for manufacturers and this is it. Technology will get better over time, it always has. Eventually self driving autos will be the norm.

          • KredeSeraf@lemmy.world · 13 points · 7 months ago

            That still doesn’t address all the issues surrounding it. I am unsure if you are just young and not aware how these things work or terribly naive. But companies will always cut corners to keep profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because “it’ll eventually get better” is a gateway to absurd amounts of damage. Also, not all technology always gets better. Plenty just get abandoned.

            But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn’t clearly codified law, I might not, and you might be callous enough to say you don’t care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you’d have to go through a long, expensive court battle to determine fault, because no one had settled it yet? So you’re in and out of a hospital recovering and draining all of your money on bills, both legal and medical, to eventually, hopefully, get compensated for something that wasn’t your fault.

            That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.

            • Trollception@lemmy.world · 5 points · edited · 7 months ago

              To be clear, I never said that I didn’t care about an individual’s safety; you inferred that somehow from my post, and frankly you are being quite disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve more with time.

              The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies available. Tesla doesn’t count, as it’s not an SAE level 3 autonomous driving vehicle. There are some references in the liability section of the wiki.

              https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars

          • MeDuViNoX@sh.itjust.works · 4 points · 7 months ago

            Can’t the entry point just be that you have to pay attention while it’s driving for you until they figure it out?

      • Adanisi@lemmy.zip · 4 points · edited · 7 months ago

        *at 40mph on a clear straight road on a sunny day in a constant stream of traffic with no unexpected happenings, Ts&Cs apply.

      • stoly@lemmy.world · 1 point · 7 months ago

        Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.

    • Rinox@feddit.it · 6 points · 7 months ago

      It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.

      I mean, what’s the situation where a car can’t brake but has enough control that it HAS to kill a pedestrian in order to save the passengers?

  • Ultragigagigantic@lemmy.world · 20 points · 7 months ago

    if it can drive a car why wouldn’t it be able to drive a truck?

    I’m surprised companies don’t just build their own special highway for automated trucking and use people for last mile stuff.

    • ShepherdPie@midwest.social · 31 points · 7 months ago

      Because it’s an extremely narrowly defined set of requirements in order to use it. It’s “approved freeways with clear markings and moderate to heavy traffic under 40MPH during daytime hours and clear conditions” meaning it will inch forward for you in bumper to bumper traffic provided you’re in an approved area and that’s it.

      https://www.mbusa.com/en/owners/manuals/drive-pilot

          • jj4211@lemmy.world · 1 point · 7 months ago

            Well, not always hands on wheel. I have spent over an hour straight on an interstate with hands off. Ford’s system watches your eyes and lets your hands stay off if it’s decent conditions and on a LIDAR-mapped freeway. Note I wouldn’t trust it at night (there have been two crashes, both at night with stopped vehicles on freeway), but then I wouldn’t really trust myself at night either too much (there are many many more human caused crashes at night, I’m not sure a human at freeway speed could avoid a crash with a surprise stationary vehicle in middle of the road).

          • JasonDJ@lemmy.zip · 1 point · 7 months ago

            In theory. In practice, it just beeps at you if your sandwich hand is steering.

      • Evotech@lemmy.world · 1 point · 7 months ago

        It still doesn’t seem legal to not pay attention to the road. Wouldn’t fly over here, at least.

    • KISSmyOSFeddit@lemmy.world · 21 points · 7 months ago

      They got certification from the authorities, and in the event of an accident, the manufacturer takes on responsibility.

      • melpomenesclevage@lemm.ee · 6 points · 7 months ago

        lol, ‘manufacturer takes on responsibility’ so… I’m just fucked if one of these hits me?

        see a mercedes, shoot a mercedes. destroy it in whatever way you can.

        • KISSmyOSFeddit@lemmy.world · 33 points · 7 months ago

          No, you’re guaranteed that the Mercedes that hit you is better insured for paying out your damages than pretty much anyone else on the road that could hit you.

          • melpomenesclevage@lemm.ee · 10 points · 7 months ago

            lol corporations don’t have responsibility though. that’s the whole point of them. they’re machines for avoiding responsibility.

            • Kit · 6 points · 7 months ago

              Please touch grass soon.

            • explodicle@sh.itjust.works · 1 point · 7 months ago

              In this case the responsibility to pay will ultimately fall on everyone, not just on the pedestrian getting hit. Still not good, but you won’t be SOL.

              • Fedizen@lemmy.world · 3 points · 7 months ago

                If these have lidar (unlike teslas) then they might be better at detecting obstructions but I feel like real world road conditions are not kind to cameras and sensors.

                • QuaternionsRock@lemmy.world · 2 points · 7 months ago

                  Fixed lidar sensors are not as reliable as they’re made out to be, unfortunately. Dome lidar systems like those found on Waymo vehicles are pretty good, but way more advanced (and expensive) than anything you’d find in consumer vehicles at the moment. The shadows of trees are enough to render basic lidar sensors useless, as they effectively produce an aperiodic square wave of infrared light (from the sun) that is frequently inseparable from the ToF emission signal. Sunsets are also sometimes enough to completely blind lidar sensors.

                  None of this is to say that Tesla’s previous camera-only approach was a good idea, like at all. More data is always a good thing, so long as the system doesn’t rely on the data more than the data’s reliability permits. After all, cameras can be blinded by sunlight too. IMO radar is the best economical complementary sensor to cameras at the moment. Despite the comparatively low accuracy, they are very reliable in adverse conditions.
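
                  To put rough numbers on the sunlight problem (invented numbers, purely to illustrate the ambient-infrared point above): in a shot-noise-limited detector the return pulse competes with background photons inside the range gate, so the detection margin collapses when the background jumps from shadow to full sun.

```python
# Toy numbers (invented) for the ambient-sunlight problem described above.
import math

gate_ns = 50           # width of the range gate we integrate over, in ns
signal_photons = 80    # photons returned from our own laser pulse (assumed)
shade_rate = 0.5       # ambient IR photons per ns in tree shadow (assumed)
full_sun_rate = 20.0   # ambient IR photons per ns in direct sun (assumed)

def snr(ambient_rate_per_ns):
    """Shot-noise-limited SNR: signal over sqrt(signal + background)."""
    background = ambient_rate_per_ns * gate_ns
    return signal_photons / math.sqrt(signal_photons + background)

print("SNR in shade:", round(snr(shade_rate), 1))        # ~7.8, easy detection
print("SNR in full sun:", round(snr(full_sun_rate), 1))  # ~2.4, marginal
# A fixed threshold tuned for one regime misbehaves in the other, which is the
# "aperiodic square wave" problem as the car sweeps through tree shadows.
```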

          • Tankton@lemm.ee · 5 points · 7 months ago

            The sad part of this is somehow thinking that payment solves any problem. Like, idk what they would pay me, just bring back my dead wife/child/father whatever. You can’t fix everything with money.

            • QuaternionsRock@lemmy.world · 7 points · 7 months ago

              It only works on a small handful of freeways (read: no pedestrians) in California/Nevada, and only under 40 MPH. The odds of a crash within those parameters resulting in a fatality are quite low.

  • just_another_person@lemmy.world · 18 points · 7 months ago

    It will be litigated almost immediately. There is no current combination of model and hardware platform that a car could reasonably run that could be called “fully self driving” at any useful speed. This thing sounds like parking assist on steroids maybe, or “stalled traffic assist”. They will be sued.

    • explodes@lemmy.world · 30 points · 7 months ago

      Did you read the article? There are already plenty of conditions for activating the self driving mode.

    • cm0002@lemmy.world · 22 points · 7 months ago

      There’s tons of conditions

      when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control.

      I doubt this is a mistake; they must have really high confidence in the tech as well as in the restrictions. Not even Tesla had the balls to announce that you could drive distracted.

      • RvTV95XBeo@sh.itjust.works · 3 points · 7 months ago

        I mean I disagree with most of what the person you’re responding to is saying, but they are entering into a new stage of vehicular liability. By telling the driver they don’t have to pay attention there is an implied transfer of liability.

        It probably says somewhere in the terms of use that Mercedes isn’t at fault or that you have to carry some special kind of insurance, and frankly computers have a pretty good shot at being better than your average human driver so they’ll hopefully be easier to insure, but nevertheless, people on both sides of every accident for the first few years with this tech will sue. Any chance to squeeze a few milly out of a 100 billion dollar car company.

        • VelociCatTurd@lemmy.world · 3 points · 7 months ago

          Sure, anyone can sue for any reason. That doesn’t mean that a case will be successful. I do agree with you that there is a transfer of liability, until the car tells the driver that manual intervention is needed. But also, this can be used only on specific roads, under specific weather and traffic conditions; I really don’t think it’s much to ask of a robot to do. It actually seems like a pretty boring level of autonomy.

  • elrik@lemmy.world · 18 points · 7 months ago

    How is this different from the capabilities of Tesla’s FSD, which is considered level 2? It seems like Mercedes just decided they’ll take on liability to classify an equivalent level 2 system as level 3.

    • rsuri@lemmy.world · 11 points · edited · 7 months ago

      According to the Mercedes website, the cars have radar and lidar sensors. Teslas have had radar only (no lidar), and Tesla apparently decided to move away from radar towards optical only; I’m not sure if it currently has any role in FSD.

      That’s important because FSD relies on optical sensors only to tell not only where an object is, but that it exists. Based on videos I’ve seen of FSD, I suspect that if it hasn’t ingested the data to recognize, say, a plastic bucket, it won’t know that it’s not just part of the road (or at best can recognize that the road looks a little weird). If there’s a radar or lidar sensor though, those directly measure distance and can have 3-D data about the world without the ability to recognize objects. Which means they can say “hey, there’s something there I don’t recognize, time to hit the brakes and alert the driver about what to do next”.

      Of course this still leaves a number of problems, like understanding at a higher level what happened after an accident for example. My guess is there will still be problems.
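
      In other words, the range sensors give you a cheap fallback path. A minimal sketch of that idea (hypothetical types and thresholds, not any manufacturer’s actual code):

```python
# Minimal sketch of the point above: a direct range measurement can trigger a
# safe fallback for objects the vision classifier can't name. Names and
# thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class RangeReturn:
    distance_m: float       # measured directly by radar/lidar
    in_planned_path: bool

@dataclass
class CameraDetection:
    label: str              # e.g. "car", "pedestrian", "unknown"
    confidence: float

BRAKE_DISTANCE_M = 60.0

def plan_action(range_return: RangeReturn, vision: CameraDetection) -> str:
    if not range_return.in_planned_path or range_return.distance_m > BRAKE_DISTANCE_M:
        return "continue"
    if vision.label != "unknown" and vision.confidence > 0.8:
        return f"track and follow rules for {vision.label}"
    # Geometry says something is there even though vision can't classify it:
    return "slow down and hand control back to the driver"

print(plan_action(RangeReturn(35.0, True), CameraDetection("unknown", 0.2)))
```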

    • GoodEye8@lemm.ee · 10 points · 7 months ago

      You’ve inadvertently pointed out how Tesla deliberately skirts the law. Teslas are way more capable than what level 2 describes, but they choose to stay as level 2 so they wouldn’t have to take responsibility for their public testing

    • Socsa@sh.itjust.works · 4 points · 7 months ago

      Yeah it’s pretty much an insurance product. They came up with a set of boundary conditions someone would underwrite for their “stay between the lines” tech.

    • philpo@feddit.de · 1 point · 7 months ago

      It’s not about the sensors, it’s about the software. That’s the solution.

      • skyspydude1@lemmy.world · 1 point · 7 months ago

        Please tell me how software will be able to detect objects in low/no-light conditions if they, say, have cameras with poor dynamic range and no low-light sensitivity?