Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

  • happydoors@lemm.ee · 3 hours ago

    I love that one of the largest YouTubers is the one who did this. Surely somebody near our federal government will throw a hissy fit if they hear about this, but Mark’s audience is ginormous.

    • buddascrayon@lemmy.world · 21 minutes ago

      Honestly, I think Mark should be more scared of Disney coming after him for mapping out their Space Mountain ride.

  • conicalscientist@lemmy.world · 4 hours ago

    Anyone with half a brain could tell you cameras alone are a non-starter. This is nearly a Juicero-level blunder. Tesla is not a serious car company, nor a serious tech company. If markets were rational, it would have been the end for Tesla.

    • LifeInMultipleChoice@lemmy.dbzer0.com · 3 hours ago (edited)

      Austin should just pull the permits until all the taxis have lidar installed and tested. Or write a bill that fines the manufacturer $100 billion for any self-driving car that kills a person, with the proceeds split 50% to the family and 50% to infrastructure. One of the first rules of robotics was always about not harming humans.

    • KayLeadfoot@fedia.io (OP) · 1 hour ago

      He is studiously apolitical, the only political comment I could find from him was the very sensible advice that we need to tone down our hyperpartisanship :)

      https://x.com/MarkRober/status/1641487680168153089?lang=en

      For me, I criticize any vehicle that is objectively crappy… and some vehicles where I find them subjectively crappy… and I hope folks don’t assume I’m doing that because of my political leanings.

  • rational_lib@lemmy.world · 5 hours ago (edited)

    The rain test was far more concerning because it’s a much more realistic scenario. Both a normal person and the lidar would’ve seen the kid and stopped, but the cameras and image processing just aren’t good enough to make out a person in the rain. That’s bad. The test portrays it as a person in the middle of a straight road, but I don’t see why the same thing wouldn’t happen at a crosswalk or any other place where pedestrians are often in the path of a vehicle. If an autonomous system cannot reliably make out pedestrians in the rain, that alone should be enough to prevent these vehicles from being legal.

    • LifeInMultipleChoice@lemmy.dbzer0.com · 3 hours ago

      The question there would be whether Austin has crosswalks without traffic lights. Many places put a light at every crosswalk, but not all. Most beach towns don’t have them at every crosswalk; they just have laws that if someone is in or entering the crosswalk, you have to stop for the pedestrian. (They would all be at risk from what you are saying.)

      • Tot@lemmy.world · 3 hours ago

        Not every pedestrian follows the lights, though. And not every pedestrian makes it across the road before the light changes from red to green.

  • King3d@lemmy.world · 6 hours ago

    This is like the crash on the San Francisco Bay Bridge, which happened because a Tesla went into a tunnel and wasn’t sure what to do when it went from bright daylight to darkness. In that case the Tesla suddenly merged lanes and then immediately stopped, causing a multi-car pileup.

    • fallingcats@discuss.tchncs.de · 6 hours ago

      You’d think they’d have cameras with higher dynamic range and faster auto-exposure in their cars by now. Nope, still penny-pinching.

      • Strawberry · 6 hours ago

        If only Elon hadn’t insisted on using nothing but visible-light cameras.

        • Mic_Check_One_Two@reddthat.com · 2 hours ago

          Yeah, pulling radar from the cars was the beginning of the end. Early Teslas had radar, and that was what led to all of the “car sees something three vehicles ahead and brakes to avoid a pileup that hasn’t even started yet” type of collision-avoidance videos. First, pulling radar was a cost-cutting thing. Then Elon demanded that they pull the ultrasonic sensors too, and that’s when their crash numbers skyrocketed.

        • fallingcats@discuss.tchncs.de · 5 hours ago

          Even then, the cameras aren’t really up to snuff for the things you genuinely need cameras for: signs, lane markings, traffic lights, etc.

  • fubarx@lemmy.world · 7 hours ago

    There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:

    Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision-making process during the incident.

    This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️

    Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!

    • hedge_lord@lemmy.world · 5 hours ago

      Kids already have experience playing hopscotch, so we can just have them jump between the roofs of moving cars in order to cross the street! It will be so much more efficient, and they can pretend that they are action heroes. The ones who survive will make for great athletes too.

    • chilicheeselies@lemmy.world · 7 hours ago

      There is no way insurance companies would go for that. What is far more likely is that policies simply won’t cover accidents due to autonomous systems. I’m honestly surprised they would cover them now.

      • ThePantser@sh.itjust.works · 5 hours ago

        If it was a feature of the car when you bought it, and the insurance company insured the car, then anything the car does by design must be covered. The only way an insurance company gets out of this is by making the insured sign a statement that using the feature voids their policy, the same way they can with rideshare apps if you don’t disclose that you are driving for one. They can also refuse to insure the car unless the feature is disabled. I can see insurance companies in the future demanding such features be disabled before writing a policy. They could also demand that the giant screens go blank, or that the displayed content be simplified, while the car is in motion.

      • P03 Locke@lemmy.dbzer0.com · 7 hours ago

        What is far more likely is that policies simply wont cover accidents due to autonomous systems.

        If the risk is that insurance companies won’t pay for accidents and put people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.

        This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible to pay for those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.

      • rumba@lemmy.zip · 6 hours ago

        Not sure how it plays out for Tesla, but for Waymo, accidents per mile driven are WAY below those of human drivers. Insurance companies would LOVE to charge a surcharge for automated-driving coverage while paying out on fewer incidents.
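        The arithmetic behind that incentive can be sketched quickly (a hypothetical illustration only; the crash rates and claim costs below are made-up placeholders, not real Waymo or insurer figures):

```python
# Hypothetical illustration: why an insurer might prefer automated-driving
# policies if crash rates really are lower. All numbers are made up.

def expected_payout_per_mile(crashes_per_million_miles: float,
                             avg_claim_cost: float) -> float:
    """Expected insurance payout per mile driven."""
    return crashes_per_million_miles / 1_000_000 * avg_claim_cost

# Placeholder rates, NOT real statistics.
human = expected_payout_per_mile(4.0, 20_000)      # $0.08 expected payout/mile
automated = expected_payout_per_mile(1.0, 20_000)  # $0.02 expected payout/mile

# Charging the same (or a higher) premium while expecting fewer claims
# widens the insurer's margin on the automated policy.
print(f"human: ${human:.2f}/mile, automated: ${automated:.2f}/mile")
```

        Under those assumed numbers, every mile shifted from a human driver to the lower-crash-rate system is pure margin for the insurer, which is why they would happily price it as a premium product.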

    • Ulrich@feddit.org · 6 hours ago

      Once that happens, Level 4 driving will come standard

      Uhhhh absolutely not. They would abandon it first.

    • Korhaka@sopuli.xyz · 7 hours ago

      If no one is liable, then it’s tempting to deliberately confuse them into crashing.

  • pjwestin@lemmy.world · 8 hours ago

    To be fair, the Road Runner it was following somehow successfully ran into the painting.

  • Possibly linux@lemmy.zip · 3 hours ago (edited)

    The question is: could this fool a human?

    Also, I went and watched the video, and he doesn’t even seem to use Full Self-Driving for the wall test.

      • ChaoticNeutralCzech@feddit.org · 35 minutes ago (edited)

        Many people zone out so much while driving that they would absolutely get fooled. I admit I might too, especially if the wall is made of a material that needs no guy wires to prop it up. They either used digital effects or did a very good color-grading job; it’s uncanny.

        relevant

      • Possibly linux@lemmy.zip · 3 hours ago

        Are you sure though?

        If you knew to expect a wall it is pretty obvious but if you aren’t expecting a wall it might prove confusing.

        I probably would stop either way.

        • chaogomu@lemmy.world · 2 hours ago

          I watched the video. The wall would not fool a human with object permanence.

          Anyone who is fooled is likely impaired enough that they are not legal to drive.

  • ikidd@lemmy.world · 7 hours ago

    To be fair, I’d be surprised if half the humans driving didn’t do the same.

    • osugi_sakae@midwest.social · 5 hours ago

      This. Watching the video, I had some trouble telling the difference. Sure, from some angles it is obvious, but from others it is not.

      That said, other cars, with more types of sensors, would probably have “seen” the obstruction on the road.

      • Mic_Check_One_Two@reddthat.com · 2 hours ago

        That said, other cars, with more types of sensors, would probably have “seen” the obstruction on the road.

        Well yeah, that’s sort of the entire point of the video. He ran the test with a lidar-equipped vehicle, and it saw the wall right away. Hell, a radar-equipped car (like early Teslas) probably would have seen the “kid” behind the wall as well. But since Musk has decided that cars should be able to self-drive with only cameras, the newer Teslas will just plow straight into the wall without braking.

      • hardcoreufo@lemmy.world · 2 hours ago

        I would have liked to see how a more typical car with adaptive cruise control and automatic braking would have fared. I think those use radar or ultrasonic sensors and would have done better than the Tesla.

      • Possibly linux@lemmy.zip · 3 hours ago

        I think the human brain has the edge in processing visuals, since our brains are so much better at adapting than any computer system. We can improvise much better because we have our whole life experience to draw on.