A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and the team suggests the attack could be applied to other self-driving cars.

  • @EvilBit@lemmy.world
    88 · 25 days ago

    https://xkcd.com/1958/

    TL;DR: faking out a self-driving system is always going to be possible, and so is faking out humans. But doing so is basically attempted murder, which is why the existence of an exploit like this is not interesting or new. You could also cut the brake lines or rig a bomb to it.

      • Eggyhead
        5 · 25 days ago

        I was so close to finishing, too. Time to look for another doomsday thread, I guess.

    • @Beryl@lemmy.world
      4 · 25 days ago

      You don’t even have to rig a bomb, a better analogy to the sensor spoofing would be to just shine a sufficiently bright light in the driver’s eyes from the opposite side of the road. Things will go sideways real quick.

      • @EvilBit@lemmy.world
        2 · 24 days ago

        It’s not meant to be a perfect example. It’s a comparable principle. Subverting the self-driving like that is more or less equivalent to any other means of attempting to kill someone with their car.

        • @Beryl@lemmy.world
          4 · 24 days ago

          I don’t disagree, I’m simply trying to present a somewhat less extreme (and therefore, I think, more appealing) version of your argument.

    • @uriel238
      4 · 24 days ago

      More exciting would be an exploit that renders an unmoving car useless. But exploits like this absolutely will be used in cases where tire-slashing might be used, such as harassing genocidal VIPs or disrupting police services, especially if it’s difficult to trace the drone to its controller.

  • @Infynis@midwest.social
    41 · 25 days ago

    This is the real reason Elon Musk doesn’t want people tracking his plane. If we know where he is, Wile E. Coyote could catch up to him and trick his car into crashing into a brick wall by painting a tunnel on it.

    • Jesus
      12 · 24 days ago

      Wait until you see what my uncle Jerry can do with a 5th of vodka and his Highlander.