In the piece, titled “Can You Fool a Self Driving Car?”, Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through the barrier instead of stopping.

The footage was damning, with slow-motion clips showing the car crashing not only through the styrofoam wall but also through a child-sized mannequin. The Tesla was likewise fooled by simulated rain and fog.

  • Simulation6@sopuli.xyz · 16 hours ago

    If the ‘disengage to avoid legal consequences’ feature does exist, you would think there would be some false-positive incidents where it turns off for no apparent reason. I found some with a search; they are attributed to bad software. Owners are discussing new patches that fix some problems and introduce new ones. None of those incidents caused an accident, so maybe the owners never triggered the malicious code.

    • FuglyDuck@lemmy.world · 2 hours ago

      If it randomly turns off for no apparent reason, people are going to be like ‘oh, that’s weird’ and leave it at that. Tesla certainly isn’t going to admit that their code is malicious like that, at least not until the FBI is digging through their memos to prove it was. And maybe not even then.

    • Dultas@lemmy.world · 9 hours ago

      I think Mark (who made the OG video) speculated it might be the ultrasonic parking sensors detecting something and disengaging.