• catloaf@lemm.ee (+42/-2) · 19 hours ago

    I’ve heard they also like to disengage self-driving mode right before a collision.

    • sylver_dragon@lemmy.world (+9/-16) · 19 hours ago

      That actually sounds like a reasonable response. Driver assist means that a human is supposed to stay attentive and take control. If the system detects a situation where it’s unable to make a good decision, dumping that decision on the human in control seems like the closest thing they have to a “fail safe” option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention long ago. So, maybe a “human, take the wheel” followed by a “slam the brakes” if no input is detected within 2-3 seconds. While an emergency stop isn’t always the right choice, it probably beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
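The hand-off described above (alert the human, then brake if nobody responds within a grace period) can be sketched as a small loop. Everything here is a hypothetical illustration of the commenter's idea, not any vendor's actual logic; the function names and the 2.5-second window are made-up assumptions.

```python
import time

TAKEOVER_GRACE_S = 2.5  # assumed 2-3 second window from the comment above


def fallback(detect_human_input, request_takeover, emergency_brake,
             now=time.monotonic):
    """Run the hand-off: alert the driver, then brake if they never respond."""
    request_takeover()  # "human, take the wheel"
    deadline = now() + TAKEOVER_GRACE_S
    while now() < deadline:
        if detect_human_input():
            return "human_in_control"  # driver responded in time
    emergency_brake()  # no response: stop rather than coast on uncontrolled
    return "emergency_stop"
```

The callbacks are injected so the policy (warn, wait, then stop) stays separate from whatever sensors and actuators would actually implement it.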

      • zaphod@sopuli.xyz (+51/-3) · edited · 19 hours ago

        That actually sounds like a reasonable response.

        If you give the driver enough time to act, which Tesla doesn’t. They turn it off a second before impact and then claim it wasn’t in self-driving mode.

        • whotookkarl@lemmy.world (+14/-1) · 16 hours ago

          Not even a second; it’s sometimes less than 250-300 ms. If I hadn’t already been anticipating it to fail and disengage as it went through the two-lane-wide turn, I would have gone straight into oncoming traffic.

      • nthavoc@lemmy.today (+12) · 18 hours ago

        So, maybe a “human take the wheel” followed by a “slam the brakes” if no input is detected in 2-3 seconds.

        I have seen reports where the Tesla logic appears to be “Human, take the wheel, since the airbag is about to deploy in the next 2 microseconds after solely relying on camera object detection, and this is totally YOUR fault, kthxbai!” If there were an option for the bot to physically bail out of the car as it rolls you onto the tracks while you’re still sitting in the passenger seat, that’s how I’d envision this autopilot safety function working.

      • elucubra@sopuli.xyz (+3/-1) · edited · 19 hours ago

        I don’t know if that is still the case, but a lot of electronics in the US used to carry warnings, with pictures, like “don’t put it in the bath” and the like.

        People are dumb, and you should take that into account.

    • GreenBottles@lemmy.world (+2/-15) · edited · 19 hours ago

      That sounds a lot more like a rumor to me… it would be extremely suspicious and would leave them open to GIGANTIC liability issues.

      • catloaf@lemm.ee (+34) · 19 hours ago

        In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had “aborted vehicle control less than one second prior to the first impact”.

        https://futurism.com/tesla-nhtsa-autopilot-report

      • sem@lemmy.blahaj.zone (+18/-1) · 19 hours ago

        It’s been well documented. It lets them claim in their statistics that the owner was in control of the car during the crash.

      • ayyy@sh.itjust.works (+4/-2) · edited · 19 hours ago

        How so? The human in the car is always ultimately responsible when using level 3 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn’t have to assume any liability.

        • Pika@sh.itjust.works (+1) · edited · 18 hours ago

          This right here is another fault in regulation that will eventually catch up with us, because especially with level three, where it’s primarily the vehicle driving and the driver just gives periodic input, it’s not the driver that’s in control most of the time. It’s the vehicle, so it should not be the driver at fault.

          Honestly, I think everything up to level two should be the driver’s fault, because those levels require constant driver input. However, level three (conditional driving) and higher should be the company’s liability, unless the company can prove that the autonomous control handed control back to the driver in a human-capable manner (i.e. not within the last second, as Tesla currently does).
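The rule proposed above (driver liable through level 2; manufacturer liable at level 3+ unless control was handed back with adequate warning) can be written down as a small decision function. The level cut-offs follow the comment; the minimum-warning threshold is a made-up placeholder, not a figure from any regulation.

```python
# Hypothetical minimum warning a "human-capable" hand-back would need.
# The 10-second value is an illustrative assumption, not a real standard.
MIN_HANDOVER_WARNING_S = 10.0


def liable_party(automation_level, handover_warning_s=None):
    """Return who bears liability under the commenter's proposed rule.

    automation_level:    SAE-style level, 0-5.
    handover_warning_s:  seconds of warning the driver got before control
                         was handed back, or None if it never was.
    """
    if automation_level <= 2:
        return "driver"  # levels 0-2 require constant driver input
    # Level 3+: manufacturer is liable unless it proves a timely hand-back.
    if handover_warning_s is not None and handover_warning_s >= MIN_HANDOVER_WARNING_S:
        return "driver"
    return "manufacturer"  # e.g. disengaging under a second before impact
```

Under this sketch, a level-3 system that cuts out 0.3 seconds before a crash would leave the manufacturer liable, while one that handed back control well in advance would not.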