Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads
“It affects all of us because we are essentially experiments in public roads.”

  • @[email protected]
    link
    fedilink
    English
    31 year ago

    Doesn’t seem too complicated… if ANY of the sensors sees something in the way that the system can’t resolve, it should stop the vehicle or force the driver to take over.
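
    Something like this naive policy, sketched in Python (the Detection type and decide() helper are made-up illustrations, not any real Autopilot interface):

    ```python
    # Toy sketch of the "any unresolved detection stops the car" rule above.
    # Detection and decide() are hypothetical, not any real vehicle API.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Detection:
        in_path: bool    # object lies in the planned driving path
        resolved: bool   # the system could confidently classify/track it

    def decide(detections: List[Detection], driver_attentive: bool) -> str:
        unresolved = [d for d in detections if d.in_path and not d.resolved]
        if not unresolved:
            return "continue"
        # Anything the system can't resolve forces a handover, or a stop
        # if the driver isn't responding.
        return "handover" if driver_attentive else "stop"

    # One camera sees something it can't resolve and the driver is inattentive:
    print(decide([Detection(in_path=True, resolved=False)], False))  # -> "stop"
    ```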

    • @[email protected]
      link
      fedilink
      English
      2
      edit-2
      1 year ago

      Then you have a very unreliable system that stops for no actual reason all the time, causing immense frustration for the user. Is it safe? I guess; cars that don’t move generally are. Is it functional? No, not at all.

      I’m not advocating unsafe implementations here; I’m just pointing out that your suggestion doesn’t actually solve the issue, because it leaves you with a solution that isn’t functional.

      • @[email protected]
        link
        fedilink
        English
        211 months ago

        If they’re using sensors so unreliable that they’re getting false positives all the time, the system isn’t going to be functional in the first place.

        • @[email protected]
          link
          fedilink
          English
          2
          edit-2
          11 months ago

          All sensors throw a shitload of false positives (or negatives) when used in the real world. That’s why filtering and unification between sensors is so important, and also really hard to do while still getting a consistent and reliable result.
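
          One common way to suppress those per-sensor ghosts is to only act on a detection that a second, independent sensor confirms. A toy Python sketch of the idea (sensor names, distances and thresholds are all made up, nothing like a production stack):

          ```python
          # Toy cross-sensor confirmation: keep a detection only if another,
          # different sensor reports an object at roughly the same distance.
          from itertools import combinations

          def fuse(reports, max_gap_m=1.0):
              """reports: list of (sensor_name, distance_m) along the path."""
              confirmed = []
              for (s1, d1), (s2, d2) in combinations(reports, 2):
                  if s1 != s2 and abs(d1 - d2) <= max_gap_m:
                      confirmed.append((d1 + d2) / 2)  # average the agreeing pair
              return confirmed

          # A lone radar ghost at 40 m is dropped; camera + radar agreeing
          # near 25 m is kept.
          print(fuse([("radar", 40.0), ("camera", 25.2), ("radar", 24.8)]))
          # -> [25.0]
          ```

          The tuning is exactly the hard part: too strict and you miss real obstacles, too loose and you brake for ghosts.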

    • Kogasa · 1 · 11 months ago

      “Seeing an obstacle” is a high-level abstraction. Sensor fusion is a lower-level problem. It’s fundamentally kinda tricky to get coherent information out of multiple sensors looking partially at the same thing in different ways. Not impossible, but the basic model is less “just check each camera” and more sheaves.
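
      Roughly, the sheaf-flavoured requirement is that each sensor’s local picture has to agree with every other sensor’s picture wherever their fields of view overlap before you can glue them into one global picture. A toy Python sketch of that consistency check (all fields of view, bearings and distances are invented):

      ```python
      # Toy "consistency on overlaps" check: sensors may only be merged into
      # one global picture if they agree wherever their fields of view overlap.

      def overlap(fov_a, fov_b):
          """Intersection of two angular FOVs given as (start_deg, end_deg)."""
          lo, hi = max(fov_a[0], fov_b[0]), min(fov_a[1], fov_b[1])
          return (lo, hi) if lo < hi else None

      def glueable(sensors, tol_m=1.0):
          """sensors: {name: (fov, {bearing_deg: distance_m})}.
          False if two sensors disagree at the same bearing in a shared FOV."""
          names = list(sensors)
          for i in range(len(names)):
              for j in range(i + 1, len(names)):
                  fov_i, obs_i = sensors[names[i]]
                  fov_j, obs_j = sensors[names[j]]
                  shared = overlap(fov_i, fov_j)
                  if shared is None:
                      continue
                  for bearing in obs_i.keys() & obs_j.keys():
                      if not (shared[0] <= bearing <= shared[1]):
                          continue
                      if abs(obs_i[bearing] - obs_j[bearing]) > tol_m:
                          return False  # local views don't glue coherently
          return True

      sensors = {
          "camera": ((-45, 45), {0: 24.9, 30: 60.0}),  # wide FOV, two objects
          "radar":  ((-10, 10), {0: 25.3}),            # narrow FOV, one object
      }
      print(glueable(sensors))  # -> True: they agree where both can see
      ```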