Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads::“It affects all of us because we are essentially experiments in public roads.”

  • @linearchaos
    -3 points · 6 months ago

    Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets. They need the scale, and the public traffic, and the idiots and the drunkards and the kids speeding. The only thing that’s going to stop them from working on autopilot will be that it’s no longer financially reasonable to keep going. Even a couple handfuls of deaths aren’t going to stop them.

    • @Ottomateeverything
      34 points · 6 months ago

      Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets.

      Even if we hold this to be true (and I disagree in large part), the point is that Tesla’s systems aren’t at that stage yet. Failing to recognize lights correctly during live demos and such are absolutely things you can test and develop on closed streets or in a lab. Teslas shouldn’t be allowed on roads until they’re actually at a point where there are no glaring flaws. And even then, they should only be allowed in smaller numbers.

      • @linearchaos
        -21 points · 6 months ago (edited)

        Do you really think they didn’t test that before they got to this point?

        I’m willing to bet they had been through that intersection before hundreds of times and never seen this. It’s not like it can’t detect a stoplight and they’re just out there randomly running through them all.

        Of the millions of variables around them, something blinded it to the light this time. The footage from that run has probably been reviewed ad nauseam at this point, and it will do more toward finding the problem than they could have done sitting in a closed warehouse making guesses while the car never fails to detect a red light.

        edit: look, keep smacking that downvote, but it’s not going to change anything. I hate Musk too, but we’re going to make progress toward automated driving unless it becomes more dangerous than existing drivers. In the next generation or so, most driving will become automated and deaths by automobile will drop significantly. Old and young people will get where they need to go. You cannot automate driving without driving in the real world. If you think they haven’t been doing this in a simulation for a decade, you’re on crack.

        • @pivot_root
          23 points · 6 months ago

          I still wouldn’t trust the company with a CEO who unilaterally decided that not having redundant systems makes for a better product.

          • @linearchaos
            1 point · 6 months ago

            I absolutely don’t trust the CEO. I don’t even need to trust the company, there are a dozen others trying to work out the same problem.

        • Tar_Alcaran
          16 points · 6 months ago

          Do you really think they didn’t test that before they got to this point?

          Yes.

          • @linearchaos
            -8 points · 6 months ago

            Well, if you’re not going to discuss things in good faith, goodbye.

    • @TheGrandNagus
      10 points · 6 months ago (edited)

      That’s true, but I think the issue people have with “AutoPilot” is about marketing.

      Tesla brands their cars’ system as a full replacement for human interaction, and word from Musk, other Tesla employees, media personalities close to Tesla, and fanboys all makes it out like the car drives itself and the only reason you need a driver in place is to satisfy the law.

      It’s bullshit. They know exactly what they’re doing when they do the above, when they call their system “AutoPilot”, when Musk claims his cars can travel from one side of the US to the other without human interaction (only to never actually do it, of course!), and when they sell car upgrades as “Full Self-Driving” support.

      If they branded it as Assisted Driving, Advanced Cruise Control, Smart Cruise, or something along those lines, like all the other carmakers do with their similar systems, I’d be less inclined to blame Tesla when there’s an unfortunate incident. I think most would agree with me, too.

      But Tesla markets and encourages, both officially and unofficially, that their cars have the ability to drive themselves, look after themselves, and that you’re safe when using the system. It’s a lie and I’m absolutely astounded they’ve had little more than a series of slaps on the wrist for it in most markets.

      • @linearchaos
        -2 points · 6 months ago

        100% accurate.

        They want people to use it so they get data from it. Accidents and deaths will happen… honestly, they’ll always happen… they happen now without it; it’s just more acceptable because it’s human error. Road safety is absolutely awful.

        The reason they get away with it is lobbying, money, and political favors. They got where they are by greasing a whole shit ton of wheels with dump trucks of money.

        Shitty means, but pretty righteous ends.

      • @Takumidesh
        -1 point · 6 months ago

        Tbf, Teslas are the only cars that actually know you are on your phone and/or not paying attention.

        • Flying Squid
          4 points · 6 months ago

          And then make you look at and use a giant touchscreen to control anything, taking your attention off the road.

        • Zoolander
          3 points · 6 months ago (edited)

          Removed by mod

          • @Takumidesh
            1 point · 6 months ago

            So it is true, but it wasn’t true in the past? It can also be disabled by breaking the camera or putting tape over it. The point is that it does do that, which is a true statement.

    • @[email protected]
      0 points · 6 months ago

      Should a couple handfuls of deaths stop them if, as you said, you can’t test it any other way? Autopilot systems could already be saving thousands of lives if more widely deployed, and the lack of good, reliable autopilot systems has the opportunity cost of blood on our hands. Human drivers are well established to be dangerous. Testing and release of autopilot systems should be done as safely as possible, but to think the first decade or so of these systems will be flawless seems unreasonable.

      • @linearchaos
        -5 points · 6 months ago

        The same happened with airplanes.

    • gregorum
      -3 points · 6 months ago

      The fact is that most technology that we take for granted today went through a similar evolutionary phase with public use before they became as safe as they are now, especially cars themselves. For well over a century, the automobile has made countless leaps and bounds in safety improvements due to data gathered from public use studies.

      We learn by doing.

      • kingthrillgore
        4 points · 6 months ago

        That’s fine, but Waymo, Cruise, et al. do trials on closed courses and in cooperation with states to assure a high degree of public safety. Tesla is testing without asking regulators.

        • gregorum
          2 points · 6 months ago

          Do they? I honestly have little to no knowledge of how those companies gather their data, which is why I didn’t mention them. Can you provide any links to information about them? I’d genuinely like to learn more.