A judge has found “reasonable evidence” that Elon Musk and other Tesla executives knew the company’s self-driving technology was defective but allowed the cars to be driven in an unsafe manner anyway, according to a recent ruling issued in Florida.

Palm Beach county circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.

The ruling, reported by Reuters on Wednesday, clears the way for a lawsuit over a fatal crash in 2019 north of Miami involving a Tesla Model 3. The vehicle crashed into an 18-wheeler truck that had turned on to the road into the path of driver Stephen Banner, shearing off the Tesla’s roof and killing Banner.

  • bedrooms

    The concept of autonomous cars might be game over.

    As always, advocates forgot about corporate greed. Do you trust your manufacturer to not lie to you? So much you risk killing yourself, your family and people on the road?

    • @Ottomateeverything

      Yeah, the scary part of this is that, as much as I absolutely would never go near this shit with a ten-foot pole when it’s clearly still woefully inadequate and overhyped… they very frequently drive within ten feet of me, because for some reason it’s legal to put this shit on roads with unwilling participants.

      • @RedditRefugee69

        I get it, there’s an inevitable conflict of interest here, but we can’t really tell other people not to do things we don’t like in a free country

        Edit: this is clearly being misinterpreted. I am NOT talking about the Tesla. I’m saying a hypothetical, well-regulated self-driving car can be fielded without the permission of every other motorist that thinks they’re icky.

        • @jopepa

          Yes, we can tell people they can’t do things. Welcome to society we’ve all been talking and decided on a bunch of things people can’t do in a free country. It’s public roads, it’s entirely reasonable to have restrictions on self driving cars, just like you can’t ride a tandem bicycle in the HOV lane.

        • @[email protected]

          People get fined for having unsafe vehicles on public roads all the time. All that’s needed here is a regulatory body to decide self-driving cars are unsafe enough to revoke approval.

          • @RedditRefugee69

            Oh hell yeah if it’s unsafe. I’m making the finer point that saying “you don’t have the right to drive that car next to me cuz it makes me feel weird” is overstepping

            • @[email protected]

              I’m pretty sure the actual concern has less to do with “feeling weird” and more with “because it and/or its inattentive driver may suddenly kill me”, because of a dysfunctional self-driving system whose capabilities have been fraudulently marketed and which has, in reality, repeatedly killed people.

              • @RedditRefugee69

                They said “for some reason it’s legal to push [self driving cars in general] on unwilling participants”. That’s what I’m addressing

          • @RedditRefugee69

            Yeah but that is beyond what anyone would consider reasonable

              • @RedditRefugee69

                I’m saying that you don’t need everyone else’s permission to drive a safety regulated self-driving car. That’s it. I’m not talking about the Tesla


              • @RedditRefugee69

                Dude. You’re clearly not understanding the nuance of my point

                • @jopepa

                  lol the no step on snek guy is complaining about nuance while misquoting everyone.

        • @NewNewAccount

          in a free country

          What do you mean when you say this?

          • @RedditRefugee69

            I am referring to America, which prides itself on freedom (and not enough on equality and collectivism). I’m just saying it makes legal sense that you don’t need the consent of every other motorist to operate a self-driving car (if it passes safety regulations, and assuming no problems of regulatory capture). Both of those assumptions are not applicable here

            • @Nudding

              How can you take pride in freedom while simultaneously locking up more non-violent offenders than any state to ever exist, both in raw numbers and per capita?

        • @Fedizen

          truth in advertising laws exist for a reason

          also, the people who frequently talk about a “free country” are often the same ones that want more police so they can do Taliban-style gender policing, so the expression seems deeply inauthentic at this point.

          • @RedditRefugee69

            True about “free country” being used to justify a society controlled by extreme wealth. And I’m talking about another person’s right to “drive” a self-driving car next to me, not about these guys objectively being criminally ass-hole-y

    • @Fedizen

      Autonomous vehicles ten years ago: Human drivers are slow and prone to lapses in judgement

      Autonomous vehicles today: Elon Musk, a guy who famously destroyed a rare vehicle like a dumbass, will be training the AI that drives you around. It won’t know how to respond to an event not encountered in the training data, and it will occasionally run into an ambulance

      • @jopepa

        And his employees hate him so much I wouldn’t be surprised if there’s a patch released that makes one sustained fart noise when airbags deploy.

    • @[email protected]

      Maybe, at least until there’s a better comprehensive infrastructure of external sensors on the road, at intersections, etc., to control and limit vehicle movement. But it will probably be a long while before we get those improvements, considering normal routine road and bridge maintenance is already far behind as it is.

    • @[email protected]

      To me, autonomous vehicles are like AI (in Tesla’s case, it actually is AI): the public perception is that they’re way better than they really are, because they’re really good in 80% of cases. But getting to 90-95% will still take many, many years. That doesn’t mean we should stop using them, or abandon them. To progress, we have to keep using them with caution: learn the limits and work within them. Don’t start firing people to be replaced with AI, because in a few months or years you’ll realize that the 20% left to improve hurts more than you thought. In the same way, you shouldn’t remove drivers just yet.

      • @IphtashuFitz

        But it’s not true AI. In my decades of experience driving cars I’ve encountered numerous edge cases that I never explicitly learned about during my driver’s ed days. One recent case in point: I pulled up to a red light at a fairly busy intersection and stopped. While the light was still red, a police officer on the corner at a construction site walked out and tried to wave me through the intersection. I was watching the red light, so I didn’t even see him until he yelled at me.

        How would an autonomous AI car handle that situation if it’s not explicitly trained to recognize it? It would need to recognize the police officer as an authority that legitimately overrides the red light.

        Same intersection a few years earlier I saw a car engulfed in flames right in the middle of it. I saw & heard the fire trucks rapidly approaching as I got to the intersection. I, and others, realized we needed to get out of the way quickly. Would a Tesla AI(or any other) recognize the car is on fire and safely move away, or would it just recognize the shape of the car and patiently wait for it to move out of the intersection before proceeding?

        The point is that it’s virtually impossible to predict for, and program an AI to handle, every single situation it might ever encounter. A true AI would be trained on a lot of these sorts of scenarios but would need to be capable of recognizing edge cases it hasn’t encountered before as well. It would then need to react as safely as possible to those edge cases in a manner similar to how a human would.

        Edit: Downvotes must be from Tesla fanbois who can’t face reality. If the had legitimate arguments they would have replied…

        • @[email protected]

          This is why AI is a solution, rather than coding everything by hand. How does one learn how to react in these situations? Either you’ve learned by watching your parents, by taking lessons, by reading the code, or simply by following others. The goal of an AI is to be able to do just that. Coding every single use case is way too complex.

          I know Tesla has worked on improving handling of emergency-vehicle situations, but I don’t know how, or what the current state is.

          Why are you being downvoted?

    • @[email protected]

      I think you need protected ways where no people or non autonomous vehicles may enter. Shy of that, I think you’re right.

          • @[email protected]

            Yes, but what you said sounded like we needed to make protected ways. It should be them making protected ways.

            • @[email protected]

              I’m saying that this is stupid technology that will only work if you separate it from the public.

    • @[email protected]

      Hell yeah, let these drivers behind the wheel plow into more semi trucks. They deserve it after all. /s

    • @Death_Equity

      The Wright brothers’ first flight was shorter than the wingspan of a Boeing 747, an aircraft with a range of over 8,000 miles. The Internet was once called a fad.

      Autonomous cars will be the future, and people will die before they become the de facto method of personal transport. The unwilling sacrifices of a public alpha test of the technology are worth the losses we must endure to achieve the unparalleled safety of ubiquitous autonomous vehicles that mitigate traffic congestion, pedestrian deaths, unwieldy public transit, and the shortcomings of urban sprawl.

      The deaths caused by early adoption benefit the greater good, and we should be willing to accept their loss as a necessary evil.

      Not that I would ever trust a computer to drive my car. I will drive my own car until it kills me, financially or literally, but I can see what good an imperfect system struggling with growing pains will create.

  • @[email protected]

    It’s definitely insane how early Tesla started selling their “self-driving” cars. The fact that there are people who paid for self-driving and then never got more than a Level 3 system is insane.

    • @polygon6121

      Even the beta is still considered a Level 2 system. Level 3 would require the system to conditionally take over in certain situations; you’d quickly win a Darwin Award if you consistently trusted FSD in any given situation.
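For context, the SAE J3016 driving-automation levels the thread is arguing about can be sketched as a small lookup table (descriptions paraphrased for illustration, not the official wording):

```python
# Paraphrased summary of the SAE J3016 driving-automation levels (0-5).
SAE_LEVELS = {
    0: "No automation: the human does all the driving.",
    1: "Driver assistance: steering OR speed support (e.g. adaptive cruise).",
    2: "Partial automation: steering AND speed support; human must supervise at all times.",
    3: "Conditional automation: system drives in limited conditions; human must take over on request.",
    4: "High automation: system drives itself in limited conditions; no takeover expected there.",
    5: "Full automation: system drives itself everywhere, under all conditions.",
}

def requires_constant_supervision(level: int) -> bool:
    """At Levels 0-2 the human is still responsible for monitoring the road."""
    return level <= 2
```

The key boundary is between Level 2 and Level 3: below it, the human is legally the driver at all times, which is why calling a Level 2 system “Full Self-Driving” is what the comments above object to.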

  • bedrooms

    Maybe they should start a new motor racing series where autonomous cars race for 24 hours at 80 km/h with random people walking on the circuit. Then we can trust autonomous cars.

      • bedrooms

        Ha, was thinking of Le Mans. Actually, they started Le Mans because, in 1923, cars were so bad nobody trusted their durability.

  • @Nobody

    “We decided to bring the issue to Mr. Musk after the 5000th child died in the simulations. He asked if the children were going to be white.”

  • @[email protected]

    AI employee: “we can’t release the cars, senior muskrat! we just don’t have enough test data to guarantee the algorithm works!”

    Senior Muskrat: “test data, you say…”

    galaxy brain intensifies

  • LittleHermiT

    The real elephant in the room with AI is that when it works, the network has been over-fitted to the output. And when something completely novel is fed into it, it spits out nonsense and runs over your dog because it looked like a shadow on the asphalt. Poor Fido.