Tesla recalls nearly all vehicles sold in US to fix system that monitors drivers using Autopilot::Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system when using Autopilot.

  • @givesomefucks
    38 points · 11 months ago

    The attempt to address the flaws in Autopilot seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

    "This technology is not safe, we have to get it off the road,” said Angulo, who is suing Tesla as he recovers from injuries that included brain trauma and broken bones. “The government has to do something about it. We can’t be experimenting like this.”

    This is the important part: it’s not just Tesla not giving a shit about their customers’ safety, it’s dangerous to literally anyone on the same road as a Tesla.

    • @[email protected]
      9 points · 11 months ago

      This case went to trial already, and the jury sided with Tesla because the driver was holding his foot on the accelerator to override cruise control, ignored the vehicle’s warnings, drove through numerous flashing lights, crashed through a stop sign, and then hit the couple, later stating “I expect to be the driver and be responsible for this… I was highly aware that was still my responsibility to operate the vehicle safely.” Any vehicle is capable of doing this with a reckless driver behind the wheel.

    • @[email protected]
      -17 points · 11 months ago (edited)

      using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

      Idiot drivers do idiot things, hardly unique to Tesla drivers. The whole Autopilot feature set is clearly marked as beta on the screen, with a warning you have to acknowledge before enabling it that it is considered a beta feature and that you should stay attentive while using it.

      I agree that the FSD feature set is advertised with capabilities it in no way possesses. But everyone seems to forget that Autopilot and FSD are two separate things. Autopilot is only TACC+Lane-Assist and isn’t advertised as a fully self-driving technology.

      • @givesomefucks
        14 points · 11 months ago

        But everyone seems to forget that Autopilot and FSD are two separate things.

        Because Tesla deliberately made it as confusing as possible and misleads consumers…

        Which is why they’re getting sued over it as I type this…

        • @[email protected]
          -6 points · 11 months ago

          I’m honestly not sure why people have trouble differentiating FSD and Autopilot; they have separate names that don’t even sound similar.

          I 100% agree that FSD has been falsely advertised for almost a decade now though.

          • @[email protected]
            -1 point · 11 months ago

            It’s because they think they’re experts after superficially reading some news headlines, and don’t actually do any research or use either of these systems to gain the appropriate knowledge. They’re just playing a game of telephone with what they think they know.

      • @[email protected]
        11 points · 11 months ago

        The system should nevertheless be designed to handle all types of road and weather conditions. It’s a safety-related system. To not do so, regardless of the reason (probably cost savings), is negligence on the part of Tesla.

        • @[email protected]
          0 points · 11 months ago

          A Tesla on Autopilot isn’t any more dangerous than any other vehicle sold with cruise control over the past 30 years. I don’t understand why people are so desperate to give reckless drivers a pass rather than making them face consequences for their actions. Is it Honda’s fault if I hold my foot on the gas and drive a 1995 Civic through a red light and T-bone someone?

        • @[email protected]
          -1 point · 11 months ago (edited)

          It handles the same roads and conditions as pretty much all other manufacturers do with their TACC+Lane-Assist solutions. It maintains distance to the vehicle ahead and keeps the car centered in the lane on straight roads and gentle curves… nothing more.

          The issue is people using this simple ADAS way outside its constraints, wanting it to be more than it is or is advertised as. Autopilot is not a self-driving solution and isn’t advertised as one. It has the same limitations as ADAS solutions in other cars, but apparently because Tesla calls its solution “Autopilot”, people completely disregard the system’s warnings and limitations and insist on using it as a self-driving solution.
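
          For context, here is a minimal sketch of what a TACC+lane-centering loop amounts to. It is purely illustrative: every gain, limit, and signal name below is invented, not Tesla’s or any vendor’s actual implementation.

          ```python
          # Hypothetical TACC + lane-centering sketch -- all gains, limits,
          # and signal names are invented for illustration.

          def tacc_accel(gap_m: float, closing_speed_mps: float,
                         desired_gap_m: float = 40.0) -> float:
              """Longitudinal control: regulate the gap to the lead vehicle."""
              k_gap, k_close = 0.05, 0.4
              accel = k_gap * (gap_m - desired_gap_m) - k_close * closing_speed_mps
              return max(-3.0, min(1.5, accel))  # clamp to comfortable limits

          def lane_center_steer(lateral_offset_m: float,
                                heading_error_rad: float) -> float:
              """Lateral control: nudge back toward the lane center. Only
              sensible on straight roads and gentle curves, per the
              constraint described above."""
              k_off, k_head = 0.1, 0.8
              return -(k_off * lateral_offset_m + k_head * heading_error_rad)
          ```

          A loop like this only regulates two error terms. Nothing in it reasons about cross traffic, stop signs, or merging vehicles, which is exactly why using it as self-driving goes outside its constraints.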

          • @[email protected]
            2 points · 11 months ago (edited)

            Sounds like Tesla should market it as adaptive cruise and lane assist, since clearly their clientele think Autopilot means autonomous driving.

            We’re getting to the point where these features might need to be locked behind a specific driving license designation. Drivers need to demonstrate to a proctor that they understand where the systems do and do not work.

            We already have license classifications for commercial equipment and motorcycles, having one for automated features seems fairly justifiable at this point.

      • @AdamEatsAss
        3 points · 11 months ago

        You would think a road with few drivers would be easier for Autopilot? But maybe the road lacked lines and markings? But wouldn’t you want the car to default to human control, not just keep going? Any car I’ve had with lane-keep assist turns off if it can’t find the lines. It’s a pretty simple failsafe. Better to have the driver a little annoyed than injured.

          • @AdamEatsAss
            1 point · 11 months ago

            Exactly. Not sure if there’s any regulation for this, but from a controls standpoint you always want to make sure you fail to a “safe” state. If the system can’t find the inputs it needs, the outcome will be unpredictable.
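
            A minimal sketch of that fail-to-safe idea, assuming a hypothetical lane-detection confidence signal (every name and threshold here is made up for illustration):

            ```python
            # Hypothetical fail-safe gate: if lane detection degrades, disengage
            # and hand control back to the driver rather than steer on bad input.

            from typing import Callable, Optional

            MIN_LANE_CONFIDENCE = 0.8  # invented threshold

            def safe_steer(lane_confidence: float, correction_rad: float,
                           alert_driver: Callable[[str], None]) -> Optional[float]:
                if lane_confidence < MIN_LANE_CONFIDENCE:
                    alert_driver("Lane lines lost -- take over steering now")
                    return None  # None = assist disengaged; the driver steers
                return correction_rad
            ```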

          • @[email protected]
            1 point · 11 months ago

            And the driver in the above case was holding his foot on the accelerator to override AP and stated “I expect to be the driver and be responsible for this… I was highly aware that was still my responsibility to operate the vehicle safely.”

  • @proudblond
    14 points · 11 months ago

    I live in a major metropolitan area, drive a model 3, and almost never use autopilot.

    I am lucky enough to rarely be in stop-and-go traffic, but when I am, I don’t even use cruise control, because it’s too reactive to the car in front of me and consequently too jerky for my taste.
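
    That jerkiness is what you get when the controller chases every speed change of the lead car; a common fix is to low-pass filter the commanded acceleration. A purely illustrative sketch (in no way Tesla’s actual code):

    ```python
    # Illustrative only: exponential smoothing of a commanded acceleration so
    # the car doesn't mirror every twitch of the lead vehicle in stop-and-go.

    class SmoothedAccel:
        def __init__(self, alpha: float = 0.15):
            self.alpha = alpha  # smaller alpha = smoother, but slower to respond
            self.value = 0.0

        def update(self, raw_command: float) -> float:
            self.value += self.alpha * (raw_command - self.value)
            return self.value
    ```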

    As for Autopilot, I was on a relatively unpopulated freeway in the second lane from the right when a small truck came around a cloverleaf to merge into the right lane next to me. My car flipped out and slammed on the brakes. The truck wasn’t even coming into my lane; he was just merging. Thankfully there was a large gap behind me, and I was also paying enough attention to immediately jam on the accelerator to counteract it, but it spooked me pretty badly. And this was on a road that it’s designed for.

    Autopilot (much less FSD) can’t really think like our brains can. It can only “see” so far ahead and behind. It can’t look at another driver’s behavior and assess that they might be distracted or drunk. We’re not there yet. We’re FAR from there.

    • ∟⊔⊤∦∣≶
      6 points · 11 months ago

      I don’t think AI will ever be able to account for context, like ‘Black Friday sales are on and the general trend on the road is a lot of fuckery, so I’ll drive extra extra carefully today.’

      • @poopkins
        8 points · 11 months ago

        Or things like, that idiot’s mattress is going to fly off his roof any second, I’m not driving behind him.

  • @ieightpi
    3 points · 11 months ago

    Why can’t we just restrict Autopilot to the interstate system for the time being? There’s no reason a person needs to be attentive for miles on end with no stops. And if the tech isn’t there for complicated traffic patterns in suburban and urban environments, make it inaccessible unless you’re on the interstate. Seems like an easy enough fix for the time being.
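
    Gating a feature by road type is straightforward in principle if map data is available. A minimal sketch, assuming a hypothetical road-class lookup (get_road_class and the class names are invented for illustration):

    ```python
    # Hypothetical geofence gate: only allow the assist to engage on
    # controlled-access highways per map data. The lookup is assumed, not real.

    from typing import Callable

    ALLOWED_ROAD_CLASSES = {"interstate", "motorway"}

    def may_engage(lat: float, lon: float,
                   get_road_class: Callable[[float, float], str]) -> bool:
        return get_road_class(lat, lon) in ALLOWED_ROAD_CLASSES
    ```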

  • AutoTL;DRB
    2 points · 11 months ago

    This is the best summary I could come up with:


    DETROIT (AP) — Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system that’s supposed to ensure drivers are paying attention when using Autopilot.

    The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use.

    But safety experts said that, while the recall is a good step, it still makes the driver responsible and doesn’t fix the underlying problem that Tesla’s automated systems have with spotting and stopping for obstacles in their path.

    The attempt to address the flaws in Autopilot seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

    Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that doesn’t address a lack of night vision cameras to watch drivers’ eyes, as well as Teslas failing to spot and stop for obstacles.

    In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”


    The original article contains 1,030 words, the summary contains 235 words. Saved 77%. I’m a bot and I’m open source!

  • @arin
    1 point · 11 months ago

    A recall for software? What happened to over-the-air updates?

    • @[email protected]
      5 points · 11 months ago (edited)

      Recalls can be delivered as OTA updates; a recall doesn’t necessarily mean you need to bring your car to a dealership or shop.