Tesla reportedly asked highway safety officials to redact information about whether driver-assistance software was in use during crashes

Elon Musk’s Tesla has faced investigations into Autopilot, including an ongoing NHTSA probe of more than 800,000 Teslas after several crashes.

  • @[email protected]
    link
    fedilink
    English
68 · 1 year ago

    Tesla’s business model is pushing half-baked sloppy software into production and putting lives in danger.

  • @betterdeadthanreddit
    link
    English
47 · 1 year ago

    Since everybody knows that problems just magically disappear when their consequences are hidden.

    • @foggy
      link
      English
16 · 1 year ago

      Especially really smart businessmen!

  • @jeffw
    link
    English
21 · 1 year ago

The original New Yorker article is an incredible read. I recommend it.

      • @jeffw
        link
        English
4 · 1 year ago

        Yes. Ronan Farrow is an excellent journalist

    • @[email protected]
      link
      fedilink
      English
1 · 1 year ago

      Is it? Admittedly, when I was reading it the first time I had to skim a lot because I was in a rush and had to put my phone away.

      I want to get back to it, in fact I will now because it’s my lunch break, but my first thought was that it was kinda tame. Yeah the stuff about the gov not liking their dependence on his services was good, but General Milley sounded like they got along great. It was weird.

AutoTL;DR (bot)
    link
    fedilink
    English
9 · 1 year ago

    This is the best summary I could come up with:


Tesla directed the National Highway Traffic Safety Administration to redact information about whether driver-assistance software was being used by vehicles involved in crashes, The New Yorker reported as part of an investigation into Elon Musk’s relationship with the US government.

“The Vehicle Safety Act explicitly restricts NHTSA’s ability to release what the companies label as confidential information.”

Autopilot, which is meant to help on highways, is the driver-assist software built into all Teslas, while Full Self-Driving is a beta add-on that costs $15,000.

    Full Self-Driving is more advanced, and allows cars to change lanes, recognize stop signs and lights, and park, Tesla says.

    Two months later, the agency announced another investigation into the feature after identifying 11 crashes since 2018 in which Teslas hit vehicles at first-responder sites.

    A Department of Justice criminal investigation has also been underway, with Tesla confirming in February that the DOJ requested documents about the Autopilot and Full Self-Driving features.


    The original article contains 446 words, the summary contains 155 words. Saved 65%. I’m a bot and I’m open source!

  • @Eideen
    link
    English
-14 · 1 year ago

I don’t see the problem for Tesla. Regardless of whether Autopilot is active or not, the driver is responsible.