‘Boycott Tesla’ ads to air during Super Bowl — “Tesla dances away from liability in Autopilot crashes by pointing to a note buried deep in the owner’s manual that says Autopilot is only safe on fr…”

  • @Geyser · 50 points · 16 days ago

    Take Tesla for whatever you will, but there’s crazy conflict of interest behind the dawn project/Dan o’dowd attacking this.

    “Green Hills also develops automotive software—it’s about 40 percent of the company’s business, O’Dowd said—and is a software supplier for the 2022 BMW iX EV crossover. This has caused O’Dowd critics and Tesla fans to call out The Dawn Project’s conflict of interest and question the organization’s motives.”

    https://www.motortrend.com/features/tesla-full-self-driving-ban-attempt-elon-musk-dan-odowd/

    • @cm0002 · 24 points · 16 days ago

      Def a conflict of interest, but they aren’t necessarily lying. I personally lost all trust in Tesla’s Autopilot the day Musk stated (paraphrasing) that “aLl wE NeED arE CAmeRaS”. Like, no thanks, a safety-critical system should have a fallback. Cameras fuck up a lot, and LiDAR has come down a lot in cost since then, but anybody who actually even halfway knows what they’re doing could have predicted that.

      And then all the things Musk’s done since just solidified that lol

      I am so ready to get a fully autonomous car, but not a fucking Tesla that’s for damn sure, I’ll wait for some other manufacturer.

      • @[email protected] · -34 points · edited · 16 days ago

        It works surprisingly well. You should try it before belting out under-educated (trying to say this as kindly as I can here!) comments.

        When it’s rainy and such, it’ll warn you, tell you it’s a dumb idea, then limit its speed. And really, with any driver assist program, the driver should still be paying attention.

        • @TheMinions · 12 points · 16 days ago

          I have used both. Even under ideal conditions, the cameras-only system is noticeably worse.

          I recall parking in a parking lot where the camera system showed I was hitting a car while I was still about a foot away from it. LiDAR didn’t have any issues like that. And it was a perfectly sunny day.

          • @[email protected] · -5 points · 16 days ago

            So Tesla hasn’t used LiDAR, and they’ve had the front radar disabled for a while now. Lately they also decided to drop the ultrasonic sensors, and I think it’s crap for parking right now. I suspect the cameras were not laid out to handle this use case.

            If you have a hard requirement on tight parking, I’d strongly recommend looking elsewhere or trying it with a trial first…

        • @TheGrandNagus · 11 points · 16 days ago

          They aren’t wrong. Camera systems do have a lot of issues. Calling someone stupid for pointing that out is absurd.

          Cameras can mess up when they’re dirty, when it’s raining, when it’s dark, when there’s glare.

          Image processing and recognition are also a hell of a lot slower and more computationally expensive than the numbers from a LiDAR system.

          • @[email protected] · -14 points · edited · 16 days ago

            Didn’t intend to call them stupid. My intent was to say they should try out a product that uses the tech before passing judgement on its effectiveness. I was hoping the use of “under” would be taken a lot more lightly.

            Though self-driving doesn’t have to work in 100% of scenarios either. It doesn’t have to be able to activate all the time. It’s cloudy/sunny 90%+ of the days where I’m at.

            We would get nowhere if the criteria was “must be 100% useful in 100% of scenarios 100% of the time”.

            • @TheGrandNagus · 10 points · 16 days ago

              Everybody is aware it’s not going to be flawless out of the gate, or perhaps ever. That’s not the point.

              They’re saying using cameras and only cameras is a bad idea and makes “Autopilot” inherently subpar in situations where cameras don’t work so well.

              • @[email protected] · -4 points · edited · 16 days ago

                Sure, but having actually driven one instead of just commenting on the internet, the issues that have come up with AP/FSD seem to be more data-related (bad speed-limit data causing slowdowns).

                When there are vision issues the car complains and it’s up to you as the driver to take control. I don’t care how many sensors are or aren’t on the car; if it tells you it can’t handle the situation safely, that’s now on you as the driver.

                Plus, vision-only driver assist isn’t new, and Tesla isn’t the first on the road with it. I’ll let you figure out which particularly popular Japanese brand known for its crossovers got there first 😉

                • @[email protected] · 10 points · 16 days ago

                  Never go full fanboy… try actually understanding what people are telling you: why fail-safes are a good idea, why redundant systems are implemented, etc.

                  Your anecdotal experience of “I drive in fog and rain and it’s been good” means nothing…

                • @TheGrandNagus · 2 points · 16 days ago

                  When there are vision issues the car complains

                  Except for when it doesn’t and the car plows straight into the back of a lorry. Or runs over a teenager getting off a school bus. Etc.

        • @batmaniam · 11 points · 16 days ago

          If I still have to pay attention, what my Hyundai has is fine. I think that’s the biggest issue with any FSD: there are sharply diminishing returns past a point. If I can’t take a nap, I don’t care. And I’ll trust any of them the day there are laws on the books imposing a $7.5MM fine per casualty.

          Those cars make the manufacturer money while foisting the risk onto the public. People talk about stats compared to human drivers, but it’s not about stats, it’s about accountability.

          • @[email protected] · -1 point · 16 days ago

            That’s fair, and it’s a reasonable line for “usefulness”. Of all driver-assist tech, good lane centering plus adaptive cruise control are the biggest, most useful features, the ones I think everyone would see big benefits from, and there’s been a lot of improvement in this area over the years.

            Tesla’s AP+FSD has been incredibly rock solid on that front, to the point where most roads feel like you’re on a rail track.

        • @cm0002 · 6 points · edited · 15 days ago

          First of all, Mr. Fanboy, Musk has been touting it as a Full Self-Driving program for a long time. They’ve somewhat stopped that recently, after regulators started looking into it, and you cite their website as if they wouldn’t change it lmao.

          You’re even commenting in a thread about an article on their deceptive practices with it.

          Second, it’s quite clear from my comment that I’m not interested in a “Driver Assist Program”; I’m interested in a fully autonomous vehicle. “Driver Assist” is a nice feature, but not one I’m going to buy a whole new car for. When Tesla eventually (if ever) comes out with their Full Full Self Driving, though, I won’t get it, because they’ve lost my trust with all these shenanigans they’re pulling now.

          Third, there’s absolutely a need for a life-and-death safety-critical autonomous system to have another “sight” system. There’s not even a cost excuse anymore, and disabling the existing sensors is asinine and potentially dangerous. Musk is just incapable of admitting when he’s wrong and revising his plan like an adult.

          Camera-only systems get things wrong all the time; they have gaps and scenarios where they just don’t work all that well. LiDAR does too, but in different situations, so their strength comes from melding them together to complement each other.
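          To make the complementary-strengths point concrete, here’s a toy sketch (all numbers and condition names are invented for illustration, not real sensor data): if each sensor tends to fail in different conditions and those failures are roughly independent, a fused system only misses a detection when both sensors miss.

```python
def fused_confidence(camera_conf: float, lidar_conf: float) -> float:
    """Treat each sensor's miss as independent: the fused system only
    misses if BOTH sensors miss, so fused = 1 - (1-cam) * (1-lidar)."""
    return 1.0 - (1.0 - camera_conf) * (1.0 - lidar_conf)

# Illustrative (made-up) per-condition detection confidences.
conditions = {
    "clear day":    (0.99, 0.99),
    "heavy rain":   (0.40, 0.85),  # cameras degrade badly, LiDAR less so
    "direct glare": (0.30, 0.97),  # glare blinds cameras, not LiDAR
    "night":        (0.60, 0.98),
}

for name, (cam, lidar) in conditions.items():
    print(f"{name:12s} camera={cam:.2f} lidar={lidar:.2f} "
          f"fused={fused_confidence(cam, lidar):.2f}")
```

          Even with these made-up numbers, the fused column stays high in conditions where the camera column alone collapses, which is the commenter’s point about melding the two.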

          Stop licking Elon’s boot and get a mind of your own.

          • @[email protected] · -9 points · edited · 15 days ago

            I take it that writing a thought-out civil comment was out of the question.

            I’d be open to a conversation should this be rephrased to something more productive.

    • @[email protected] · 15 points · 15 days ago

      Yeah, no way shitting on a company could be worth the price of a Super Bowl ad unless you’re also peddling something yourself.

      • @HeyThisIsntTheYMCA · 2 points · 15 days ago

        I dunno, if I had a billion dollars I’d consider a “screw you, White Castle” ad just for shits and giggles (mostly shits).

  • AutoTL;DR (bot) · 11 points · 16 days ago

    This is the best summary I could come up with:


    A California tech entrepreneur is paying more than half a million dollars for Super Bowl ads criticizing Tesla for not disabling its Autopilot technology outside the conditions for which it was designed, a problem highlighted by a Washington Post investigation this past fall and later cited in a recall of virtually every U.S. Tesla equipped with Autopilot, around 2 million vehicles.

    In one, a 17-year-old was severely injured when a Tesla struck him at 45 mph as he disembarked a school bus in North Carolina that had its stop sign out and warning lights flashing.

    The ad makes reference to prior Dawn Project videos depicting the alleged failure of Teslas to react to child-size mannequins in the road — including last year’s Super Bowl commercial, which aired weeks before the North Carolina crash.

    The other ad set to air during this year’s game shows the crash that killed a 50-year-old father in 2019 when his Tesla drove under a semi-truck trailer and the moment a Tesla blew through a stop sign and blinking lights on a rural Florida road as it barreled toward a parked vehicle and flung a young couple into the air, killing one of them and leaving the other severely injured — footage first published by The Post.

    The company is facing concerns over stagnating revenue, mounting worries about its capacity to deliver long-promised “Full Self-Driving” technology, and Wall Street hand-wringing over the persistent distraction of its mercurial CEO.

    Musk has asked for a larger stake in the company as a condition for “growing Tesla to be a leader in AI & robotics,” saying that without 25 percent control he “would prefer to build products outside of Tesla.” But some investors have not given the idea a warm reception.


    The original article contains 770 words, the summary contains 292 words. Saved 62%. I’m a bot and I’m open source!

    • @[email protected] · 10 points · 16 days ago

      Here’s the whole article:

      A California tech entrepreneur is paying more than half a million dollars for Super Bowl ads criticizing Tesla for not disabling its Autopilot technology outside the conditions for which it was designed, a problem highlighted by a Washington Post investigation this past fall and later cited in a recall of virtually every U.S. Tesla equipped with Autopilot, around 2 million vehicles. It’s the second consecutive year Tesla critic Dan O’Dowd has run an ad campaign on television’s biggest night. He leads the Dawn Project, a group that has sought a ban on Tesla’s driver-assistance technology. The latest campaign is unequivocal: “Boycott Tesla,” it says, following footage of deadly and severe crashes involving its vehicles.

      One ad features footage of Teslas running over child-size mannequins, depictions that have previously led Tesla to issue a cease-and-desist letter. O’Dowd said he was compelled to bring awareness to the latest issue with what he calls “the most incompetent software I’ve ever seen” in part by The Post’s investigation.

      O’Dowd founded Green Hills Software, which makes operating systems for cars and airplanes. “What possible reason is there that they don’t disable Autopilot on roads that they say are not safe?” he asked of Tesla.

      Tesla and its chief executive, Elon Musk, did not respond to a request for comment. Musk’s followers have accused O’Dowd of a conflict of interest because one of Green Hills Software’s customers is Mobileye, which develops driver-assistance software. O’Dowd says his motivation stems purely from concerns over Tesla’s tech. Tesla maintains that its software is intended to be used by a fully attentive driver and argues that it is “morally indefensible not to make these systems available to a wider set of consumers,” citing figures it says show a lower prevalence of crashes when its software is activated. “The people would have it banned if they only knew the truth, if they only understood what it would do,” O’Dowd said. “Well, that’s our job. The politicians aren’t going to move until the public moves.”

      The two ads highlight three significant crashes alleged to have involved Autopilot. In one, a 17-year-old was severely injured when a Tesla struck him at 45 mph as he disembarked a school bus in North Carolina that had its stop sign out and warning lights flashing. “Still Tesla does nothing,” the ad concludes. “Boycott Tesla to keep your kids safe.” The ad makes reference to prior Dawn Project videos depicting the alleged failure of Teslas to react to child-size mannequins in the road — including last year’s Super Bowl commercial, which aired weeks before the North Carolina crash.

      The other ad set to air during this year’s game shows the crash that killed a 50-year-old father in 2019 when his Tesla drove under a semi-truck trailer and the moment a Tesla blew through a stop sign and blinking lights on a rural Florida road as it barreled toward a parked vehicle and flung a young couple into the air, killing one of them and leaving the other severely injured — footage first published by The Post. In both cases, Autopilot was operating in locations where it was not intended to be used. “Tesla dances away from liability in Autopilot crashes by pointing to a note buried deep in the owner’s manual that says Autopilot is only safe on freeways,” the commercial opens, pointing to federal pleas to restrict it. “Shockingly, Tesla refused,” the commercial continues, leading into footage of the semi-truck crash and the crash involving the young couple. “This caused many tragic accidents when Autopilot was enabled on roads where Tesla knows it isn’t safe. Tesla must be held accountable. Boycott Tesla to keep your family safe.”

      O’Dowd’s group said the ads are airing in D.C., California, Delaware and Michigan.

      Musk took last year’s Super Bowl attention in stride. “This will greatly increase public awareness that a Tesla can drive itself (supervised for now)” he tweeted about last year’s ad. The company is facing concerns over stagnating revenue, mounting worries about its capacity to deliver long-promised “Full Self-Driving” technology, and Wall Street hand-wringing over the persistent distraction of its mercurial CEO. It has shed billions in value, down around 15 percent just in the past month.

      Musk has asked for a larger stake in the company as a condition for “growing Tesla to be a leader in AI & robotics,” saying that without 25 percent control he “would prefer to build products outside of Tesla.” But some investors have not given the idea a warm reception. In January, a Delaware judge ruled that an unprecedented $56 billion pay package awarded to him in 2018 was unfair.

      FUCK YOU, JEFF BEZOS AND THE WASHINGTON POST

  • @[email protected] · 5 points · 15 days ago

    Wasn’t there an anti-privacy ad about Apple products being used to distribute CSAM as well? Rebecca Watson did a recent counterpoint to the ad. Privacy-invasive tech is not good when large social movements are seeking to purge undesirables from the public.

  • @darganon · -15 points · 16 days ago

    I have a Tesla, they make it extremely clear that you are never supposed to touch autopilot under any circumstances, and you have to go way out of your way to prevent it from disengaging while you’re not paying attention.

    • Ghostalmedia · 27 points · 16 days ago

      When I was looking to buy a new car back in early 2019, I walked into a showroom for a final test drive before I threw some money down for a Model 3.

      It started to rain pretty hard on the drive back. When executing an auto lane change, the sensors freaked out because of the water interference and violently yanked the car back into the original lane halfway through the lane change. It hydroplaned a hair and scared the shit out of my wife and me. The Tesla employee assured us, “it’s ok, this is normal.” Hearing that it was normal was not comforting.

      Upon returning to the showroom, a different model 3 in the parking lot started backing toward a small child. My wife saw what was happening, threw herself in front of the car, and that caused it to halt.

      I’m sure the software has progressed in the past 5 years, but suffice to say, we changed our minds on the car at that time. Those two incidents within 15 minutes really made us question how that shit was legal.

      • @Draupnir · 7 points · edited · 16 days ago

        If the car was backing out, a human driver was in control, not Autopilot. Autopilot can only be enabled while driving on a well-marked roadway. The first part is plausible, however. Likely the software at the time could not handle rain appropriately, and you are absolutely right to question it if they tell you that was normal.

        • Ghostalmedia · 4 points · 15 days ago

          The car was being summoned from a parking space. Summon / Smart Summon will absolutely back out of a space fully autonomously.

      • @[email protected] · 5 points · edited · 16 days ago

        That’s the thing, it’s only legal in the US (as far as I know, at least). In Germany you’re only allowed to use self-driving if your hands are on the steering wheel at all times and you can take over if something goes wrong.

        • @[email protected] · 2 points · 16 days ago

          I’m pretty sure that’s also the case in the US. These incidents are either caused by some sort of defeat device (I’ve seen weights that wrap around the steering wheel; no idea if they work) or by people who have just gotten good at resting a hand on the wheel without paying attention, I think.

        • @eltrain123 · 2 points · 16 days ago

          That’s the case in the US, too. The car automatically shuts off Autopilot after three warnings about not keeping your hands on the steering wheel. It produces a loud audible alert after a few seconds if it senses the driver’s hands aren’t on the wheel, and after the third warning the alert continues until the driver takes back control.

          There are also several warnings about keeping your hands on the wheel and staying alert when engaging autopilot.
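          The escalation described above amounts to a small strike-counter state machine. A hedged sketch (the three-warning count is from the comment, but the messages and behavior here are simplified illustrations, not Tesla’s actual implementation):

```python
class AttentionMonitor:
    """Toy model of an escalating hands-on-wheel nag: each detected lapse
    adds a strike, and the third strike disengages the assist system."""

    MAX_STRIKES = 3

    def __init__(self) -> None:
        self.strikes = 0
        self.engaged = True

    def on_no_hands_detected(self) -> str:
        if not self.engaged:
            return "autopilot disabled"
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            # Third strike: keep alerting until the driver takes over,
            # and shut the system off for the rest of the drive.
            self.engaged = False
            return "continuous alert; autopilot disabled until driver takes control"
        return f"audible alert (warning {self.strikes} of {self.MAX_STRIKES})"

monitor = AttentionMonitor()
print(monitor.on_no_hands_detected())  # warning 1 of 3
print(monitor.on_no_hands_detected())  # warning 2 of 3
print(monitor.on_no_hands_detected())  # third strike: disengages
```

          The point of the sketch is just that the system fails toward the driver: it never silently keeps driving after the attention checks are exhausted.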

          The people saying otherwise are either ignorant or disingenuous.

      • @[email protected] · 3 points · 16 days ago

        These instances of errors are obviously alarming, but all the evidence we have is that they’re still safer than human drivers. They will make mistakes - and sometimes those mistakes will cost lives - but they will make fewer mistakes than humans. Given this, as visceral as it feels when we hear of these stories, I think our ire is misplaced. Automated driving will never be perfect. If that’s the bar we’re aiming for we should just give up and go home. The goal is better than humans, and in many conditions, it’s already there.

    • @Draupnir · 4 points · 16 days ago

      I’m sorry, never supposed to touch Autopilot? Under any circumstances? Other than that, yeah, 100%: if it detects you are not paying attention at all (via wheel nag or the eye-tracking camera), it will alarm and disengage, potentially banning the user from Autopilot for a short time. They make that consequence very clear.

    • @asdfasdfasdf · 3 points · 16 days ago

      This is the opposite of reality. They make it extremely clear that it’s a beta, that you need to keep your eyes on the road in case you need to take over, and that you should take over whenever you feel you need to.

    • @Bell · -13 points · 16 days ago

      Of course they do, but everyone here is so filled with hate for Elon that they cannot consider the car might be reasonable.

      • @[email protected] · -4 points · edited · 16 days ago

        Their loss, heh.

        Having to barely put in work to drive is incredible. More time to keep my eyes on everything around me in all directions. Road trips aren’t exhausting, and bumper-to-bumper traffic is a breeze.

        For a technology forum, it’s incredibly disappointing to see how closed-minded people are about the tech.