• MisterMoo · 95 · 10 months ago

    Imagine naming a feature “Full Self-Driving,” and yet you can’t take your attention away from the road and must be ready to take over at a moment’s notice.

    • @[email protected] · 57 · 10 months ago

      This is on par for Elon’s entire career. He loves claiming success and taking credit for things he either didn’t accomplish himself, or things he hasn’t accomplished yet.

        • @[email protected] · -12 · 10 months ago

          Like popularize electric vehicles, create a reusable rocket, or put global internet around the Earth? Never gonna happen, right?

          • @[email protected] · 12 · 10 months ago

            He didn’t do any of those things. He hired people with functioning brains to do it while impeding them by constantly violating labor laws and creating organizational chaos.

          • Square Singer · 3 · 10 months ago

            Nah, more like the replicator, teleportation and faster than light vehicles.

            Listing random non-related things isn’t hard.

          • @[email protected] · 3 · 10 months ago

            The only thing on that list I think his companies can really inarguably take credit for is the reusable rockets; the rest were already being done by other companies. He got ahead of the competition (mostly by ignoring regulations, labor laws, etc.), but electric cars and satellite internet constellations were very far from novel concepts when he entered the space.

    • athos77 · 14 · 10 months ago

      I remember reading a post that claimed that Tesla’s safety rating was given to them because a bunch of their crashes were determined to be human error - because the self-driving feature would automatically disconnect if it faced a crash it couldn’t avoid.

      • @MowFord · 12 · 10 months ago

        Fairly certain the statistic requires FSD to have been disabled for at least 10 seconds before the crash for it to be counted as human-caused.

      • @T156 · 6 · 10 months ago

        The issue is a bit muddied by the fact that hitting the brake or the accelerator will deactivate it, and people will usually hit one of those if they believe that they are going to crash.

    • @[email protected] · 12 · 10 months ago

      It’s ok, it’s in beta, so some features may not be complete just yet, but hey, let’s just release this to the public anyways.

    • @UsernameIsTooLon · 6 · 10 months ago

      I feel like even with fully autonomous cars, there are going to be laws requiring that the main driver always be kept alert. That would be the case unless our cars become their own independent drivers, like a cab.

      • @[email protected] · 11 · 10 months ago

        Honestly, there should be laws against full self driving modes unless they can be proven to be good enough to not require driver intervention at all, and the manufacturer can be legally considered as the driver in case of an incident.

        Requiring a driver to be alert and attentive to the road while not doing anything to operate the car runs contrary to human psychology. People cannot be expected to maintain focus on the road for extended periods while the car drives itself.

        I don’t know exactly where the line should be drawn between basic cruise control and full self driving, but either the driver should be kept actively involved in driving or the car manufacturer should be held liable for whatever the car does.

    • @helmet91 · 5 · 10 months ago

      It’s just a driving assistant, like in any other car. As far as I know, Mercedes is currently the only one who has implemented autonomous driving, and even that is limited to some specific areas. But at least it’s real. So much so that, legally, Mercedes (the company) is considered the driver of such cars in case anything happens on the road.

      • @practisevoodoo · 5 · 10 months ago

        Depends on your definition of autonomous driving, which mainly depends on your ODD (operational design domain), but they’re not the only ones. Honda, Volvo and GM have something. Others (e.g. BMW) have stuff coming next year, but they’re all going with more accurate names: CoPilot, Pilot Assist, Super Cruise, Traffic Jam Pilot. That makes it clear these are drive assists, not drive replacements.

        • @froh42 · 2 · 10 months ago

          Mercedes has Level 3 autonomy in certain highway situations, so you are legally allowed to watch a video or read a newspaper. You just need to be able to take over again within 20 seconds or so.

          Others are following close behind; I think Audi had to postpone Level 3 a bit, etc. BMW has something in the pipeline as well.

          But these are really more than drive assists. I find the “Level n” specifications more helpful than “drive assist” vs “autonomy”.

          None of the other brands oversell what they are offering.

    • @asdfasdfasdf · 3 · 10 months ago

      Are there any truly autonomous machines which don’t require a human to monitor?

      • @[email protected] · 12 · 10 months ago

        Lots. Toasters, refrigerators, robot vacuums, thermostats, smart home lights, etc.

        The reason why self-driving cars are extra tricky is both because they have a much more complex task and the negative consequences are sky high. If a robot vacuum screws up, it’s not a big deal. This is why it’s totally irresponsible to advertise something as having “full” autonomy when the stakes are so high.

        • @asdfasdfasdf · 3 · 10 months ago

          Those are mostly automatic, not autonomous.

        • @[email protected] · 0 · 10 months ago

          It wouldn’t be such a problem if there weren’t so many monkeys driving around and if cars talked to each other.

      • @[email protected] · -1 · edited · 10 months ago

        Yeah, there’s a small delivery car in my country that drives the streets fully autonomously. It is used to deliver groceries to a distribution point.

        It was kind of surreal to see it drive past, since the car has a sort of cockpit, but it is too narrow to seat any human.

        It is currently limited to 25 km/h, and someone supervises it remotely at all times and can intervene, just to be on the safe side, although that rarely happens.

        The main reason it can do this is that it always drives the same route.

        https://press.colruytgroup.com/collectgo-tests-unmanned-vehicle-in-londerzeel#

    • @[email protected] · 0 · 10 months ago

      You’re absolutely right, it can be quite misleading to name a feature “Full Self-Driving” when it still requires constant attention and intervention from the driver. The expectations set by such a name may not align with the reality of the technology’s current limitations.

    • @[email protected] · -5 · 10 months ago

      Let’s be fair. It could be called Driver Assistant Plus and you people would still be complaining because this isn’t about Tesla

      • @JdW · 1 · 10 months ago

        Let’s be fair. Elon could kill a man on camera and shout a confession afterwards, and you would still find excuses for his behaviour and tell us we’re just misinterpreting the facts…

        • @[email protected] · 2 · edited · 10 months ago

          There’s plenty to criticize Musk for. Your apparent assumption that anyone not violently against him must be a fan is just further evidence of an inability to think rationally about the subject.

          • @[email protected] · 1 · 10 months ago

            Nah, more the assumption that anyone not violently against him doesn’t know him or his actions.

            • @[email protected] · -2 · 10 months ago

              If you violently oppose Musk then how do you feel about people like Putin, Xi Jinping, Ali Khamenei or Bashar al-Assad? Those are the people I focus my opposition to. Someone like Musk, Bezos, Tate etc. I simply have no time nor interest for. That’s recreational outrage.

      • Dr. Dabbles · 1 · 10 months ago

        I complained because it absolutely sucked. Only Tesla would release this garbage in such a fraudulent manner; no other company would risk the lawsuits. Tesla has been killing people with Autopilot since 2016, and with FSD since it was released to the public. That should make you think, but that seems to be hard for some people when it comes to a Musking.

        • @[email protected] · 0 · edited · 10 months ago

          Autopilot or FSD Beta has never been something you’re supposed to rely on, and every Tesla owner knows this. If they drive over someone, it’s the fault of the driver, not the vehicle. Accidents will always happen, and if you focus on individual incidents you’re missing the big picture. You’re never going to reach 100% safety, and 99.99% safety would still mean 33,000 accidents a year in the US alone. Also, what little statistics we have indicate that drivers with FSD or Autopilot engaged already crash less than the average.

          According to this report, the average Tesla equipped with FSD Beta, driven on predominantly non-highway sections of road, crashes 0.31 times per million miles, a dramatic decrease from the average American, who crashes 1.53 times every million miles.

          Source

          • Dr. Dabbles · 1 · 10 months ago

            Too bad there’s so many owners relying on it.

            if you focus on individual incidents you’re missing the big picture

            Not at all. In fact, the point is to focus on classes of crashes. Which Tesla fails miserably at.

            Also the little statistics we have about this indicate that drivers with FSD or Autopilot engaged already crash less than the average.

            This is an outright lie. Period. Having owned a Tesla since 2018, I’m quite familiar with the garbage software and the user community that loves to say no one should trust it on one side, and on the other side of their face says that it’s better than a human.

            • @[email protected] · 0 · 10 months ago

              I literally can’t debate with you guys, because you outright refuse to believe any evidence presented to you and just base your opinions on anecdotes and individual incidents. If those stats are made up, then provide a better source that backs you up.

              • Dr. Dabbles · 1 · 10 months ago

                You seem to be confusing evidence with marketing. That’s your problem. If you actually owned one, you’d know I was right.

    • @[email protected] · -6 · 10 months ago

      You’re absolutely right, it can be quite misleading to name a feature “Full Self-Driving” when it still requires the driver’s constant attention and readiness to take control. The expectation that the vehicle can handle all driving tasks autonomously is not aligned with the current reality. It’s important for automakers to be transparent and accurate in their naming conventions to avoid any false expectations.

    • @[email protected] · 66 · 10 months ago

      I wish this was talked about every single time the subject came up.

      Responsible, technologically progressive companies have been developing excellent, safe, self-driving car technology for decades now.

      Elon Musk is eviscerating the reputation of automated vehicles with his idiocy and arrogance. They don’t all suck, but Tesla sure sucks.

    • @IphtashuFitz · 23 · edited · 10 months ago

      Even with LIDAR there are just too many edge cases for me to ever trust a self driving car that uses current-day computing technology. Just a few situations I’ve been in that I think a FSD system would have trouble with:

      • I pulled up at a red light where a construction crew was working on the side of the road. They had a police detail with them. As I was watching the red light, the cop walked up to my passenger side and yelled “Go!” at me. Since I was looking at the light I didn’t see him trying to wave me through the intersection. How would a car know to drive through a red light if a cop was there telling you to?

      • I’ve seen cars drive the wrong way down a one way street because the far end was blocked due to construction and backtracking was the only way out. (Residents were told to drive out the wrong way) Would a self driving car just drive down to the construction site and wait for hours for them to finish?

      • I’ve seen more than one GPS want to route cars improperly. In some cases it thinks a practically impassable dirt track is a paved road. In other cases I’ve seen chains and concrete barriers block intersections that cities/towns have determined traffic shouldn’t be going through.

      • Temporary detour or road closure signs?

      • We are having record amounts of rain where I live and we’ve seen roads covered by significant flooding that makes them unsafe to drive on. Often there aren’t any warning signs or barricades for a day or so after the rain stops. Would an FSD car recognize a flooded out road and turn around, or drive into the water at full speed?

      • @[email protected] · 12 · 10 months ago

        In my opinion, FSD isn’t attempting to solve any of those problems. Those will require human intervention for the foreseeable future.

        • @IphtashuFitz · 7 · 10 months ago

          Musk’s vision is (was?) to eventually turn Tesla’s into driverless robo-taxis. At one point he even said he could see regular Tesla owners letting their cars drive around like automated Ubers, making money for them, instead of sitting idle in garages.

        • Dr. Dabbles · 1 · 10 months ago

          Well, FSD is supposed to be Level 5 according to the marketing and description when it went on sale. Of course, we know Tesla’s lawyers told California that they have nothing more than Level 2, have no timeline to begin building anything beyond Level 2, and that the entire house of cards hinges on courts and regulators continuing to turn a blind eye.

        • @[email protected] · 1 · 10 months ago

          Or there are other, better ways to tell an FSD car that the road is closed. We could use a QR code or something like that which includes info about the blockade, where you can drive around it, and how long it will stay blocked. An FSD car should be connected enough to call home and report it to the servers, which then update the other FSD cars, et voilà.
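          A machine-readable closure notice like the one described could look something like this sketch (the field names and values are purely illustrative assumptions, not any real standard):

          ```python
          import json

          # Hypothetical payload a roadside QR code could carry for FSD cars.
          # All field names and values below are made up for illustration.
          def make_closure_payload():
              payload = {
                  "type": "road_closure",
                  "road_id": "N276",                       # illustrative road identifier
                  "closed_until": "2023-09-30T18:00:00Z",  # when the blockade ends
                  "detour_via": ["Kerkstraat", "Stationslaan"],  # suggested way around
              }
              return json.dumps(payload, separators=(",", ":"))

          encoded = make_closure_payload().encode("utf-8")
          # A version-40 QR code in byte mode holds up to 2953 bytes,
          # so a compact notice like this fits with plenty of room to spare.
          assert len(encoded) <= 2953
          print(len(encoded), "bytes")
          ```

          Encoded this way, the notice is on the order of a hundred bytes, so even a heavily error-corrected QR code could carry it.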

          • Flying Squid · 2 · 10 months ago

            Sure. A QR code. That couldn’t possibly get obscured.

            • @[email protected] · 1 · 10 months ago

              Why should it get obscured? Just put a giant QR code there. You can create QR codes where you only need to see a small part of the square to recover all the info in it. I don’t think obscuring would be a problem.

              • @froh42 · 3 · 10 months ago

                Wow, you solved one of the really easy self-driving problems (sign recognition) with a more complicated solution.

                Sign recognition and traffic light recognition are available in a lot of cars. Detecting that the dark lump on the road is just a shadow and not slamming the brakes automatically is the hard part.

                Or that the white sky is in fact not a white sky but the sideways view of a semi trailer.

                (The latter issue is why relying on multiple sensors camera+radar+ultrasound in the case of my car’s emergency brake system and drive assistant is always a lot better - each sensor on its own has its failure modes)

                • @[email protected] · 1 · 10 months ago

                  Umm, we were talking about blocked streets, not normal sign detection. But if those signs were standardized to designs easily “readable” by cameras and computers (like QR codes), the car would never mistake, e.g., an “S” for a “5”. With high contrast, you could ensure it still works at night. But honestly I would prefer that manually driven cars be banned from roads completely and only FSD cars be allowed. The streets would then be made for FSD cars (instead of for human “monkey” drivers) and just about all these problems would be solved.

              • Flying Squid · 1 · 10 months ago

                Mud for one. Trees and bushes for another. Strong wind for a third. All of those things already obscure signs or make them very hard for humans to read, let alone a computer.

    • @CarlosCheddar · 15 · 10 months ago

      Just like that cheaper non-lidar Roomba with room mapping technology, it will get lost.

    • Eager Eagle · 1 · 10 months ago

      I don’t know why people are so quick to defend the need for LIDAR when it’s clear the challenges in self-driving are not with data acquisition.

      Sure, there are a few corner cases where it would perform better than visual cameras, but a new array of sensors won’t solve self-driving. Similarly, the lack of LIDAR doesn’t make self-driving impossible; otherwise we wouldn’t be able to drive either.

      • ShadowRam · 5 · edited · 10 months ago

        challenges in self driving are not with data acquisition.

        What?!?! Of course it is.

        We can already run all this shit through a simulator and it works great, but that’s because the computer knows the exact position, orientation, velocity of every object in a scene.

        In the real world, the underlying problem is that the computer doesn’t know what’s around it, or what the things around it are doing or going to do.

        It’s 100% a data acquisition problem.

        Source? I do autonomous vehicle control for a living. In environments much more complicated than a paved road with accepted set rules.

        • Eager Eagle · 0 · 10 months ago

          You’re confusing data acquisition with interpretation. A LIDAR won’t label the data for your AD system and won’t add much to an existing array of visible spectrum cameras.

          You say the underlying problem is that the computer doesn’t know what’s around it. But its surroundings are reliably captured by functional sensors. Therefore it’s not a matter of acquisition, but processing of the data.

          • ShadowRam · 2 · edited · 10 months ago

            won’t add much to an existing array of visible spectrum cameras.

            You do realize LIDAR is just a camera, but with an accurate distance per pixel, right?

            It absolutely adds everything.

            But its surroundings are reliably captured by functional sensors

            No it’s not. That’s the point. LIDAR is the functional sensor required.

            You cannot rely on stereoscopic cameras.
            The resolution of distance is not there.
            It’s not there for humans.
            It’s not there for the simple reason of physics.

            Unless you spread those cameras out to a width that’s impractical, and even then it STILL wouldn’t be as accurate as LIDAR.

            You are more than welcome to try it yourself.
            You can even be as stupid as Elon and dump money and reputation into thinking that it’s easier or cheaper without LIDAR.

            It doesn’t work, and it’ll never work as well as a LIDAR system.
            Stereoscopic cameras will always be more expensive than LIDAR from a computational standpoint.

            AI will do a hell of a lot better recognizing things via a LIDAR camera than a stereoscopic camera.
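            The physics argument above can be put in numbers. For a stereo pair, depth follows Z = f·B/d (focal length times baseline over disparity), so a fixed disparity error grows quadratically with distance. A rough sketch, where the baseline, focal length, and pixel-noise figures are illustrative assumptions rather than specs of any real car:

            ```python
            # Stereo depth error: Z = f*B/d  =>  dZ ≈ Z² · Δd / (f · B)
            # The numbers below (0.3 m baseline, 1000 px focal length,
            # 0.5 px disparity noise) are illustrative assumptions only.
            def stereo_depth_error(z_m, baseline_m=0.3, focal_px=1000.0, noise_px=0.5):
                """Approximate depth uncertainty in meters at distance z_m."""
                return (z_m ** 2) * noise_px / (focal_px * baseline_m)

            for z in (10, 50, 100):
                print(f"{z:>3} m -> ±{stereo_depth_error(z):.2f} m")
            # Error grows from ~0.17 m at 10 m to ~16.7 m at 100 m, whereas a
            # typical automotive LIDAR stays within centimeters across its range.
            ```

            Widening the baseline only shrinks the error linearly, which is why practical camera spacing on a car can’t buy back the quadratic loss at highway distances.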

            • Eager Eagle · 2 · 10 months ago

              This assumes depth information is required for self driving, I think this is where we disagree. Tesla is able to reconstruct its surroundings from visual data only. In biology, most animals don’t have explicit depth information and are still able to navigate in their environments. Requiring LIDAR is a crutch.

              • @Geek_King · 2 · 10 months ago

                I disagree with you; I don’t think visual cameras alone are up to the task. There was an instance of a Tesla in Autopilot mode driving at night with a drunk driver. It took place on a highway in Texas; the car’s camera footage was released, and it showed the Autopilot failing to identify the police car in its lane, red/blue lights flashing, as a stationary obstacle. Instead, it only realized there was a car in the way around 1 second before the 55 mph impact, and it turned off Autopilot in that last second.

                Having multiple layers of sensors, some being good at actually sensing a stationary obstacle, plus accurate range finding, plus visual analysis to pick out people and animals, that’s the way to go.

                Visual-range-only cameras were also just reported to have a harder time recognizing people of color and children.

                • Eager Eagle · 1 · 10 months ago

                  the car’s camera footage was released and it showed the autopilot not identify the police car in the lane with it’s red/blue lights flashing

                  If the obstacle was visible in the footage, the incident could have been avoided with visible spectrum cameras alone. Once again, a problem with the data processing, not acquisition.

      • Dr. Dabbles · 1 · 10 months ago

        Yes, self-driving is not computationally solved at all. But the reason people defend LIDAR is that visible-light cameras are very bad at depth estimation. Even with parallax, a lot of software has a very hard time accurately calculating distance and motion.

    • @[email protected] · -3 · 10 months ago

      Don’t let them know about that, I don’t want my radar detector flipping out over laser lol

        • @[email protected] · -2 · edited · 10 months ago

          K and Ka band are used for blind-spot monitoring and would make radar detectors go nuts until the filtering got worked out. Cars that use LIDAR will set them off as well, though those are rarer still.

          • ShadowRam · 2 · 10 months ago

            What?

            What does Radar have anything to do with Lidar?

            They operate in completely different parts of the EM spectrum.

            Also modern LIDAR is keyed in a way that LIDAR systems can’t interfere with other LIDAR systems.

            • @[email protected] · -2 · 10 months ago

              Maybe because I dunno, it’s a detector? I’d love to try to explain it further but it seems like you’re being intentionally oblivious so why bother lol

    • @[email protected] · -12 · edited · 10 months ago

      Do you have LIDAR on your head? No, yet you’re able to drive with just two cameras on your face. So no, LIDAR isn’t required. Not that driving in a very dynamic world isn’t very difficult for computers to do; it’s not a matter of if, it’s just a matter of time.

      Would LIDAR allow “superhuman” driving abilities, like seeing through fog and in every direction in the dark? Sure. But it’s not required for the job at hand.

      • @[email protected] · 5 · 10 months ago

        You have eyes that are way more amazing than any cameras that are used in self driving, with stereoscopic vision, on a movable platform, and most importantly, controlled via a biological brain with millions of years of evolution behind it.

        I’m sorry, you can’t attach a couple cameras to a processor, add some neural nets, and think it’s anything close to your brain and eyes.

        • @droans · 2 · 10 months ago

          And also, cameras don’t work that great at night. Lidar would provide better data.

      • ShadowRam · 4 · 10 months ago

        Do you have lidar on your head?

        Nope,

        And that’s exactly why humans crash. Constantly.

        Even when paying attention.

        Humans don’t have the resolution in depth perception, nor the FOV.

        • Eager Eagle · 2 · 10 months ago

          And that’s exactly why humans crash. Constantly.

          No it isn’t. Anywhere in the world the vast majority of crashes are caused by negligence, speeding, distraction, all factors that can be avoided without increasing our depth perception accuracy.

        • Kata1yst · 3 · 10 months ago

          Uhhhh… What the fuck else are the rest of you using?!

          • @JdW · 3 · 10 months ago

            Senses to support your sight when driving? Hearing and Balance come to mind, in that order of importance as supporting senses.

            • @[email protected] · 2 · 10 months ago

              I have no idea what sense of balance has to do with driving a car and even deaf people can get a driver’s license but okay. How’s this an argument for LIDAR again? It does not have anything to do with either of those things.

          • @boeman · 2 · 10 months ago

            My probing cane.

          • @hark · 2 · 10 months ago

            One obvious sense is hearing, as in hearing things like sirens to move out of the way.

          • Flying Squid · 0 · 10 months ago

            That’s like asking what the human equivalent of a GPU is. There isn’t one nor would there be because humans and computers are fundamentally different things.

      • @Spaceballstheusername · 2 · 10 months ago

        I remember watching a video asking whether there is a camera that can see as well as the human eye. The conclusion was that some cameras come close, but they are very big and expensive, and the human brain filters much of what the eye takes in without you realizing it. I think it could be done with a camera or two, but we won’t have that technology in the near future.

      • Dr. Dabbles · 1 · 10 months ago

        Do you have CCDs in your head? No? This argument is always so broken it’s insane to see it still typed out as anything but sarcasm.

      • Eager Eagle · 0 · edited · 10 months ago

        A lot of LIDAR fans here for some reason, but you’re absolutely right.

        There just isn’t good evidence that the accurate depth perception obtained only through LIDAR is required for self-driving, and it also won’t solve the complex navigation of real-world scenarios. A set of visible-spectrum cameras over time can reconstruct a 3D environment well enough for navigation, and that’s quite literally what Tesla’s FSD does.

        I don’t know why someone would still say it’s not possible when we already have an example running in production.

        “But Tesla FSD has a high disengagement rate” - for now, yes. But these scenarios are more often possible to be solved by high definition maps than by LIDAR. For anyone that disagrees, go to youtube, choose a recent video of Tesla’s FSD and try to find a scenario where a disengagement would have been avoided by LIDAR only.

        There are many parts missing for a complete autonomous driving experience. LIDAR is not one of them.

  • @dragontamer · 64 · edited · 10 months ago

    The elephant in the room is that the NHTSA still doesn’t have a director, and hasn’t had a long-term director since 2017.

    Steven Cliff was the director for 2 months in 2022. Aside from that, this important safety organization has been… erm… on autopilot (see what I did there?) and leaderless.

    How are we supposed to keep tabs on car safety if the damn agency in charge of automobile safety doesn’t even have a leader?

  • @[email protected] · 60 · 10 months ago

    So, when are we changing this forum’s name from Technology to its actual purpose of late: “every click- and rage-bait post about Tesla and Musk so people can circlejerk worse than reddit”?

    • Altima NEO · 19 · 10 months ago

      It’s literally nothing but bullshit about Tesla and Twitter. All day long. No one cares!

      I want to know about some actual tech, not the drama.

      • @[email protected] · 7 · edited · 10 months ago

        You don’t care. If no one cared, there wouldn’t be so many posts and extremely active discussions about them. If you want different content, post it.

    • @[email protected] · 13 · edited · 10 months ago

      Elon Musk is a scammer. He’s good at that, and it’s the only thing he’s good at.

      Can we move on now and talk about technology?

    • @UnderpantsWeevil · 7 · edited · 10 months ago

      Unfortunately, all the current tech news is either people running naked scams or people debunking them.

      The tragedy of our modern era is how much money we’ve invested in selling people a box labeled “Newest Life Changing Gadget” that’s just full of rocks.

      Check out the podcast TrashFuture. They do a bit about a shitty tech enterprise every episode, sometimes twice a week. From Juicero to Neom, the list of awful tech bullshit is limitless.

    • @[email protected] · 4 · 10 months ago

      It seems like a new anti Tesla article hits lemmy every day. It’s boring at this point.

    • @30mag
      link
      English
      1
      edit-2
      8 months ago

      deleted by creator

    • Dr. Dabbles
      link
      English
      110 months ago

      Hopefully soon after the garbage copy/paste press release “articles” about “AI”, fake superconductors, and other nonsense stops being posted.

    • @NeoNachtwaechter
      link
      English
      1510 months ago

      Wow. Impressive collection.

      Somehow reminds me of Jehovah's Witnesses and the end of the world :-)

    • @[email protected]
      link
      fedilink
      English
      710 months ago

      TBF, we have achieved an FSD that is safer than one human this year. But we took away grandma's driver's license, so now we have to find another human who's worse than FSD.

      • Overzeetop
        cake
        link
        English
        210 months ago

        I live in a small town with a large college. The students just came back for fall semester. I believe we have quite a few candidates for your list.

    • @spamfajitas
      link
      English
      1610 months ago

      I wonder how much impact there might have been on code quality when Elon forced lead devs from their projects at Tesla to work on Twitter. I’ve never seen a situation like that turn out well for either party.

    • QuantumEyetanglement
      link
      fedilink
      English
      -17
      edit-2
      10 months ago

      I wonder how this statistically compares to non-Tesla crashes?

      Edit: quick Google/math shows an average rate of lethal automobile crashes of 12 per 100,000 drivers. Tesla has supposedly sold 4.5 million cars. 4.5 million divided by the 17 deaths from the article ≈ 1 death per 265,000 Tesla drivers.

      This isn't exactly apples-to-apples, and I'd love for someone to "do the math" more accurately, but it seems like Tesla is much safer than a standard driver.

      The other confounding factor is that we don't know how many of these drivers were abusing Autopilot by cheating the rules (it requires hands on the wheel and full attention on the road).
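      The quick mental math above can be written out explicitly. This is a sketch of the same naive calculation, not a real analysis: every number is an assumption taken from the comment (the 12-per-100k Google figure, the 17 deaths from the article, the 4.5 million cars sold), and as the replies point out, it compares deaths per car sold against deaths per driver per year, ignoring miles driven, regions, and whether Autopilot was even engaged.

      ```python
      # Naive back-of-envelope comparison; all inputs are unverified assumptions.
      US_DEATHS_PER_100K = 12        # rough US lethal-crash rate per 100,000 drivers
      TESLA_DEATHS = 17              # Autopilot-linked deaths cited in the article
      TESLAS_SOLD = 4_500_000        # total Teslas sold (not all US, not all on Autopilot)

      # Deaths per 100,000 Teslas sold -- NOT the same denominator as the US figure.
      tesla_deaths_per_100k = TESLA_DEATHS / TESLAS_SOLD * 100_000

      print(f"US average:  {US_DEATHS_PER_100K:.2f} deaths per 100k drivers")
      print(f"Naive Tesla: {tesla_deaths_per_100k:.2f} deaths per 100k cars sold")
      print(f"Ratio:       {US_DEATHS_PER_100K / tesla_deaths_per_100k:.1f}x")
      ```

      The ratio comes out around 30x, which mostly shows how much the mismatched denominators distort the comparison rather than anything about actual safety.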

      • Heresy_generator
        link
        fedilink
        17
        edit-2
        10 months ago

        Your statistical analysis is so bad that it’s not even wrong. It’s just a pile of disparate data strung together with false assumptions.

        So all of those Teslas were sold in America? And all 4.5 million of those Teslas have Autopilot? And they’re in Autopilot mode 100% of the time?

        • @[email protected]
          link
          fedilink
          English
          910 months ago

          You forgot the most important issue: Tesla drivers are not representative of the average driver. They have more money and more education. They live in places with nicer weather. These all contribute to lower crash rates without self driving. I bet high end Mercedes have lower crash rates too, because people don’t defer maintenance and then drive them crazily in the snow.

          Compare apples to apples and I bet Teslas have average crash rates for luxury cars.

      • @TenderfootGungi
        link
        English
        1510 months ago

        It is not a valid comparison. Many deaths happen in bad weather or on bad roads, and Tesla's self-driving will not even turn on in those conditions. I do not believe apples-to-apples data exists.

      • @atempuser23
        link
        English
        510 months ago

        The true comparison is miles per accident. Fatal accidents will be higher for older model cars. Not all Tesla cars have FSD, and in many situations FSD is not available even on equipped cars. There is nothing in the current data to indicate that Tesla FSD is safer or more dangerous than the median driver.

        • Dr. Dabbles
          link
          English
          110 months ago

          This isn’t necessarily true either. The NHTSA Standing General Order data shows that Tesla reports a large number of crashes (which they get to cherry pick in a LOT of cases) under ADAS use compared to other brands. Taking conservative rollout numbers from companies like Honda shows that the crash per ADAS equipped vehicle rate is significantly higher.

          The real red flag in all of this is that Tesla's own reported marketing numbers for ADAS crashes weren't declining with newer releases over time. A rate that doesn't improve while the CEO claims the software already performs better than humans should instantly discredit the software, its performance, and any claims about new or improving features.

        • QuantumEyetanglement
          link
          fedilink
          English
          -410 months ago

          Let’s see it, show me the numbers! Everyone’s critiquing my quick mental math but I don’t see anyone contributing to fix it 🤷‍♀️. Will edit comment once I do!

  • AutoTL;DRB
    link
    fedilink
    English
    1010 months ago

    This is the best summary I could come up with:


    Back in 2016, Tesla CEO Elon Musk stunned the automotive world by announcing that, henceforth, all of his company’s vehicles would be shipped with the hardware necessary for “full self-driving.” You will be able to nap in your car while it drives you to work, he promised.

    But while Musk would eventually ship an advanced driver-assist system that he called Full Self-Driving (FSD) beta, the idea that any Tesla owner could catch some z’s while their car whisks them along is, at best, laughable — and at worst, a profoundly fatal error.

    Since that 2016 announcement, hundreds of fully driverless cars have rolled out in multiple US cities, and none of them bear the Tesla logo.

    His supporters point to the success of Autopilot, and then FSD, as evidence that while his promises may not exactly line up with reality, he is still at the forefront of a societal shift from human-powered vehicles to ones piloted by AI.

    You’ll also hear from a former Tesla employee who was fired after posting videos of FSD errors, experts who compare the company’s self-driving efforts to its competitors, and even from the competitors themselves — like Kyle Vogt, CEO of the General Motors-backed Cruise, who is unconvinced that Musk can fulfill his promises without rethinking his entire hardware strategy.

    Listen to the latest episode of Land of the Giants: The Tesla Shock Wave, a co-production between The Verge and the Vox Media Podcast Network.


    The original article contains 497 words, the summary contains 236 words. Saved 53%. I’m a bot and I’m open source!

    • Peanut
      link
      fedilink
      English
      1610 months ago

      I’ve been ranting about this since 2016.

      Having consumer trust in developing AI vehicles is hard enough without this asshole’s ego and lies muddying the water.

    • @almar_quigley
      link
      English
      1710 months ago

      Lol, ok. Your anecdotal experience can totally be believed over all the data gathered over years. Great. Thanks.

    • EnglishMobster
      link
      fedilink
      14
      edit-2
      10 months ago

      Counter-counterpoint: I’ve been using it since 2019. I think you’re exaggerating.

      • It aggressively tries to center itself, always. If you’re in a lane and it merges with a second lane, the car will swerve sharply to the right as it attempts to go back to the middle of the lane.

      • It doesn’t allow space for cars to merge until the cars are already merging. It doesn’t work with traffic; it does its own thing and is discourteous to other drivers. It doesn’t read turn signals; it only reacts to drivers getting over.

      • If a motorcycle is lane-splitting, it doesn’t move out of the way for the motorcycle. In fact, it assumes anything between lanes isn’t an issue. If something is partially blocking a lane but the system doesn’t recognize it as fully “your lane”, the default is to ignore it. The number of times I’ve had to disengage to dodge a wide load or a camper straddling two lanes is crazy.

      • With the removal of radar, phantom braking has become far, far worse. Any kind of weather causes issues. Even driving at sunset, the sun can dazzle the cameras so they don't detect things they should, or worse, they detect problems that aren't there.

      • It doesn’t understand road hazards. It will happily hit a pothole at 70 MPH, and it will ignore road flares and traffic cones. When the lanes aren’t clearly marked (because the paint has worn away or because of construction), its behavior can be erratic.

      • It waits so long to brake, and when it brakes it brakes hard. It accelerates just as suddenly, leading to a very jerky ride that makes my passengers carsick.

      The only time I trust FSD is when it’s stop-and-go traffic. Beyond that I have to pay so much attention to the thing that I might as well just drive myself. The “worst thing it can do” isn’t just detour; it’s “smash into the thing that it thought wasn’t an issue”.

      • @[email protected]
        link
        fedilink
        English
        3
        edit-2
        10 months ago

        I only drove a Tesla for a few days in 2022, but I fully agree with you. I specifically wanted to test FSD, and I had so many incidents: it tried to get into a turning lane that appeared even though it should have gone straight, and it slowed to 10 km/h in a tunnel where the speed limit was 50 because of blind corners and "bad vision conditions". Even the cruise control was annoying. It felt like my steering input was basically just a "suggestion" that I sometimes really had to force through against the will of the car, because otherwise bad things would have happened… Sport mode steering made that only slightly better in the dual-motor Model Y.

        Overall, I actually enjoyed driving the ID3 more. At least it had solid, responsive steering that felt like driving a sports car compared to the Tesla, and I drove the ID3 directly after the Tesla.

        The only good thing about the Tesla was the acceleration.

      • @flames5123
        link
        English
        210 months ago

        It doesn’t read turn signals

        It does in the FSD beta (somewhat). It even brakes and lets them in if it detects that they have a signal on. It doesn't understand merges as well, but it's still better than regular Autopilot. All your other points are pretty valid. I'm constantly taking it out of AP and putting it back in during a city drive, even though I have "FSD".