Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads: “It affects all of us because we are essentially experiments in public roads.”

  • @cm0002 · 244 · 1 year ago

    I lost all trust in their ‘Autopilot’ the day I read Musk said (Paraphrasing) “All we need are cameras, there’s no need for secondary/tertiary LIDAR or other expensive setups”

    Like TFYM? No backups?? Or backups to the backups?? On a life fucking critical system?!

    • @Ottomateeverything · 109 · 1 year ago

      or other expensive setups

      As much as I lost trust in his bullshittery a long time ago, his need to mention the cost of critical safety systems is what stuck out to me the most here. That’s how you know the priorities are backwards.

      • TheRealKuni · 94 · 1 year ago

        Also, my robot vacuum has LiDAR. It’s not expensive relative to a car.

        • @[email protected] · 34 · 1 year ago

          Hell, every iPhone has lidar and the pro models have two lidar cameras. The tech is not very expensive, especially not for an $80,000 car.

          My partner’s econobox has lidar for its cruise control, but Tesla can’t seem to figure out how to make it work.

          • @Sondermotor · 13 · 1 year ago

            Hell, every iPhone has lidar and the pro models have two lidar cameras. The tech is not very expensive, especially not for an $80,000 car.

            Around the time Elon made the claim, lidar for automotive purposes was quite expensive. That additional cost would have made the self-driving product a lot less desirable. Upselling cruise control into “self driving” earned them a lot of money.

            Funnily enough, in all the other areas where Tesla has taken the expensive option, the cult retail investors would claim these were brilliant decisions, because economies of scale would kick in and make it cheaper in the long run.

            Lidar was obviously exempt from any such scale and future tech improvements, because reasons.

            My partner’s econobox has lidar for its cruise control, but Tesla can’t seem to figure out how to make it work.

            It could be very expensive for Tesla to start using Lidar, because they’ve sold a lot of cars with the promise that they have the hardware for self driving. Retrofitting a million cars would not only cost a lot in terms of gear and work, but it would put additional stress on an already poor service network.

            They have painted themselves into a corner. All because leadership thought self driving was a more or less solved problem almost a decade ago.

            • @APassenger · 7 · 1 year ago

              Rebrand to Ludicrous Self Drive and add back LIDAR.

            • @[email protected] · 4 · 1 year ago

              Good point. I thought Teslas had radar for a while though, and they took it out?

              Was lidar that expensive in a car though? Because Infiniti started adding it in 2014 for the cruise control and those cars usually sell new for $50k if you get it fully loaded.

              And they could have added radar and sonar to assist the cameras at least. The radar couldn’t give 3d data, but it could say “yo bro that’s a solid object, not the skyline” at least.

              Good point on the promises though. They really fucked themselves with Elon’s claims.

              • @Sondermotor · 2 · 1 year ago

                I thought Teslas had radar for a while though, and they took it out?

                They decided radar was superfluous at one point during the pandemic. By sheer coincidence, right around the time supply chains were getting fucked. Hitting delivery targets was more important than safety.

                And they could have added radar and sonar to assist the cameras at least. The radar couldn’t give 3d data, but it could say “yo bro that’s a solid object, not the skyline” at least.

                They did do that. It can be pretty difficult to make sense of conflicting data like that. Tesla may have decided not to bother solving such issues, hoping that fewer sensor inputs make the data easier to interpret.

                This is what Elon had to say about Tesla’s sophisticated radar data interpretation capabilities in 2016:

                In fact, an additional level of sophistication – we are confident that we can use the radar to look beyond the car in front of you by bouncing the radar signal off the road and around the car. We are able to process that echo by using the unique signature of each radar pulse as well as the time of flight of the photon to determine that what we are seeing is in fact an echo in front of the car that’s in front of you. So even if there’s something that was obscured directly both in vision and radar, we can use the bounce effect of the radar to look in front of that car and still brake.

                It takes things to another level of safety.

                I guess the ability to see around cars in front of you got lost in some software update along the line. Otherwise removing radar necessarily meant reducing the safety of the system, or Elon lied in 2016.

                Was lidar that expensive in a car though? Because Infiniti started adding it in 2014 for the cruise control and those cars usually sell new for $50k if you get it fully loaded.

                It depends on what you want to do with the sensors. Somewhat accurately mapping what’s immediately in front of the car, to slightly improve speed matching and false positive/negative rates for emergency braking, comes at a cheaper price than the capability to fully map the surroundings fast and accurately enough for a computer to make correct decisions.

        • kingthrillgore · 5 · 1 year ago

          It’s actually gotten cheaper since they figured out how to make it solid state.

      • frozen · 30 · 1 year ago

        Skimping on cost is how disasters happen. Ask Richard Hammond. “Spared no expense” my ass, hire more than 2 programmers, you cheap fuck.

        Edit: This was supposed to be a Jurassic Park reference, but my dumb ass mixed up John Hammond and Richard Hammond. That’s what I get for watching Top Gear and reading at the same time.

        • eric · 5 · 1 year ago

          Were Richard Hammond’s many crashes a result of cost skimping? If so, I had no idea. Could you elaborate?

          • @[email protected] · 5 · 1 year ago

            I was under the impression that Hammond’s serious crashes were a combination of bad luck and getting a bit too spicy when driving in some already-risky situations. I, too, would appreciate some corroboration.

            • eric · 5 · 1 year ago

              Same here. I did a little googling and can’t find any corroborating evidence, but I also learned that Hammond’s Grand Tour insurance premiums are now more expensive than Top Gear’s budgets were for entire specials.

              • @Caradoc879 · 4 · 1 year ago

                That has to be the real reason they’ve canceled Grand Tour

              • @[email protected] · 2 · 1 year ago

                I mean… given that he has had two very well documented and life-threateningly catastrophic crashes in the course of making car shows… the insurance company underwriting his policies isn’t out of line.

                • eric · 3 · 1 year ago

                  I figured insuring him would be expensive, but it’s more the magnitude of his premiums that shocked me.

        • @[email protected] · 3 · 1 year ago

          As someone who hasn’t much watched Top Gear, I was cracking up at your Jurassic Park reference until I saw your edit and was like “Wait a minute.”

          Top Gear? Jurassic Park? Either way: Hold on to your butts.

          😆

    • @[email protected] · 102 · 1 year ago

      The crazier and stupider shit was that part of his justification was that “people drive and they only have eyes. We should be able to do the same.”

      It’s a stunningly idiotic justification, and yet here we are with millions of these “eyes only” Teslas on the road.

      • Chris Ely · 43 · 1 year ago

        That’s terrifying for showing how little he understands about the problem he is attempting to solve.

        Humans use up to four senses at times to accomplish the task of driving.

        @mosiacmango
        @cm0002

        • @[email protected] · 31 · 1 year ago

          I can add more; we don’t only have five senses. That’s elementary school propaganda. Here are all the ones I can think of while driving:

          1. Vision
          2. Hearing
          3. Tactile feedback from the wheel and pedals. You could break this down further into skin pressure receptors and muscle tension receptors, though muscle tension and stretch receptors are also involved in number 4
          4. Proprioception, where your limbs and body are in space
          5. Rotational acceleration (semi circular canals)
          6. Linear acceleration (utricle and saccule)
          7. Smell; okay, this might be a stretch, but some engine issues can be smelly

          And that doesn’t even consider higher-order processing and the actual integration of all these things, which AI, despite all its recent gains, can’t match: all the capabilities of the brain to integrate that information or deal with novel stimuli. Point is, Elon, add more sensors to your dang cars so they’re less likely to kill people. And people aren’t even perfect at driving, so why would we limit cars to only our senses anyway? So dumb.

        • @RainyRat · 13 · 1 year ago

          Did he also eat his meat raw and sleep in trees?

          • @[email protected] · 3 · 1 year ago

            Nah that would be silly.

            Did deflower a generation of young girls though, who would suffer the agony of his presumed halitosis

    • JohnEdwa · 45 · 1 year ago

      Ah, but you see, his reasoning is that what if the camera and lidar disagree, then what? With only a camera based system, there is only one truth with no conflicts!

      Like when the camera sees the broad side of a white truck as clear skies and slams right at it, there was never any conflict anywhere, everything went just as it was suppo… Wait, shit.

      • @[email protected] · 30 · 1 year ago

        sees the broad side of a white truck as clear skies and slams right at it

        RIP Joshua Brown:

        The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.

        • @girthero · 13 · 1 year ago

          he went so fast through my trailer I didn’t see him”.

          Lidar would still prevail over stupidity in this situation. It does a better job detecting massive objects cars can’t go through.

      • @[email protected] · 0 · 1 year ago

        what if the camera and lidar disagree, then what?

        This (sensor fusion) is a valid issue in mobile robotics. Adding more sensors doesn’t necessarily improve stability or reliability.

        • @ZapBeebz_ · 25 · 1 year ago

          After a point, yes. However, that point comes when the sensor you are adding is more than the second type in the system. The correct answer is to work a weighting system into your algorithm so the car can decide which sensor it trusts not to kill the driver, i.e. if the LIDAR sees the broadside of a trailer and the camera doesn’t, the car should believe the LIDAR over the camera, as applying the brakes is likely safer than speeding into the obstacle at 60 mph.
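To illustrate the weighting idea, here is a minimal sketch of trust-weighted arbitration; the sensor names, weights, and threshold are hypothetical for illustration, not Tesla’s actual logic:

```python
# Trust-weighted arbitration between disagreeing sensors.
# Each sensor reports 1.0 (obstacle ahead) or 0.0 (clear) and carries a
# trust weight. The car brakes when the weighted belief in an obstacle
# crosses a threshold, biasing the tie-break toward the safer outcome.

def should_brake(readings, trust, threshold=0.5):
    total = sum(trust[s] for s in readings)
    belief = sum(trust[s] * readings[s] for s in readings) / total
    return belief >= threshold

# Camera mistakes a white trailer for clear sky; LIDAR sees the obstacle.
# LIDAR is weighted higher for ranging, so the weighted belief says brake.
trust = {"lidar": 0.8, "camera": 0.5}
print(should_brake({"lidar": 1.0, "camera": 0.0}, trust))  # True
print(should_brake({"lidar": 0.0, "camera": 0.0}, trust))  # False
```

The real difficulty, as the replies below note, is picking those weights and handling association, not the arithmetic itself.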

          • @[email protected] · 2 · 1 year ago

            Yes, the solution is fairly simple in theory; implementing it is significantly harder, which is why it is not a trivial issue to solve in robotics.

            I’m not saying their decision was the right one, just that the argument about multiple sensors creating noise in the decision-making is a completely valid one.

            • @[email protected] · 3 · 1 year ago

              Doesn’t seem too complicated… if ANY of the sensors sees something in the way that the system can’t resolve, then it should stop the vehicle or force the driver to take over.

              • @[email protected] · 2 · 1 year ago

                Then you have a very unreliable system, stopping all the time without actual reason, causing immense frustration for the user. Is it safe? I guess; cars that don’t move generally are. Is it functional? No, not at all.

                I’m not advocating unsafe implementations here, I’m just pointing out that your suggestion doesn’t actually solve the issue, as it leaves a solution that’s not functional.
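The nuisance-stop concern is easy to quantify: if the vehicle halts whenever any one of n independent sensors raises a flag it can’t resolve, and each false-alarms with probability p on a given trip, the stop probability is 1 − (1 − p)^n, compounding with every sensor added. The rates below are made up purely for illustration:

```python
# Probability of at least one false alarm on a trip when halting on ANY
# of n independent sensors, each false-alarming with probability p.

def nuisance_stop_prob(p, n):
    return 1 - (1 - p) ** n

# With a hypothetical 1% per-trip false-alarm rate per sensor:
for n in (1, 2, 4, 8):
    print(n, round(nuisance_stop_prob(0.01, n), 4))
```

So a halt-on-any policy roughly multiplies the nuisance rate by the sensor count, which is exactly the frustration being described.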

                • @[email protected] · 2 · 1 year ago

                  If they’re using such unreliable sensors that they’re getting false positives all the time the system isn’t going to be functional in the first place.

              • Kogasa · 1 · 1 year ago

                “Seeing an obstacle” is a high-level abstraction. Sensor fusion is a lower-level problem. It’s fundamentally kind of tricky to get coherent information out of multiple sensors looking partially at the same thing in different ways. Not impossible, but the basic model is less “just check each camera” and more sheaves.
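A concrete taste of that lower-level problem: the textbook building block for fusing two noisy measurements of the same quantity is an inverse-variance weighted average (a one-dimensional Kalman measurement update). The numbers here are illustrative only:

```python
# Fuse two noisy range measurements of the same obstacle.
# Each estimate is weighted by the inverse of its variance; the fused
# variance is smaller than either input's, which is why an extra sensor
# helps *if* the association problem (same obstacle?) is solved first.

def fuse(z1, var1, z2, var2):
    w1 = var2 / (var1 + var2)            # weight on measurement 1
    fused = w1 * z1 + (1 - w1) * z2
    fused_var = var1 * var2 / (var1 + var2)
    return fused, fused_var

# Camera depth estimate: 42 m (noisy); lidar: 40 m (precise).
est, var = fuse(42.0, 4.0, 40.0, 0.25)
print(round(est, 2), round(var, 3))      # estimate pulled toward the lidar
```

The hard part in practice is everything around this update: deciding that the two readings refer to the same object at all.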

    • @chitak166 · -5 · 1 year ago

      To be fair, humans have proven all you need are visual receptors to navigate properly.

      • @Maalus · 7 · 1 year ago

        To be fair, current computers / AI / whatever marketing name you call them aren’t as good as human brains.

        • @chitak166 · -8 · 1 year ago

          No, but they can be improved to the point where all that’s necessary are cameras and the means to control the vehicle.

          • @Maalus · 4 · 1 year ago

            I take it you are an expert in the field to be saying this?

              • @Maalus · 2 · 1 year ago

                No, they don’t and that’s the entire point in all of this. Tesla autopilot sucks and it will suck and kill people. But fanboys like you would rather “look to the future” instead of realistically looking at it.

      • @[email protected] · 4 · 1 year ago

        Visual receptors… And 3-dimensional vision with all the required processing and decision making behind that based on the visual stimuli, lol.

      • @cm0002 · 2 · 1 year ago

        1. And how many vehicle accidents and deaths are there today? Proven that humans suck at driving, maybe.

        2. No we don’t, we use sight, sound and touch/feeling to drive at a minimum

        • @chitak166 · -1 · 1 year ago

          Touch? Sure, barely. But you can drive without being able to hear.

          I’d also wager you can get a license if you have that rare disease that prevents you from feeling. Since, you know, how little we use touch and hearing to drive.

          But hey? Maybe I’m wrong. Maybe you can provide a source that says you can’t get licensed if you have that disease or if you’re deaf. That would prove your point. Otherwise, it proves mine.

    • @Gargantu8 · -13 · 10 months ago

      deleted by creator

      • @[email protected] · 25 · 1 year ago

        Uhhhh…

        …any level 4 car, actually, according to the federal government and all the agencies who regulate this stuff.

        NAVYA, Volvo/Audi, Mercedes, Magna, Baidu, Waymo.

        Tesla isn’t even trying to go past level 3 at this point.

        • @Gargantu8 · 6 · 10 months ago

          deleted by creator

          • @chakan2 · 10 · 1 year ago

            It sounded like sarcasm rather than an honest question. Like “Find me a better autopilot” rather than “What manufacturer would you recommend for autopilot?”

            • @Gargantu8 · -3 · 10 months ago

              deleted by creator

              • @[email protected] · 5 · 1 year ago

                I took it as condescending, just poorly written. Also, lots of Tesla fanboys on here. Glad you’re not.

                • @Gargantu8 · -2 · 10 months ago

                  deleted by creator

      • @[email protected] · 17 · 1 year ago

        A 2014 Infiniti can drive itself more safely on the highway than a Tesla. The key here is they didn’t lie about the car’s capabilities, so they didn’t encourage complacency.

        In the city though, yeah you’ll need to look at other level 4 cars.

      • @chakan2 · 3 · 1 year ago

        What brand of car has better autopilot with other sensors?

        All of them. The other automakers didn’t fire their engineers during a hissy fit.

    • @[email protected] · -17 · 1 year ago

      Not to be a hard-on about it, but if the cameras have any problem, autopilot ejects gracefully and hands it over to the driver.

      I ain’t no Elon dick rider, but I got FSD, and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor.

      Lidar has severe problems too. I’ve used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.

      Cameras will eventually be great! Really, they already are, but they’ll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain’t done learning yet.

      That driver really should have been paying attention. The car fucking tells you to all the time.

      If a camera has a problem the whole system aborts.

      In the future this will mean the car will pull over, but it’s, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human that is paying attention.

      • @BaronDoggystyleVonWoof · 17 · 1 year ago

        So I drive a Tesla as well. Quite often I get the message that the camera is blocked by something (like sun, fog, heavy rain).

        You can’t have a reliable self-driving system if that is the case.

        Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

        • @[email protected] · 3 · 1 year ago

          Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

          Of course it is; functionally, both the camera and lidar solutions work in vector space. The big difference is that a camera feed holds a lot more information beyond simple vector-space data to feed the AI training than a lidar feed ever will.

      • @NeoNachtwaechter · 12 · 1 year ago

        any problem autopilot ejects gracefully and hands it over to the driver.

        Gracefully? LMAO

        You can come back when it gives at least 3 minutes warning time in advance, so that I can wake up, get my hands out of the woman, climb into the driver seat, find my glasses somewhere, look around where we are, and then I tell that effing autopilot that it’s okay and it is allowed to disengage now!

        • @[email protected] · 0 · 1 year ago

          Yes, that’s exactly how autopilots in airplanes work too… 🙄

          I think camera FSD will get there, but I also think there are additional sensors needed (perhaps not lidar necessarily) to increase safety and like the point of the article states… a shitload more testing before it’s allowed on public roads. But let’s be reasonable about how the autopilot can disengage.

          • @chakan2 · 0 · 1 year ago

            I think camera FSD will get there

            Teslas won’t. Musk fired all his engineers. Mercedes has a better driving record these days.

              • @chakan2 · -1 · 1 year ago

                Well…there’s those two Indian interns that can’t quit because they need their Visa. But they just get Musk coffee and turn away all the summons for child support.

          • @NeoNachtwaechter · 0 · 1 year ago

            how autopilots in airplanes work

            That was interesting for some people, before we had autonomy levels defined for cars. Nobody wants to know that anymore.

      • @[email protected] · 8 · 1 year ago

        Starting off with 3D data will always be better than inferring it. Go fire up Adobe After Effects and do a 3D track and see how awful it is; now that same awful process drives your car.

        The AI argument falls short too because that same AI will be better if it just starts off with mostly complete 3d data from lidar and sonar.

        • @[email protected] · 1 · 1 year ago

          Lidar and sonar are way lower resolution.

          Sonar has a hard time telling the difference between a manhole cover, large highway sign and a brick wall.

          • @[email protected] · 1 · 1 year ago

            Okay? The resolution doesn’t help apparently because Teslas are hitting everything. Sonar can look ahead several cars and lidar is 3d data. Combining those with a camera is the only way to do things safely. And lidar is definitely not low resolution.

      • @elephantium · 4 · 1 year ago

        ejects gracefully and hands it over to the driver

        This is exactly the problem. If I’m driving, I need to be alert to the driving tasks and what’s happening on the road.

        If I’m not driving because I’m using autopilot, … I still need to be alert to the driving tasks and what’s happening on the road. It’s all of the work with none of the fun of driving.

        Fuck that. What I want is a robot chauffeur, not a robot version of everyone’s granddad who really shouldn’t be driving anymore.

        • @[email protected] · 1 · 1 year ago

          After many brilliant people trying for decades, it seems you can’t get the robot chauffeur without several billion miles of actual driving data, sifted and sorted into what is safe, good driving and what is not.

      • @chakan2 · 2 · 1 year ago

        ejects gracefully and hands it over to the driver.

        Just in time to slam you into an emergency vehicle at 80…but hey…autopilot wasn’t on during the impact, not Musk’s fault.

        • @[email protected] · 0 · 1 year ago

          Nah, with hands on the wheel, looking at the road, the driver, who agrees they will pay attention, will have disengaged the system long before it gets to that point.

          The system’s super easy to disengage.

          It’s also getting better every year.

          5 years ago my car could barely change lanes on the highway. Now it navigates left turns at 5-way lighted intersections in big-city traffic, with idiots blocking the intersection and suicidal cyclists running red lights, as well as it was changing lanes on the highway back then. And highway lane changes are extremely reliable; can’t remember my last lane-change disengagement. Same car, just better software.

          I bet 5 years from now it’ll be statistically safer than humans… Maybe not same car. Hope it’s my car too, but it’s unclear if that processor is sufficient…

          Anyway, it’ll keep improving from there.

      • Noxy · 1 · 1 year ago

        good thing regular cameras aren’t affected by reflective surfaces

        oh wait

  • @EatYouWell · 46 · 1 year ago

    Isn’t this already an established fact?

    • @jimmydoreisalefty · 16 · 1 year ago

      Proof by looking at internal information and data.

      From the NYT: The data leaked by Krupski included lists of Tesla employees, often featuring their social security numbers, in addition to thousands of accident reports, and internal Tesla communications. Handelsblatt and others have used these internal memos and emails as the basis for stories on the dangers of Autopilot and the reasons for the three-year delay in Cybertruck deliveries.

      • @[email protected] · -7 · 1 year ago

        How does any of that prove the claim? Surely independent crash data would show these vehicles are involved in many more accidents than other vehicles if it’s true, but that doesn’t seem to be the case.

    • @NeoNachtwaechter · 5 · 1 year ago

      Is it really whistleblowing

      It is, and it is important.

      Employees are usually bound by loyalty and contract not to disclose internals. But public knowledge often needs confirmation; otherwise it is only rumours.

    • @Geobloke · 4 · 1 year ago

      Well, they got rid of their public relations department because Elon’s Twitter is all you need

    • @Kbobabob · -1 · 1 year ago

      I don’t get it…

  • @[email protected] · 4 · 1 year ago

    Gee, thanks for reporting on the obvious, Jalopnik.

    We knew this. And even this whistleblower report is old.

    What a garbage news outlet.

  • @linearchaos · -3 · 1 year ago

    Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets. They need the scale, and the public traffic, and the idiots, and the drunkards, and the kids speeding. The only thing that will stop them from working on autopilot is that it’s no longer financially reasonable to keep going. Even a couple handfuls of deaths aren’t going to stop them.

    • @Ottomateeverything · 34 · 1 year ago

      Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets.

      Even if we hold this to be true (and I disagree in large part), the point is that Tesla’s systems aren’t at that stage yet. Failing to recognize lights correctly during live demos and such is absolutely something you can test and develop on closed streets or in a lab. Teslas shouldn’t be allowed on roads until they’re actually at a point where there are no glaring flaws. And then they should be allowed in smaller numbers.

      • @linearchaos · -21 · 1 year ago

        Do you really think they didn’t test that before they got to this point?

        I’m willing to bet they had been through that intersection hundreds of times before and never seen this. It’s not like it can’t detect a stoplight and they’re just out there randomly running through them all.

        Of the millions of variables around them, something blinded it to the light this time. The footage from that run has probably been reviewed ad nauseam at this point, and has done more to help them find the problem than they could have done sitting in a closed warehouse making guesses, when the car never fails to detect a red light there.

        Edit: look, keep smacking that downvote, but it’s not going to change anything. I hate Musk too, but we’re going to make progress toward automated driving unless it becomes more dangerous than existing drivers. In the next generation or so, most driving will become automated and deaths by automobiles will drop significantly. Old and young people will get where they need to go. You cannot automate driving without driving in the real world. If you think they haven’t been doing this in a simulation for a decade, you’re on crack.

        • @pivot_root · 23 · 1 year ago

          I still wouldn’t trust the company with a CEO who unilaterally decided that not having redundant systems makes for a better product.

          • @linearchaos · 1 · 1 year ago

            I absolutely don’t trust the CEO. I don’t even need to trust the company, there are a dozen others trying to work out the same problem.

        • Tar_Alcaran · 16 · 1 year ago

          Do you really think they didn’t test that before they got to this point?

          Yes.

          • @linearchaos · -8 · 1 year ago

            Well, if you’re not going to discuss things in good faith, goodbye.

    • @TheGrandNagus · 10 · 1 year ago

      That’s true, but I think the issue people have with “AutoPilot” is about marketing.

      Tesla brands their cars’ solution as a full replacement for human interaction, and word from Musk, other Tesla employees, media personalities close to Tesla, and fanboys all makes out like the car drives itself and the only reason you need a driver in place is to satisfy laws.

      It’s bullshit. They know exactly what they’re doing when they do the above, when they call their system “AutoPilot”, when Musk makes claims his cars can travel from one side of the US to the other without human interaction (only to never actually do it, of course!), and sells car upgrades as Full Self Driving support.

      If they branded it as Assisted Driving, Advanced Cruise Control, Smart Cruise, or something along those lines, like all the other carmakers do with their similar systems, I’d be less inclined to blame Tesla when there’s an unfortunate incident. I think most would agree with me, too.

      But Tesla markets and encourages, both officially and unofficially, that their cars have the ability to drive themselves, look after themselves, and that you’re safe when using the system. It’s a lie and I’m absolutely astounded they’ve had little more than a series of slaps on the wrist for it in most markets.

      • @linearchaos
        link
        English
        -21 year ago

        100% accurate.

        They want people to use it so they get data from it. Accidents and deaths will happen… honestly, they’ll always happen… they happen now without it, it’s just more acceptable because it’s human error. Road safety is absolutely awful.

        The reason they get away with it is Lobbying, Money and Political favors. They got where they are by greasing a whole shit ton of wheels with dumptrucks of money.

        Shitty means, but pretty righteous ways.

      • @Takumidesh
        link
        English
        -11 year ago

        Tbf, Tesla’s are the only cars that actually know you are on your phone and/or not paying attention.

        • Flying Squid
          link
          English
          41 year ago

          And then make you look at and use a giant touchscreen to control anything, taking your attention off the road.

        • Zoolander
          link
          English
          3
          edit-2
          1 year ago

          Removed by mod

          • @Takumidesh
            link
            English
            11 year ago

            So it is true, but it wasn’t true in the past? It can also be disabled by breaking the camera or putting tape over it. The point is that it does do that, which is a true statement.

    • @[email protected]
      link
      fedilink
      English
      01 year ago

      Should a couple handfuls of deaths matter if, as you said, you can’t test it any other way? Autopilot systems could already be saving thousands of lives if more widely deployed, and the lack of good, reliable autopilot systems has an opportunity cost of blood on our hands. Human drivers are well established to be dangerous. Testing and release of autopilot systems should be done as safely as possible, but to think the first decade or so of these systems will be flawless seems unreasonable.

    • gregorum
      link
      fedilink
      English
      -31 year ago

      The fact is that most technology that we take for granted today went through a similar evolutionary phase with public use before they became as safe as they are now, especially cars themselves. For well over a century, the automobile has made countless leaps and bounds in safety improvements due to data gathered from public use studies.

      We learn by doing.

      • kingthrillgore
        link
        fedilink
        English
        41 year ago

        That’s fine, but Waymo, Cruise, et al. do trials on closed courses and in cooperation with states to ensure a high degree of public safety. Tesla is testing without asking regulators.

        • gregorum
          link
          fedilink
          English
          21 year ago

          Do they? I honestly have very little knowledge of how those companies gather their data, which is why I did not mention them. Can you provide any links to information about them? I would genuinely like to learn more.

  • @WoahWoah
    link
    English
    -11
    edit-2
    1 year ago

    Tesla “autopilot” averages one airbag deployment every five million miles.

    The average driver in the U.S. averages one every 600,000 miles.

    Idk. Doesn’t seem like it works perfectly, but it does seem to work pretty well.
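
    Taking those two figures at face value, the implied ratio is easy to check (a quick sketch, using only the numbers quoted above):

    ```python
    # Airbag-deployment rates quoted above, in miles per deployment.
    autopilot_miles_per_deployment = 5_000_000
    average_driver_miles_per_deployment = 600_000

    # How many times less often Autopilot triggers an airbag per mile,
    # per these figures (which mix very different driving conditions).
    ratio = autopilot_miles_per_deployment / average_driver_miles_per_deployment
    print(round(ratio, 2))  # 8.33
    ```

    Roughly an 8x difference, though as replies below note, the two rates are not measured under comparable conditions.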

    • @[email protected]
      link
      fedilink
      English
      261 year ago

      The comparison falls a little flat when you consider that Autopilot has minimum viable weather and road condition requirements to activate (no snow or hail, etc.), while human drivers must endure and perform in all road and weather conditions.

      • @WoahWoah
        link
        English
        -1
        edit-2
        1 year ago

        deleted by creator

    • @linearchaos
      link
      English
      31 year ago

      Man that’s some interesting brigading we have going on here. You throw facts at them they just explode.

  • @[email protected]
    link
    fedilink
    English
    -13
    edit-2
    1 year ago

    He says this yet they’re already out on the roads logging millions of miles without any outsized danger. Just because we get sensational headlines about a driver behind the wheel who crashed into the side of a semi, doesn’t mean they’re any more dangerous than any other car. AFAIK Tesla still has far fewer wrecks than many others. These driving aids have a lot of room for improvement but they only need to be better than an average driver in order to reduce accidents.

    • @[email protected]
      link
      fedilink
      English
      12
      edit-2
      1 year ago

      I think it’s very likely that Tesla especially would go ahead with technology that is dangerous in certain situations, as long as those situations only happen rarely.

      We all know what kind of a man Elon is.

      You would not see the same in other established car brands.

      Elon is the kind of man who would break not only eggs, but the chickens and the chicken farmers, to make his omelet, and if people get hurt, he would blame it on them for being stupid.

      I would never trust a Tesla because I obviously don’t trust Elon and nobody should.

    • Ghostalmedia
      link
      English
      101 year ago

      It’s not just about what he’s saying. It’s about the internal data he’s leaking to back up the claims.

    • JWayn596
      link
      English
      51 year ago

      Nah it’s because they decided to use cameras instead of LiDAR and then try to make it autonomous instead of driver aid.

      AI is at its best when it’s opening up productivity and freedom to think critically or leisurely, the same way sticky notes help someone study.

      • @[email protected]
        link
        fedilink
        English
        11 year ago

        Autopilot is just advanced cruise control. I think you’re conflating it with FSD which is their autonomous driving feature.

  • @KoalaUnknown
    link
    English
    -141 year ago

    I mean, are humans really any better?

    • @[email protected]
      link
      fedilink
      English
      211 year ago

      I know it’s not the answer you’re looking for, but what is safer for pedestrians, cyclists, and other drivers is to have fewer cars on the roads. Buses can move dozens of people with a single trained professional driver. Trains can move hundreds. It’s illogical to push for autonomous cars for individuals when we already have “self driving” technologies that are much safer and much more efficient.

      • @KoalaUnknown
        link
        English
        21 year ago

        I agree. That’s why I don’t own a car.

      • @Cold_Brew_Enema
        link
        English
        -201 year ago

        You anti car people find any way to insert your views into a conversation. Let me guess, you also do Crossfit?

        • @[email protected]
          link
          fedilink
          English
          171 year ago

          Being “anti car” is good for people who love cars. More public transit means less traffic, less congestion, less demand for gas, and generally just more space for people who actually like to drive cars.

          Plus, if some people don’t want to drive a car and just want to get places, maybe don’t get a car? There’s already safe and proven “technology” to do that. I understand the added safety bonus of “autonomous” cars but let’s be real, it’s not advertised as something to boost the safety of everyone around, it’s advertised as “autopilot” or even worse, “Full Self Driving”.

          I am certainly anti car, but pointing out the flaws in “FSD” or “autonomous cars” and how it’s being falsely marketed to people is also on topic and is not exactly “inserting my views”. People can still love cars and use them, just don’t BS us with the “FSD” and “autonomous” spiel.

    • Ghostalmedia
      link
      English
      101 year ago

      Depends on the Autopilot feature.

      I was test driving a Model 3, and Summon almost ran over a little kid in the parking lot until my wife ran in front of the car.

      At least when my car’s collision sensors misread something, my eyeballs are there for redundancy.

    • JohnEdwa
      link
      fedilink
      English
      4
      edit-2
      1 year ago

      Someone paying proper attention probably would be. But a huge chunk of accidents happen because idiots are looking at their phones or fall asleep at the wheel, and at least self-driving cars, even Teslas on Autopilot, won’t do that.

      • @[email protected]
        link
        fedilink
        English
        41 year ago

        No, they just relinquish control to a sleepy driver without a warning whenever they are about to crash.

        • @[email protected]
          link
          fedilink
          English
          41 year ago

          We aren’t at the point yet — with any self-drive car — where you should be behind the wheel unless you’re absolutely capable of taking over in seconds.

        • JohnEdwa
          link
          fedilink
          English
          11 year ago

          If you are referring to Autopilot, yeah, technically it does that: it turns off once it realises it can’t do anything more to avoid the collision, so that it doesn’t speed off afterwards due to damaged sensors or glitches. But the whole “Autopilot turns off so it doesn’t show in statistics” claim was a blatant lie, as Tesla counts all crashes where it was active shortly before the crash.

          • @tagliatelle
            link
            English
            11 year ago

            Do they count the times the human driver had to take control to avoid a crash?

            • JohnEdwa
              link
              fedilink
              English
              1
              edit-2
              1 year ago

              We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. https://www.tesla.com/en_eu/VehicleSafetyReport

              In the case the crash happened later than 5 seconds after Autopilot was disabled, or it was never used in the first place, it would be in the “Tesla vehicles not using autopilot technology” part of the data.
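
              The counting rule in that quote can be sketched as a simple classifier. This is only an illustration of the stated 5-second window, and the function name and input representation are hypothetical:

              ```python
              def autopilot_bucket(seconds_since_autopilot_off):
                  """Classify a crash per the quoted counting rule.

                  seconds_since_autopilot_off: None if Autopilot was never
                  engaged on the drive, 0 if it was active at impact, otherwise
                  the seconds between deactivation and impact.
                  """
                  # Counted toward Autopilot stats if it was active at impact
                  # or deactivated within 5 seconds before impact.
                  if seconds_since_autopilot_off is not None and seconds_since_autopilot_off <= 5:
                      return "autopilot"
                  # Otherwise it falls into the "not using autopilot" bucket.
                  return "not using autopilot"
              ```

              So a driver who disengages 3 seconds before impact still counts against Autopilot, while one who disengaged 30 seconds earlier does not.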

              As for automatically detecting non-crashes, that’s a bit harder to do, don’t ya think?