Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective
Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami
It’s asinine that Tesla is trying to do full self-driving without using some sort of LiDAR. Judging distance from video alone is just unreliable and stupid.
But it’s MUCH cheaper, so, in keeping with every other shitty idea he’s ever had, Musk was REALLY banking on Tesla engineers making a crazy breakthrough so he could reap billions in reward.
It worked at SpaceX because of a perfect concoction: all the best rocket scientists and engineers wanted to work there, since it was one of the only space programs not owned by a government and could push the boundaries; the technology was both possible and wildly practical to implement; and there were massive government subsidies.
Tesla is in the car market, which is notoriously competitive, and while they do have massive government subsidies, they don’t have the best engineers. Musk’s insistence that they “figure out” how to shove autonomous driving into a medium that simply doesn’t provide enough information drives even the better engineers away.
I really wish my government would stop funding his ego and let his fantasy projects die already.
The tech has gotten so cheap now that there is no reason to skimp out on it.
Oh there definitely is, marginally higher profits at the cost of public safety
A tale as old as capitalism: short term profit first, who gives a shit about later
And Tesla is the car company with the highest profit margin per car sold.
We’re talking about a company that removed the ultrasonic sensors used for reverse parking.
With two offset cameras, depth is reliable, especially when pairing a wide-angle and a narrow-angle lens. This is what OpenPilot does with the Comma 3 (FOSS self-driving).
Radar is better, but some automotive radar seems to only be great at short range (from my experience with my fork of OpenPilot combined with a vehicle’s built-in radar).
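For what it’s worth, the two-offset-camera point above is just stereo triangulation: a feature shifted by d pixels between the two views sits at depth f·B/d. A minimal sketch in Python; the focal length, baseline, and disparity numbers are illustrative, not actual Comma 3 parameters:

```python
# Stereo depth from disparity: a point that appears shifted by
# `disparity_px` pixels between two horizontally offset cameras lies at
# depth = focal_length * baseline / disparity.
# All numeric values below are made up for illustration.

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the depth in meters implied by a pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# 910 px focal length, 12 cm baseline, 5 px disparity -> about 21.8 m
print(round(stereo_depth(910.0, 0.12, 5.0), 2))
```

Note that for a fixed baseline the depth error grows roughly with the square of the distance, which is one reason radar still helps at long range even when stereo works well up close.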
Do you work in the field? Sun/fog/etc are all things that can be handled with exposure adjustments. It’s one place a camera is more versatile than our eyes.
All that being said, my experience comes from indirect work on OpenPilot, not Tesla: a system that isn’t commonly used by the average person and doesn’t make commercial FSD claims.
No, it’s not. The world is filled with optical illusions that even our powerful brains can’t process, and yet you expect two webcams to handle them. And depth is not the only thing needed for autonomous driving; distance is an absolute factor. Case in point: at least two bikers were killed because their motorcycles had two tail lights instead of one, and Tesla thought each was a car far away instead of a motorcycle close by. It ran them over as if they were not there. A human would see the rider and realize it’s a motorcycle, first because of the sound and second because our brains are better at reasoning, and would avoid the situation. This is why cars MUST have more sensors: because the processing is lacking so much.
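The tail-light failure mode described above is a pure angular ambiguity: two lights s meters apart at distance d subtend the same angle as any other pair with the same s/d ratio, so a single camera frame genuinely cannot tell a close motorcycle from a distant car. A quick illustration; the separations and distances are made-up numbers, not details from the actual crash:

```python
import math

# Two point lights s meters apart at distance d subtend an angle of
# atan(s / d). Any pair of lights with the same s/d ratio looks
# identical to a single camera. Values below are hypothetical, chosen
# so the ratios match exactly.
def angular_sep_deg(s_m: float, d_m: float) -> float:
    return math.degrees(math.atan2(s_m, d_m))

motorcycle = angular_sep_deg(0.3, 10.0)   # tail lights 0.3 m apart, 10 m away
distant_car = angular_sep_deg(1.5, 50.0)  # car lights 1.5 m apart, 50 m away
print(round(motorcycle, 4), round(distant_car, 4))  # same angle: s/d is 0.03 in both cases
```

Parallax between two offset cameras, or a direct range return from radar or lidar, breaks exactly this ambiguity, which is the crux of the sensor argument.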
Sitting in a Tesla and watching it try to understand anything other than highway driving is so unnerving. It gets so much wrong about other cars’ direction of travel that it’s not too shocking one occasionally is plowed into, or plows into someone else.
It all depends on how powerful the computer managing the data is. A human brain does the job, for example.
But how many of those bad records are due to their eyes?
Most of those bad records come down to bad decisions (checking the phone, speeding, cutting lanes, etc.). It is rarely bad eyesight.
The issue with lidar is bad weather. If it rains or is foggy, it doesn’t work, or gives weird results.
Apparently there are some radars that can see through bad weather.
I thought the whole point was to overcome human shortcomings, not just make a worse version of a human driver. Humans don’t even rely purely on visual cues.
When you drive, apart from a few exceptions, all your cues are visual.
No dude. Sound and air pressure are cues as well.
If audio cues were indispensable, deaf people wouldn’t be allowed to drive.
What? A cue being solely sufficient to make a decision and a cue being used in conjunction with other things are two separate things.