Tesla Recalls 362,758 Vehicles: “Full Self-Driving Software May Cause Crash”

Tsing

Tesla has announced that it is recalling 362,758 of its EVs, including certain 2016–2023 Model S and Model X, 2017–2023 Model 3, and 2020–2023 Model Y vehicles. According to a safety recall report from the National Highway Traffic Safety Administration (NHTSA), Tesla's Full Self-Driving Beta (FSD Beta) software is to blame; it is described as unsafe because of problems that include entering a stop sign-controlled intersection without coming to a complete stop. Tesla plans to release an over-the-air software update, free of charge, to fix the issue.

See full article...
 
I dream of self-driving cars. I think Tesla is much farther from it than they let on.
 
I dream of self-driving cars. I think Tesla is much farther from it than they let on.
Yup.

At first blush it sounds easy - just keep it between the lines and recognize a few standardized signs

But… it’s all those corner cases and everything else out on the road. No one predicts a deer jumping out, or the meth head who runs the red light, or the kid playing ball in the street, or the big rig driver who falls asleep and drifts lanes, or that patch of black ice… etc etc

Can an automated system react faster to all of those events than a person? Yeah, sure. But will it always act appropriately, and more importantly, will people ever learn to trust it with their lives at stake?
 
I thought that was a feature, not a bug :p

But will it always act appropriately, and more importantly, will people ever learn to trust it with their lives at stake?
Eventually it will become good enough with AI, maybe within 10 years. The question is whether it will be cheap and seamless enough for mass adoption. Tesla's system sucks because it relies only on cameras. If you are serious about self-driving you need more than one type of sensor, which shows Tesla is not serious.
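For what it's worth, the usual argument for multiple sensor types is redundancy plus fusion: independent measurements of the same quantity can be combined into an estimate that is better than either one alone. A toy sketch of inverse-variance-weighted fusion of a camera range estimate and a radar range estimate (all numbers invented for illustration):

# Toy sensor fusion: combine two noisy range estimates by inverse-variance weighting.
# The sensor values and variances below are made up for illustration.
camera_range, camera_var = 42.0, 4.0   # meters; cameras infer depth, so variance is large
radar_range, radar_var = 40.5, 0.25    # radar range is usually much tighter

w_cam, w_rad = 1.0 / camera_var, 1.0 / radar_var
fused_range = (w_cam * camera_range + w_rad * radar_range) / (w_cam + w_rad)
fused_var = 1.0 / (w_cam + w_rad)

print(f"fused range: {fused_range:.2f} m, variance {fused_var:.3f}")
# The fused variance is smaller than either sensor's alone, which is the
# statistical case for not relying on cameras only.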
 
I thought that was a feature, not a bug :p


Eventually it will become good enough with AI, maybe within 10 years. The question is whether it will be cheap and seamless enough for mass adoption. Tesla's system sucks because it relies only on cameras. If you are serious about self-driving you need more than one type of sensor, which shows Tesla is not serious.
I do agree, I think it will get better than a typical human within that time frame - especially if you can get some Vehicle-to-Vehicle communication standard in use.

But I think a big part of that safety also requires AI/Automated driving to become the majority of drivers on the road. Especially if V2V comms come into play - an automated system is much more predictable, and can communicate intentions and conditions to nearby vehicles for them to adjust and react to. You can never eliminate all spontaneity out there, but it mitigates a lot of it.

Until you get to that point, it's AI or a state table or whatever algorithm the car is using, trying to react to very unpredictable people and unpredictable situations. You can throw as many sensors as you want at that, but you're still forced to just react to whatever is thrown your way after it occurs, in every single situation that comes up.
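To make the V2V idea above a bit more concrete, here is a rough sketch of what an intent broadcast could look like. The IntentMessage type, its fields, and JSON as the wire format are all made up for illustration; real V2V efforts (e.g. the SAE J2735 Basic Safety Message) define their own formats.

# Hypothetical sketch of a V2V intent broadcast (illustrative only).
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class IntentMessage:
    vehicle_id: str     # anonymized sender ID
    lat: float          # current position
    lon: float
    speed_mps: float    # speed in meters per second
    heading_deg: float  # compass heading
    intent: str         # e.g. "hard_brake", "lane_change_left", "stopping"
    timestamp: float    # when the intent was declared

def encode(msg: IntentMessage) -> bytes:
    """Serialize the message for whatever radio link carries it."""
    return json.dumps(asdict(msg)).encode("utf-8")

# A nearby car that receives this can start adjusting before the maneuver
# is even visible to its own sensors.
packet = encode(IntentMessage("veh-1234", 40.7128, -74.0060, 24.6, 87.0,
                              "hard_brake", time.time()))
print(len(packet), "bytes")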

Here are some scary statistics that are relevant:

Driving ability

Svenson (1981) surveyed 161 students in Sweden and the United States, asking them to compare their driving skills and safety to other people's. For driving skills, 93% of the U.S. sample and 69% of the Swedish sample put themselves in the top 50%; for safety, 88% of the U.S. and 77% of the Swedish put themselves in the top 50%.[29]

McCormick, Walkey and Green (1986) found similar results in their study, asking 178 participants to evaluate their position on eight different dimensions of driving skills (examples include the "dangerous–safe" dimension and the "considerate–inconsiderate" dimension). Only a small minority rated themselves as below the median, and when all eight dimensions were considered together it was found that almost 80% of participants had evaluated themselves as being an above-average driver.[30]

One commercial survey showed that 36% of drivers believed they were an above-average driver while texting or sending emails compared to other drivers; 44% considered themselves average, and 18% below average.[31]
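To spell out why those numbers are scary: only half of any group can actually be in the top 50%, so the amount by which each figure exceeds 50% is a lower bound on the share of respondents who must be overrating themselves. A quick back-of-the-envelope check using the figures quoted above:

# Lower bound on how many respondents must be overestimating themselves:
# only 50% of a group can truly be in the top half.
samples = {"US skill": 0.93, "Sweden skill": 0.69, "US safety": 0.88, "Sweden safety": 0.77}
for name, claimed_top_half in samples.items():
    print(f"{name}: at least {claimed_top_half - 0.50:.0%} must be overestimating")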

 
I've been watching traffic accident dashcams for 10 years as entertainment. About 90% of the crashes are very predictable, and 99% are completely and easily avoidable even with current self-driving technology. Yes, there are a few cases where there is literally nothing you can do and you are F_ed, so it hardly matters whether you are in control or an AI.

Those against self driving will forever be like the anti-seatbelt guys. They have one anecdotal story where someone survived a big crash not wearing a seatbelt, and they insist on betting their life on that one freak stroke of luck. That's like staking your life on a lottery ticket.
 
I'd be less against seatbelt laws if I didn't see videos of cops doing a 180 to ticket a guy who pulls into a gas station and dares to unbuckle his belt a mere 5 seconds before coming to a full stop. Then the cop shoots the driver because he was too slow to get his license etc. out of the car... near gas pumps.

Point being, crazy happens. But you know, the AI will be written competently. We'll have cops who are competent and not overbearing, ticket-hungry freaks looking to make money for the state. Hope springs eternal, I guess.
 
I'd be less against seatbelt laws if I didn't see videos of cops doing a 180 to ticket a guy who pulls into a gas station and dares to unbuckle his belt a mere 5 seconds before coming to a full stop. Then the cop shoots the driver because he was too slow to get his license etc. out of the car... near gas pumps.

Point being, crazy happens. But you know, the AI will be written competently. We'll have cops who are competent and not overbearing, ticket-hungry freaks looking to make money for the state. Hope springs eternal, I guess.
Crazy happens once in a million interactions, but crazy gets remembered and posted, therefore frequency bias makes it seem like a common occurrence.
 
Once we go to AI drivers, cops will ticket for ANY infraction that they can, and the cost of registering an AI-driven car will go through the roof to compensate for the states' missed 'voluntary' income from driving infractions. Just watch.
 
Those against self driving will forever be like the anti-seatbelt guys.
I'm not anti-seatbelt. I think they are a good idea and all cars should include them. I'm against seatbelt laws, though. What I want to do in my own car is my own business. And if I don't wear my seatbelt, it doesn't harm anyone other than myself. I think they are a great idea, but I think enforcement of wearing them is over the line.
 
Once we go to AI drivers, cops will ticket for ANY infraction that they can, and the cost of registering an AI-driven car will go through the roof to compensate for the states' missed 'voluntary' income from driving infractions. Just watch.
Yup. I am wondering what happens with insurance. I wouldn't want to be an AI Car company because the liability would be astronomical.
 
I'm not anti-seatbelt. I think they are a good idea and all cars should include them. I'm against seatbelt laws, though. What I want to do in my own car is my own business. And if I don't wear my seatbelt, it doesn't harm anyone other than myself. I think they are a great idea, but I think enforcement of wearing them is over the line.
I agree 100%. Laws should not protect us from ourselves.
 
Crazy happens once in a million interactions, but crazy gets remembered and posted, therefore frequency bias makes it seem like a common occurrence.

In theory, but I would argue that these days the traffic-ticket revenuers have made the crazy far more common than previously thought, all thanks to dash cams, cell phone cams, and body cams.
 
From the snippets I have seen that seem to show Tesla's AI in action, tagging all manner of things at high speed... I'm of the opinion that this is nonsense. If you can imbue the opposite, an AI that knows what's irrelevant and acts accordingly in context... maybe. Tagging everything all the time won't solve squat, and there won't be any self-driving any time soon. 20 more years, maybe. We will have many years of crap and semi-crap, that's for sure. But FSD is fantasy for at least 20 years, maybe more.
 
From the snippets I have seen that seem to show Tesla's AI in action, tagging all manner of things at high speed... I'm of the opinion that this is nonsense. If you can imbue the opposite, an AI that knows what's irrelevant and acts accordingly in context... maybe. Tagging everything all the time won't solve squat, and there won't be any self-driving any time soon. 20 more years, maybe. We will have many years of crap and semi-crap, that's for sure. But FSD is fantasy for at least 20 years, maybe more.
Yea... I have to disagree. Compute ability in 20 years will make what we have today seem... archaic. New ways and new technologies. Not to mention the capacity to ingest information and make decisions based on it, even if it is a super-layered decision tree that is forever expanded upon. I expect we are 10 years away from having desktop computers or laptops with storage so fast it's effectively RAM, so having a laptop with 1-2 terabytes of memory will be par for the course. Compute will also grow, I think, and with decision-tree processing becoming more and more impactful, I suspect we will see compute able to outpace software for a long time to come.

Really, the end result will be performance that is utterly next level, and I don't see us needing 20 years to put that reliably into a vehicle with the requisite processing capability to safely navigate terrestrial transportation, be that flight or driving.
 
Yea... I have to disagree. Compute ability in 20 years will make what we have today seem... archaic. New ways and new technologies. Not to mention the capacity to ingest information and make decisions based on it, even if it is a super-layered decision tree that is forever expanded upon. I expect we are 10 years away from having desktop computers or laptops with storage so fast it's effectively RAM, so having a laptop with 1-2 terabytes of memory will be par for the course. Compute will also grow, I think, and with decision-tree processing becoming more and more impactful, I suspect we will see compute able to outpace software for a long time to come.

Really, the end result will be performance that is utterly next level, and I don't see us needing 20 years to put that reliably into a vehicle with the requisite processing capability to safely navigate terrestrial transportation, be that flight or driving.
I think you're smoking some really good stuff.

We're not even close to any of that. And to think that self driving cars will be practical in less than two decades is a pipe dream. The software for it isn't even remotely possible at this time much less the processing hardware. That hardware has to be in effectively every car and that's not happening anytime soon just due to cost. Then you have to add in all sorts of sensors and the hardware and software required to run it. Then you have to make sure the software is effectively bug free. Then you need to take into consideration bad sensors or out of alignment sensors and the costs associated with that. There's also the software corruption issue because what you would need for self driving is orders of magnitude more complex than anything we have now. And this doesn't take into account the car to car information sharing people want. That's probably a couple of orders of magnitude more complex and would require huge amounts of extra processing power. But this isn't taking into account bad sensors or anything else in the other cars. Garbage In/Garbage Out.

You're also not taking into account the legal issues. Liability is a huge issue. There's also the need to figure out who or what is more important in the decision tree. Are the cars going to run over a class of 3rd graders to avoid running down a politician?

These issues are just the tip of the iceberg.
 
I agree 100%. Laws should not protect us from ourselves.
While that is already open for debate, you not wearing a seatbelt can affect more people than just you, as you may end up as a human cannonball.
 
While that is already open for debate, you not wearing a seatbelt can affect more people than just you, as you may end up as a human cannonball.
I was not aware that humans cannoning into other humans had a statistically significant impact on injury rates. Usually once you have expended most of your energy getting shredded on the way out the windshield you don't have a whole lot of kinetic energy left to impart.

And, I would point out, if someone wants to bring up the example of a convertible that counters my somewhat hyperbolic example... well, I can point to motorcycles, which for some reason don't require seat belts, and in fact in many states require you to wear a helmet, which makes you even more cannonball-like.

While I concede it's certainly possible, there are a lot of things that are "possible" but just don't really seem to be an issue worth impeding personal rights, agency and autonomy over.
 
I was not aware that humans cannoning into other humans had a statistically significant impact on injury rates. Usually once you have expended most of your energy getting shredded on the way out the windshield you don't have a whole lot of kinetic energy left to impart.

And, I would point out, if someone wants to bring up the example of a convertible that counters my somewhat hyperbolic example... well, I can point to motorcycles, which for some reason don't require seat belts, and in fact in many states require you to wear a helmet, which makes you even more cannonball-like.

While I concede it's certainly possible, there are a lot of things that are "possible" but just don't really seem to be an issue worth impeding personal rights, agency and autonomy over.

I don't want to go too deep into this, but you don't need to go through your windshield to harm others in a collision; you can just as well hurt whoever is in the car with you. I just meant you are a possible projectile that can inflict harm on others when not wearing a seatbelt.

I don't really have a problem with people getting themselves killed by their own choice or sheer stupidity, as long as they don't harm others doing so.
 
While that is already open for debate,
The core principle of personal liberty is absolutely _not_ open for debate with *me*. There are cases in which the safety and welfare of the public may appear in conflict with the rights of the individual, but that is a different matter entirely.
you not wearing a seatbelt can affect more people than just you, as you may end up as a human cannonball.
Seatbelts are designed to protect the wearer, and that's why the laws exist. If cannonballs are a serious concern, then you'll want to secure the non-human ones too, which could include both pets and inanimate objects.

For the record, I'm not opposed to seatbelts, and acknowledge the safety benefits of wearing them. That isn't the issue.
 