Tesla starts "full self-driving" beta test - follow-up

  • #1
jack action
The following thread was opened 3 years ago and is now closed to further replies:

https://www.physicsforums.com/threads/tesla-starts-full-self-driving-beta-test.995228/

There is now a new development and it seems to still divide people:

https://www.topgear.com/car-news/usa/two-million-teslas-recalled-us-over-insufficient-autopilot said:

Two million Teslas recalled in the US over 'insufficient' Autopilot


Around two million Teslas - including the Model S, Model X, Model 3 and Model Y - are being recalled in the US after the National Highway Traffic Safety Administration (NHTSA) concluded the firm’s divisive Autopilot feature was “insufficient to prevent misuse". Oh dear.

It follows a two-year investigation into nearly 1,000 crashes involving Tesla vehicles, and is the second recall this year after the firm was forced to make changes to its Full Self-Driving software.

This week Tesla has defended its systems in a lengthy post on X (RIP Twitter), largely in response to an article written by the Washington Post. Elon Musk’s company has agreed to issue an over-the-air update to address the NHTSA’s concerns.

“In certain circumstances when Autosteer is engaged, and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged, there may be an increased risk of a crash,” read the recall notice on the organisation’s website.

“In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature.”

Meanwhile Tesla claims that its data from the final quarter of 2022 shows one crash for every 4.85 million miles driven with Autopilot engaged, compared to 1.4 million miles without it.

Tesla said: “The data is clear: the more automation technology offered to support the driver, the safer the driver and other road users.”

Not to get all tribal... but whose side are you on?
So after 3 years of testing, how are we seeing self-driving cars now?

@mfb @DaveE @berkeman
 
  • Sad
Likes PeroK
  • #2
@PeroK : Are you sad because NHTSA is throwing a wrench in the works of Tesla or because Tesla is downplaying the problem?
 
  • #3
jack action said:
@PeroK : Are you sad because NHTSA is throwing a wrench in the works of Tesla or because Tesla is downplaying the problem?
I'm sad because many lives are lost in road accidents and eventually self-driving vehicles will solve that problem. The longer the technology takes to develop, the more people will be killed or injured in the meantime.

That said, this recall has to do with "autopilot" software that is intended as driver assistance only. As far as I can see, all these accidents were caused by driver complacency. The name is misleading:

Transportation Secretary Pete Buttigieg recently stated in an interview with the Associated Press that he believes the name is misleading. "I don't think that something should be called, for example, an Autopilot, when the fine print says you need to have your hands on the wheel and eyes on the road at all times," he said.
 
  • #4
PeroK said:
Transportation Secretary Pete Buttigieg recently stated in an interview with the Associated Press that he believes the name is misleading. "I don't think that something should be called, for example, an Autopilot, when the fine print says you need to have your hands on the wheel and eyes on the road at all times," he said.
No politician has ever done something like that.
 
  • #5
Frabjous said:
No politician has ever done something like that.
They are an evil lot, politicians. Just think how much better it would be if - for example - we could vote for politicians and get the ones we wanted.
 
  • #6
PeroK said:
They are an evil lot, politicians. Just think how much better it would be if - for example - we could vote for politicians and get the ones we wanted.
I am not sure I would go that far, but the selection is certainly lacking.
 
  • #7
A couple of thoughts:
Article said:
“In certain circumstances when Autosteer is engaged, and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged, there may be an increased risk of a crash,” read the recall notice on the organisation’s website.

Meanwhile Tesla claims that its data from the final quarter of 2022 shows one crash for every 4.85 million miles driven with Autopilot engaged, compared to 1.4 million miles without it.

Tesla said: “The data is clear: the more automation technology offered to support the driver, the safer the driver and other road users.”
First, when read precisely, Tesla's response appears to me to be intentional misdirection/misleading. By a strict reading of their wording, many of the 1,000 crashes investigated by the NHTSA would not be included in their "with Autopilot engaged" stat if the autopilot disengaged moments before the crash, which appears to be the problem the NHTSA was concerned with.

Second, if the autopilot only operates in easy driving conditions, just being 3x "safer" would not be particularly impressive. One would need to compare accident rates in like-for-like situations.


PeroK said:
That said, this recall has to do with "autopilot" software that is intended as driver assistance only. As fas as I can see, all these accidents were caused by driver complacency. The name is misleading:
That's true, but: 1) Tesla created the name to be misleading on purpose. 2) Driver complacency is caused by the technology (with the name likely contributing), and it is therefore the responsibility of the company/technology to prevent it.

The end result of #2 might be that, due to human limitations, SAE Level 2 and 3 automation may not be possible to implement at an acceptable level of safety. The aviation industry is to some extent in that valley, with autopilots being about Level 3 in my estimation and being a contributing factor in some plane crashes.

Commercial pilots are of course highly trained and paid to act a certain way in the cockpit, which should help prevent automation-interaction problems. But on Aeroflot Flight 593 the pilot let his kids manipulate the controls because he thought the autopilot would stay in control. It's possible to literally wrestle control away from the autopilot, though, and the 15-year-old boy was strong enough to do just that. Not the sort of thing you'd expect from a highly trained pilot (well....Aeroflot, so...) but something you absolutely have to expect from a car driver.

Either way, there's been a lot of hype about AI lately that hasn't panned out. Elon Musk was very wrong about how easy self-driving would be to implement. It's anybody's guess how long it's going to take to get to an acceptable level of safety by itself and in human-machine integration.
 
  • #9
russ_watters said:
well....Aeroflot, so
Is evil capitalist pig-dog lie! (The Russian government denied this...until the CVR transcript was published.)

russ_watters said:
if the autopilot disengaged moments before the crash
It's like the old joke: "Maybe I should replace my tires. But how far will they take me if I leave them on?" "All the way to the scene of the accident."

Tesla seems to want to have it both ways: "Our cars are self-driving! Well, unless something goes wrong, and then it's your fault." I understand why they want this. I just doubt they will be able to get it.

A self-driving car that requires the full attention of the driver is not much help. I might even argue it is worse.
 
  • #10
It's a software update, not a recall. Do we write "Apple recalls 1 billion phones" every time they release a new OS?
Article said:
(NHTSA) concluded the firm’s divisive Autopilot feature was “insufficient to prevent misuse"
Guess we need to ban all cars then. Every single car can be misused.

A driving assistance that requires the full attention of the driver is still reducing accidents compared to unassisted driving that requires the full attention of the driver. I think that - and only that - should be the metric to look at. If it reduces injuries and deaths, it's good. And it's going to be even better in the future.

I don't think the name "autopilot" really matters. Some drivers would rely on it more than they should no matter what it's called, and people would write the same articles about it. Tesla is one of the most-shorted stocks despite a sky-high valuation, so writing bad articles about the company is worth billions to some people. And it's so easy! "1000 crashes" sounds alarming if you ignore that the typical rate would have been 3000 crashes, or whatever the number is.
The Washington Post is owned by Jeff Bezos, who has his own spaceflight company trying to compete with Musk's other large company. It's not a coincidence.
 
  • #11
mfb said:
It's a software update, not a recall.
How is a software bug different from a mechanical "bug"? I would rather drive a car with, say, a courtesy lamp switch that won't turn off than one with a software bug that may, say, apply the brakes for no good reason.

mfb said:
Guess we need to ban all cars then. Every single car can be misused.
I think the sense of the sentence is “insufficient to prevent [unintentional] misuse".

mfb said:
A driving assistance that requires the full attention of the driver is still reducing accidents compared to unassisted driving that requires the full attention of the driver.
It is clearly the goal and the intention but I don't think it has been proven yet. Other than the car manufacturer's data and analysis, do we have any source on that, for real-world conditions?

For example with the following statement:
Meanwhile Tesla claims that its data from the final quarter of 2022 shows one crash for every 4.85 million miles driven with Autopilot engaged, compared to 1.4 million miles without it.
Are every "miles" equivalent? Say on their way to work people use their Autopilot on the highway for, say, 35 miles, and then turn it off while entering the city where they drive an additional 10 miles; then no matter where the accidents happen, those numbers only mean the Autopilot doesn't make a difference. You have as much chance to have an accident on your way to work, with or without the Autopilot on.

But let's even say that they do 10 miles on the highway with the Autopilot for 10 miles in the city without it. Isn't a mile on the highway easier to monitor for the system than a mile in the city?

I am skeptical of such simplification, especially coming from the manufacturer itself.
 
  • #12
mfb said:
It's a software update, not a recall. Do we write "Apple recalls 1 billion phones" every time they release a new OS?
When ordered by a regulatory agency, it's a recall. It's not a term just made up by reporters; it's a legal term.
mfb said:
A driving assistance that requires the full attention of the driver is still reducing accidents compared to unassisted driving that requires the full attention of the driver.
That isn't inherently true, for three reasons:
1) You are assuming the feature itself actually works and 2) that it can be overridden by the driver if it doesn't. One of the more significant failure modes is "phantom braking", which can't be overridden. And 3), like I said, the manufacturer is responsible for the driver using many "assistance" features correctly.
mfb said:
I think that - and only that - should be the metric to look at. If it reduces injuries and deaths, it's good.
That's fine as a position (not sure if I agree), but in this case there's no way to know if that's true because Tesla refuses to share their data. That should be illegal.
mfb said:
I don't think the name "autopilot" really matters.
It'll absolutely matter to the juries.

mfb said:
Tesla is one of the most-shorted stocks despite a sky-high valuation,
High value means a lot of room to drop.
 
  • #13
russ_watters said:
And 3), like I said, the manufacturer is responsible for the driver using many "assistance" features correctly.
About that. According to the wiki article on it, Full Self-Driving is claimed to be a Level 5 system in beta testing, which therefore has to be treated as a Level 2 system. That's such a clear contradiction I have no idea how that could even be possible.

https://en.m.wikipedia.org/wiki/Tesla_Autopilot
 
  • #14
jack action said:
I think the sense of the sentence is “insufficient to prevent [unintentional] misuse".
That still applies to every car. Misuse that leads to accidents is almost never intentional.
jack action said:
Other than the car manufacturer's data and analysis, do we have any source on that, for real-world conditions?
I haven't seen that. The fact that no one claims higher accident rates makes me think it's likely reducing accident rates. If there were any hint of higher accident rates you'd see that discussed everywhere, all the time. Instead you get raw accident counts without any context or other misleading numbers.

It looks like Tesla counts all miles driven. It's likely that highway miles use Autopilot more often than miles in more difficult driving conditions, but the statistics only count accidents where "an airbag or other active restraint deployed" - bumping into a car in a parking lot is unlikely to do that, while almost all accidents on highways are included. I'm not sure which effect wins here. Per distance driven, rural areas have higher fatality rates than urban areas, for example.
russ_watters said:
When ordered by a regulatory agency, it's a recall. It's not a tem just made up by reporters, it's a legal term.
Does that legal term not include calling the product back to the manufacturer? This dictionary thinks so: "a public call by a manufacturer for the return of a defective or esp. unsafe product"
That makes e.g. this Hyundai/Kia fire risk a recall - you should go to your dealer and get it repaired - but no Tesla had to return to the manufacturer here.
russ_watters said:
That isn't inherently true, for three reasons:
1) You are assuming the feature itself actually works and 2) can be overridden by the driver if it doesn't. One of the more significant failure modes is "phantom braking", which can't be overriden. And 3), like I said, the manufacturer is responsible for the driver using many "assistance" features correctly.
As far as I can tell, it seems to work well enough to reduce accidents. Requiring a feature to never malfunction is not reasonable. You can override the braking by pressing either pedal.
What do you expect Tesla to do, besides working on reducing the error rate? Remove driver assistance completely? We'll see crashes go up, and people will blame Tesla for that as well. If every possible action besides an impossible 100% flawless system produces negative articles, how much weight should we give to them?
russ_watters said:
High value means a lot of room to drop.
Exactly, that means people can make billions from stories like the WaPo article.
russ_watters said:
By a strict reading of their wording, many of the 1,000 crashes investigated by the NHTSA would not be included in their "with Autopilot engaged" stat if the autopilot disengaged moments before the crash, which appears to be the problem the NHTSA was concerned with.
That would be absurd, and of course Tesla doesn't do that. From their safety report:
To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.
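
For what it's worth, the quoted rule is simple enough to state in a few lines of code. This is just my illustration of the rule as quoted, not anything from Tesla:

Python:
# Attribution rule quoted above: a crash counts against Autopilot if it was
# still engaged at impact or was deactivated within 5 seconds before impact.
# Hypothetical data format; times in seconds.

AP_WINDOW_S = 5.0

def counts_as_autopilot_crash(impact_time, ap_deactivation_time=None):
    """ap_deactivation_time is None if Autopilot was engaged at impact."""
    if ap_deactivation_time is None:
        return True
    return 0.0 <= impact_time - ap_deactivation_time <= AP_WINDOW_S

print(counts_as_autopilot_crash(impact_time=100.0, ap_deactivation_time=98.0))  # True
print(counts_as_autopilot_crash(impact_time=100.0, ap_deactivation_time=90.0))  # False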
 
  • #15
mfb said:
The fact that no one claims higher accident rates makes me think it's likely reducing accident rates.
What about the fact that no one claims lower accident rates? Combined with the fact that no one claims higher accident rates, wouldn't this mean it most likely does not affect the accident rate?

mfb said:
Remove driver assistance completely? We'll see crashes go up,
Nobody has yet shown that they went down with driver assistance, so how can one say they would go up?

mfb said:
Does that legal term not include calling the product back to the manufacturer?
No. The only "legal" definition that counts is the one from NHTSA:
https://www.nhtsa.gov/recalls said:
A recall is issued when a manufacturer or NHTSA determines that a vehicle, equipment, car seat, or tire creates an unreasonable safety risk or fails to meet minimum safety standards. Most decisions to conduct a recall and remedy a safety defect are made voluntarily by manufacturers prior to any involvement by NHTSA.

Manufacturers are required to fix the problem by repairing it, replacing it, offering a refund, or in rare cases repurchasing the vehicle.
"Manufacturers are required to fix the problem" doesn't imply that car owners must go to the dealership.

I had a construction heater once that had a recall for a safety defect. The manufacturer delivered to me a free device to connect directly between the power outlet and the heater. I didn't need to bring my heater anywhere even if it was a "recall".
 
  • #16
mfb said:
That still applies to every car. ....The fact that no one claims higher accident rates makes me think it's likely reducing accident rates.
As much of a reach and as specious as that is, here's a source claiming Tesla's accident rate is the worst amongst 30 studied brands:
https://www.forbes.com/sites/steveb...ident-rate-of-any-auto-brand/?sh=7f5affd2894d

And again, it's not as simple as you are implying.

mfb said:
Does that legal term not include calling the product back to the manufacturer? This dictionary thinks so: "a public call by a manufacturer for the return of a defective or esp. unsafe product" That makes e.g. this Hunyai/Kia fire risk a recall - you should go to your dealer and get it repaired - but no Tesla had to return to the manufacturer here.
You're trying to split a hair that doesn't exist. It's a regulator-required repair. Sometimes that includes/can be a software fix. Yep, Tesla can do a software fix over the air, whereas some need to be done by the dealer/manufacturer at their shop. That doesn't make it not a "recall".

mfb said:
For all I can tell, it seems to work well enough to reduce accidents.
Again, that's speculation and it seems to go against the NHTSA and the source I gave.
mfb said:
Requiring a feature to never malfunction is not reasonable.
Nobody has said that.
mfb said:
You can override the braking by pressing either pedal.
That makes no sense, so I'm pretty sure it isn't true. I've driven two cars with automatic emergency braking and neither of them can be overridden by pushing the gas pedal (that's why the feature exists!)....and pushing the brake would just reinforce/take over for it (you cancel it by doing the action the feature would do). In fact, that appears to be part of the algorithm for deciding to invoke the feature -- if you're accelerating when you should be braking, that can trigger the intervention. If you're already braking, it doesn't get triggered. But you can't cancel the braking and force an accident.
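
To make the disagreement concrete, here is a rough sketch of the decision with the disputed design choice made explicit. The structure and behaviour flags are mine, not any manufacturer's actual logic:

Python:
# Given that the car has already predicted a collision, which driver inputs
# suppress automatic braking? Purely illustrative; not any real system's code.

def aeb_applies_brakes(collision_predicted, driver_braking,
                       driver_accelerating, accelerator_cancels=False):
    if not collision_predicted:
        return False
    if driver_braking:
        return False   # the driver is already doing what AEB would do
    if driver_accelerating and accelerator_cancels:
        return False   # the behaviour described for Tesla above
    return True        # the behaviour I've seen: the gas pedal does not cancel it

# Driver flooring the accelerator while a collision is predicted:
print(aeb_applies_brakes(True, False, True, accelerator_cancels=False))  # True
print(aeb_applies_brakes(True, False, True, accelerator_cancels=True))   # False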
mfb said:
What do you expect Tesla to do, besides working on reducing the error rate?
Not beta test software on the road, but instead test it under closed/sandboxed conditions and have specific features approved by the NHTSA/DOT prior to implementation.

Also: Not do false advertising and not provide drivers inherently self-contradictory features and instructions.
mfb said:
Exactly, that means people can make billions from stories like the WaPo article.
What conspiracy theory are you claiming here? That Jeff Bezos is manufacturing negative press to assist short sellers? As wild as that speculation is, it's not relevant to the engineering issue. Please keep it out of the engineering forum.
mfb said:
That would be absurd, and of course Tesla doesn't do that. From their safety report:
Thanks for finding that....given who/what we're talking about, the absurd can't be assumed to not be the case.
 
  • #17
Reduction in accidents per mile is probably too simplistic a metric. Where will autopilot do best? Stop and go traffic on the highway with slick roads, as after a snowfall. These accidents are common, but usually are fender-benders.

If autopilot reduces those accidents but also increases the number of other, more serious accidents, the total accident rate may well go down even as the cost in dollars, injuries and fatalities goes up.
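
A toy example of that point (all numbers invented): the crash count drops while a severity-weighted cost rises.

Python:
# Hypothetical crash counts per 100 million miles: (fender benders, serious crashes)
without_assist = (100, 10)
with_assist = (40, 12)

SERIOUS_WEIGHT = 50   # arbitrary relative cost of a serious crash vs a fender bender

def total_crashes(counts):
    return sum(counts)

def weighted_cost(counts):
    fender_benders, serious = counts
    return fender_benders + SERIOUS_WEIGHT * serious

print(total_crashes(without_assist), total_crashes(with_assist))   # 110 vs 52: count goes down
print(weighted_cost(without_assist), weighted_cost(with_assist))   # 600 vs 640: cost goes up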

The argument has been made that ultimately self-driving cars will be safer, and less safety now while we work out the bugs is just the price we have to pay. I don't buy this. Perhaps if this is the only way, but not if it is simply the fastest way or cheapest way.
 
  • #18
Some additional info that may be useful in the discussion, regarding automatic emergency braking. This is a Level 0 feature, which means it is invisible to the driver except when it activates in a potential emergency. It's a pure safety feature, not an assistance feature.

The two cars I've driven with automatic emergency braking are a 2019 Kia Stinger and 2016(?) Audi A3. Both have specific but different failure modes:

The Audi fails sometimes when parallel parking in the city. A passing car or pedestrian can set it off. My interpretation is that the system does not know the constraints of parallel parking so it doesn't know the extent to which the car can be expected to move. In other words, I'm not going to hit that pedestrian because he's on the sidewalk and I'm not going to back onto the sidewalk. He knows it, I know it, but the automatic emergency braking system doesn't. This failure mode is inherently incapable of causing an accident as far as I can tell. My reaction when it has happened has been confusion and stopping to figure out what's going on --the first time it activated I thought I'd hit something. I don't think I've stomped on the gas pedal in response, but even if I had, my expectation is it's unlikely to have overridden the feature, since that's the whole point.

The Stinger's failure mode is potentially hazardous: it will sometimes activate in response to crossing traffic. Specifically, when a car in front of me is turning right, it will activate because it appears I could hit that car as I'm passing it. My interpretation is that it is one-dimensional: it detects distance and closing speed and probably also gas/brake usage, and activates if the algorithm predicts a crash could occur. But it doesn't recognize if the obstacle is moving perpendicular to the car's movement direction and thus will be out of the way before impact. This could cause an accident if someone is following me close behind, but not likely a serious accident because these situations only occur on local roads at moderate speed.
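
A sketch of that one-dimensional limitation, with invented numbers (this is my guess at the behaviour, not Kia's actual algorithm): a trigger that only looks at range and closing speed fires on a car that is turning out of the lane, while one that also checks lateral motion does not.

Python:
TTC_THRESHOLD_S = 1.5   # brake if predicted impact is closer than this (invented)

def one_dimensional_trigger(gap_m, closing_speed_mps):
    # Only range and closing speed: no idea whether the obstacle is leaving the lane.
    return closing_speed_mps > 0 and gap_m / closing_speed_mps < TTC_THRESHOLD_S

def trigger_with_lateral_check(gap_m, closing_speed_mps, lateral_speed_mps,
                               lane_half_width_m=1.8):
    if not one_dimensional_trigger(gap_m, closing_speed_mps):
        return False
    ttc = gap_m / closing_speed_mps
    # Suppress braking if the obstacle will have cleared my path before impact.
    return abs(lateral_speed_mps) * ttc < lane_half_width_m

# Car ahead turning right at 4 m/s sideways while I close on it at 10 m/s from 12 m away:
print(one_dimensional_trigger(12, 10))        # True  -> phantom braking
print(trigger_with_lateral_check(12, 10, 4))  # False -> it will be out of the way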

Tesla's failure, on the other hand, is said to occur at highway speed when a truck passes in the other direction. I suspect it's due to being part of a much more complex self-driving algorithm as opposed to a stand-alone and simple emergency braking feature.
 
  • #19
jack action said:
What about the fact that no one claims lower accident rates?
Tesla does.
russ_watters said:
As much of a reach and specious as that is, here's a source claiming Tesla's accident rate is the worst amongst 30 studied brands:
A study that doesn't consider how far drivers go in these vehicles is useless. And even if it did, it wouldn't tell us anything about the question discussed here: whether the assistance systems lower or raise the accident rate (and given the attention Tesla gets for even the most minor things, I can't see how the latter could be the case and stay unnoticed).
russ_watters said:
That makes no sense so I'm pretty sure it isn't true.
You incorrectly claimed that Tesla has an autopilot feature that cannot be overridden, and you heavily implied that to be a problem. You learned that it can be overridden, and now you don't believe it because you think it wouldn't make sense?
If that's your approach to this discussion - just claiming things because you want them to be true - then there is no point in further discussion with you in this thread.
 
  • #20
russ_watters said:
One of the more significant failure modes is "phantom braking", which can't be overriden.
mfb said:
You can override the braking by pressing either pedal.
After verification, you are both kind of right. According to this source:
https://teslamotorsclub.com/tmc/posts/1725610/ said:
Copied from the 7.1 car manual p. 86 ...

When Automatic Emergency Braking has reduced the driving speed by 25 mph (40 km/h), the brakes are released. For example, if Automatic Emergency Braking applies braking when driving at 56 mph (90 km/h), it releases the brakes when the speed has been reduced to 31 mph (50 km/h).
Automatic Emergency Braking operates only when driving between 5 mph (8 km/h) and 85 mph (140 km/h).
Automatic Emergency Braking does not apply the brakes, or stops applying the brakes, in situations where you are taking action to avoid a potential collision. For example:
• You turn the steering wheel sharply.
• You press the accelerator pedal.
• You press and release the brake pedal.
• A vehicle, motorcycle, bicycle, or pedestrian, is no longer detected ahead.
So the AEB is easily overridden, and it will also "override itself" after a 25 mph decrease. The problem is that if you are driving at 30 mph and the AEB engages for no good reason, it will only shut off by itself at 5 mph - which is basically a full stop and should happen rather quickly (##\approx## 1.2-1.5 s). Even assuming the driver can react within that time frame, what will he do? He doesn't see any problem ahead and doesn't understand why the car is reacting, so why would he steer one way or the other? Why would he push the accelerator? Press and release the brake pedal just to cancel it? That seems counterintuitive and requires extra preparedness from the driver for that kind of situation - a "Yah, yah, that's «normal». They all do that." kind of problem. And if your reflexes are trained for that, then you will deactivate the AEB even when it is needed, unintentionally.
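
(For the time estimate: assuming a hard-braking deceleration somewhere around 0.75-0.95 g - my assumption, not a published spec - a 25 mph reduction is ##\Delta v \approx 11.2## m/s, so ##t = \Delta v / a \approx## 1.2-1.5 s.)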

It seems clear to me that, practically, no one would - or could - ever override the AEB before a car pile-up happens. The only solution is to have all the following cars also using AEB!

mfb said:
jack action said:
What about the fact that no one claims lower accident rates?
Tesla does.
I bet they do. I'm talking about an unbiased third party.
 
  • #21
Is it an "assistance system" as some call it, an "autopilot" as it is marketed, or "full self-driving" as the thread title suggests?
 
  • #23
Lexus’ Driver Monitoring Is the Only One That Works Right, Says IIHS

The IIHS tested 14 monitoring systems from nine different car companies. Of those, only the Lexus Teammate system—examined on the Lexus LS—earned an Acceptable rating. Two were rated Marginal, while the other 11 got Poor ratings.
“Most of them don’t include adequate measures to prevent misuse and keep drivers from losing focus on what’s happening on the road.”

The issues varied between systems. Some failed to recognize when a driver wasn't watching the road, some weren't aggressive enough in warning distracted drivers, and some didn't warn distracted drivers at all. There were even those that let the driver unbuckle their seatbelt with ADAS in action.

When drivers ignore such distraction warnings while using ADAS, the IIHS feels certain emergency procedures are necessary to prevent crashes. For instance, if a driver is unresponsive to warnings for 35 seconds, cars should initiate a slowdown. If that slowdown procedure is ignored even further, the system should alert emergency personnel and prevent the ADAS from being restarted. The idea is that if drivers keep ignoring warnings, they're either hurt, in distress, or just flat-out misusing the system. However, of all 14 systems, only GM's Super Cruise had all five of the IIHS' desired procedures
“Some drivers may feel that partial automation makes long drives easier, but there is little evidence it makes driving safer,” said Harkey. “As many high-profile crashes have illustrated, it can introduce new risks when systems lack the appropriate safeguards.”
However, Harkey does provide a silver lining: Since none of these systems are good overall and all have their unique advantages, they can be fixed with simple software updates. Until then, though, Lexus is the only brand that monitors its drivers well enough.
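
A minimal sketch of the escalation the IIHS describes - warn, slow down, then alert emergency services and lock out the ADAS. Only the 35-second warning threshold comes from the article; the other number and the state names are placeholders I made up:

Python:
WARNING_TIMEOUT_S = 35    # from the article
SLOWDOWN_TIMEOUT_S = 30   # placeholder

def escalation_state(seconds_unresponsive):
    if seconds_unresponsive < WARNING_TIMEOUT_S:
        return "warn_driver"
    if seconds_unresponsive < WARNING_TIMEOUT_S + SLOWDOWN_TIMEOUT_S:
        return "initiate_slowdown"
    return "alert_emergency_services_and_lock_out_adas"

for t in (10, 40, 80):
    print(t, escalation_state(t))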
 
