Do you feel safer with self-driving cars on the road?

Summary:
The discussion centers on the safety perceptions of self-driving cars compared to human drivers. Participants express skepticism about the current capabilities of AI in anticipating complex driving situations, emphasizing that while self-driving cars may statistically reduce accidents, they are not yet widespread enough to enhance overall safety. Concerns are raised about the limitations of sensors and the unpredictability of human behavior, which can lead to accidents that AI may not effectively manage. Some participants look forward to future advancements in self-driving technology, believing that with time, these vehicles could significantly improve road safety. Ultimately, the consensus leans towards cautious optimism, with many agreeing that while self-driving cars may be safer in theory, they do not yet feel comfortable relying on them.

Do you feel safer with self-driving cars on the road?

  • Yes

    Votes: 31 41.3%
  • No

    Votes: 37 49.3%
  • No opinion

    Votes: 7 9.3%

  • Total voters
    75
  • #61
jack action said:
Something in the order of 99.99...% and more.

More than 30,000 deaths and probably hundreds of thousands of injuries last year.

[Chart: Motor vehicle deaths in the US]
That is a lot of pain and suffering.
 
  • Like
Likes ISamson, russ_watters, Stavros Kiri and 1 other person
  • #62
jack action said:
To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.
That is an illusion based on most potentially fatal mistakes actually not being fatal. That you arrive safely is correlated, but not equivalent to you driving safely and taking good decisions. For example, failing to pay proper attention when driving across a seldomly crossed zebra crossing. This is a mistake that is going to go by completely unnoticed until it doesn’t. It does not make the mistake any less of a mistake.

jack action said:
You are also introducing that self-doubt I was talking about. Regardless of what seems to be a good driving record for yourself, you seem to still prefer not making decisions, leaving it to a train operator or AI. What else do you think others can do better than you? Where will you stop? Because I can assure you that there is always someone that can do things better than you, probably even in your field of expertise.
What you describe I would describe as severe hubris. There is also a clear difference in drawing lines. It makes sense for me to continue doing what I do because I add (at least in some part) to the research in my field. Are there people better than me? Sure, but they cannot do everything themselves. This is clearly not the case with autonomous vehicles. Or with cars in general - you do not need more than one driver (or zero in the case of the autonomous car). More drivers will not make the car safer or accomplish its task better.

When it comes to governing, I believe the last few years have clearly shown that a main flaw in democracy is that people are easy to influence with feelings and emotions based on false or invented facts. If sufficiently advanced and benevolent, I would be prepared to hand government over to an AI.

You are talking about people handing over their decisions to a machine, removing their own decisions, but the truth is that many people already hand many of their decisions over to others. The only difference being that these others happen to be other humans (most of the time).

jack action said:
It might sound cliché, but the only way you know there are good decisions is because there are bad decisions.
I think this is nonsense to be honest. The way that you weight decisions is by predicting and weighting outcomes against each other. You do not need to do it to know that hitting on 20 when the dealer shows a 6 is a bad decision.

jack action said:
But do it for the right reasons.
I would argue that saving human lives is a good reason. In particular if the only cost is relieving humans of monotonous, error-prone decisions. The car is not deciding where you should go. It is removing a monotonous task that most people's brains struggle with.
 
  • Like
Likes russ_watters and Stavros Kiri
  • #63
Elevators/lifts are generally considered to be better than walking up a ten-story building.
 
  • #64
Greg Bernhardt said:
It was a drunk driver going 70 through a red in a 35mph road. Would a machine allow that?
Only if it malfunctioned. Could it happen? Yes (and it will happen), but people "malfunction" a lot more often! ... [if not every day, all the time - some people, at least]
Stop sign and red light violations are almost more common than non-violations! Then there is speeding and drunk driving ...
[I had some good video links, showing statistics - if I can find them]

Machines normally wouldn't do any of that. Code is explicit.
[That alone drops the chances of accidents by at least 90%, I think ...]
 
  • Like
Likes Orodruin, ISamson and rootone
  • #65
Code malfunctions can be corrected; drunk drivers, not so easily.
 
  • Like
Likes ISamson and Stavros Kiri
  • #66
jack action said:
This is a very pessimistic view based on irrational fear. I'm pretty sure that most people that take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.
You have a good point there, and in that whole post of yours! I think it's a very wise post, although I vote for self-driving cars, for various reasons.
But I think one also has to look at the statistics for accidents that do happen (on a given day) versus the number of mistakes and violations (huge, on that same day) that did or did not cause an accident. You in fact want to eliminate all of that, regardless of the 99.99...% that you perhaps correctly refer to. And machines almost do that.
 
  • #67
jack action said:
@Greg Bernhardt , @Orodruin , @BillTre :

Greg's post evoked in me a feeling that I expressed before on this forum and I just realized it was in this particular thread, 3 weeks ago, in post #29.

What are the consequences of a human being not making any decision? What is the point of living?

I know the subject of this thread is self-driving cars, but it seems nobody wants to make decisions anymore. A way to disempower oneself and others. Sure, it looks nice when you look at people making bad decisions. But how are we supposed to learn to differentiate good from bad if we make fewer and fewer decisions? How will we know if the machine made the right decision if we don't even develop our own judgement? Are we going to doubt ourselves all the time? Should AI decide who will run the country? After all, it will probably make a better decision than the average voter, right? When will we end this journey where we say: «People shouldn't be allowed to do that»?
I really think you are overthinking this. Mostly what we want machines to do for us are the things that are too hard or boring or dangerous. It doesn't get in the way of us living, it frees us to do the living we really want to do.
jack action said:
It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right?
No, actually it really is a very significant cause of death for humans in developed countries like the US. It's higher than 1% overall and depending on your demographic, can be very much higher than 1%.
https://www.cdc.gov/injury/wisqars/overview/key_data.html

Cancer and heart disease are far and away the most significant risks of death, but since they almost exclusively happen to old people, for every other age group except newborns, "unintentional injury" is the leading cause of death, with car accidents making up the largest fraction of that (from above link).
https://www.cdc.gov/injury/images/lc-charts/leading_causes_of_death_age_group_2015_1050w740h.gif

However, narrowly there is a potential salient point here:
I'm more afraid of people losing their ability to make good decisions in general than the death toll caused by car accidents due to bad decisions.
This is indeed a potential downside and does happen due to too much reliance on automation. Many plane crashes (example: Air France 447) happen because of over-reliance on automation causing pilots to lose their skills or misperceive what the computers are telling them. At the same time, one can imagine the increase in self-driving cars correlating with an increase in alcohol abuse and alcoholism, since removing the need to drive home removes one incentive to behave responsibly. These unintended consequences may be hard to identify, but that's largely because they are much less common/significant than the primary consequence (the increased safety). So whereas automation failures cause a much more significant fraction of plane crashes today than they used to, overall there are far fewer plane crashes and fewer resulting deaths. The same positive trade-off will almost certainly be true of self-driving cars.
jack action said:
I'm pretty sure that most people that take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened».
This is an improper way to look at the statistics: you are ignoring how often you play the game. Your chances of winning the lottery might be 1 in a million, but if you buy half a million lottery tickets, your chances of winning are 50%. In other words, your individual odds of dying on any particular car ride are very small, but you take a lot of car rides, so your annual or lifetime risk is fairly significant.
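To make the "how often you play the game" point concrete, here is a minimal back-of-envelope sketch. The per-trip probability and trip counts are invented purely to show the compounding effect; they are not real statistics.

```python
# Per-trip vs. cumulative risk: the numbers are hypothetical, chosen only to
# illustrate the "how often you play the game" effect, not real statistics.
p_per_trip = 1e-7            # assumed chance of a fatal outcome on a single trip
trips_per_year = 2 * 250     # assumed two commute legs on ~250 days
years_of_driving = 50        # assumed driving lifetime

def cumulative_risk(p, n):
    """Probability of at least one bad outcome over n independent trips."""
    return 1 - (1 - p) ** n

print(f"single trip: {p_per_trip:.6%}")
print(f"one year:    {cumulative_risk(p_per_trip, trips_per_year):.4%}")
print(f"lifetime:    {cumulative_risk(p_per_trip, trips_per_year * years_of_driving):.2%}")
```

With these made-up inputs, a risk that is negligible on any single trip compounds to a few tenths of a percent over a driving lifetime, which is the whole point of the lottery analogy.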
jack action said:
It might sound cliché, but the only way you know there are good decisions is because there are bad decisions.
Well that's just silly. You don't need to be a genius to know that running a red light is dumb/dangerous and you don't need to test it either. I don't need to actually [chooses random object in field of view] pull a curtain rod off my wall and stab myself with it to know that would be a dumb thing to do. Humans are plenty smart enough to weigh decisions they have never taken.
 
Last edited:
  • Like
Likes Stavros Kiri
  • #68
Orodruin said:
That is an illusion based on most potentially fatal mistakes actually not being fatal. That you arrive safely is correlated, but not equivalent to you driving safely and taking good decisions. For example, failing to pay proper attention when driving across a seldomly crossed zebra crossing. This is a mistake that is going to go by completely unnoticed until it doesn’t. It does not make the mistake any less of a mistake.
I live in a city where there is a zoo. About 50 years ago, a lion escaped and spread terror to the point where it was shot to death. There is still a zoo today, they still have lions. Is it a mistake on my part to not check for lions on my porch before getting out of my house?

Making decisions is all about probability and, yes, not paying attention when driving across a seldomly crossed zebra crossing is NOT a mistake, from my point of view. The proof lies in the results. But the chances are always there and the hit is inevitable given time. That is why I say about that accident «That person was unlucky, it usually doesn't happen» and not to the thousands of other people who did not have an accident «You were lucky, you could've hit a zebra!» I can assure you that this is how AI would make its decisions as well.
Orodruin said:
When it comes to governing, I believe the last few years have clearly shown that a main flaw in democracy is that people are easy to influence with feelings and emotions based on false or invented facts. If sufficiently advanced and benevolent, I would be prepared to hand government over to an AI.
That is scary. The solution to that problem is to raise people who can make better decisions, not to replace them with machines. Is your solution to poorly educated people to replace them with machines that have better knowledge? Humans are NOT a lost cause. Otherwise, there is no point in keeping humans alive.
Orodruin said:
You are talking about people handing over their decisions to a machine, removing their own decisions, but the truth is that many people already hand many of their decisions over to others. The only difference being that these others happen to be other humans (most of the time).
That is my point: we are on a dangerous path. One where the common man is seen as an unfit animal, unable to care for itself. I don't believe that. I have always felt that we should move towards having more people able to make decisions in all aspects of their lives and thus contribute to society in general, not just wait for someone (or something) else to decide. That is what democracy is.
Orodruin said:
The way that you weight decisions is by predicting and weighting outcomes against each other. You do not need to do it to know that hitting on 20 when the dealer shows a 6 is a bad decision.
Making decisions is often way more complicated than that. Ask the people of Florida if they should evacuate or not when a hurricane is announced. Not an easy decision to make. How many times are you going to evacuate the entire state «for nothing» before you won't? And when you won't do it, it may be the time you should have. Welcome to life. Can AI do better? I don't think so. The way nature does it is by diversity: some go, some stay, at least one group survives. The «good» decision is unpredictable.
Orodruin said:
I would argue that saving human lives is a good reason.
No lives are ever saved. The best you can do is extend one. On the greater scheme of things, I still fail to see what improvement it does to a form of life, human race or any other. I guess it is these «feelings and emotions» that you were talking about that influence you. I wonder if you would appreciate a machine making decisions for you with that cold and objective attitude? After all, I'm a human being and I already have those thoughts. You'd better hope I won't be the programmer behind the next generation of AI.
russ_watters said:
Mostly what we want machines to do for us are the things that are too hard or boring or dangerous. It doesn't get in the way of us living, it frees us to do the living we really want to do.
Oh! I love this one! What is it we really want to do? What if what I really want to do is driving? Will I be allowed or will I be forbidden to do it because it is considered too dangerous by many, too afraid I will kill them? Will I be considered abnormal for thinking driving is fun and not boring? What am I supposed to really want to do?

Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
 
  • #69
russ_watters said:
But heck, I bet the first few times the engineers ran simulations of the accident after the fact, the computer reported to them that no accident happened.

Star Trek - Season 4 Ep 5 "Remember Me"

Tesla, what was that noise?
Explosive decomposition of passenger compartment and passenger.
Cause?
Design Flaw: there appears to be no structure above the mirrors.
 
  • Like
Likes Stavros Kiri and russ_watters
  • #70
jack action said:
Oh! I love this one! What is it we really want to do? What if what I really want to do is driving?
Is that true or are you just asking for the sake of argument? We're not discussing an abstract hypothetical here, we're discussing a likely near-term reality about a daily task people do that most would rather not. Really. As @Greg Bernhardt said earlier, I can't wait for the time when I don't have to do the mind-numbing task of driving to and from work in traffic for an hour+ a day. I'd much rather be taking a nap, watching TV, reading a book, using PF, etc.
Will I be allowed or will I be forbidden to do it because it is considered too dangerous by many, too afraid I will kill them?
1. There is a wide gulf between "possible" and "mandatory". For a very long time - perhaps until after we're both dead and perhaps forever - self-driving will be a feature we can choose to buy and use or not. Like a washing machine. I would bet money that most who can afford it will choose to buy it (and get rewarded with lower insurance rates). So I don't think the scenario you present (of self-driving being mandatory) is a reflection of any potential reality that is on the table.
2. I like driving too. Sometimes. In certain contexts. So perhaps instead of commuting to work (boring, stressful), I'll go to a test track on weekends and scratch the itch that way.
Will I be considered abnormal for thinking driving is fun and not boring? What am I supposed to really want to do?
That's meaningless and irrelevant here. Mature, free thinking adults don't have to ask if what they want to do is considered "abnormal". The question is simply what do you and most people want.
Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
No, I haven't. That's just silly. Why would I choose to do something more boring if there was something less boring and more enjoyable I could do? I don't go thinking to myself: "Hmm...should I watch a movie right now or not turn on the TV and stare at the blank screen?"
 
  • Like
Likes Orodruin
  • #71
jack action said:
Making decisions is all about probability and, yes, not paying attention when driving across a seldomly crossed zebra crossing is NOT a mistake, from my point of view.
This is just silly. Lions do not cause tens of thousands of deaths in the US every year. Besides, regardless of whether you consider it a mistake or not, it is actions like that that cause accidents, and autonomous cars will not make them to the same extent as human drivers. (Add to that the limited cognitive abilities of humans, such as only being able to see in one direction at a time.)

jack action said:
No lives are ever saved. The best you can do is extend one. On the greater scheme of things, I still fail to see what improvement it does to a form of life, human race or any other.
Cute. Let's stop using medicine and surgery, it does not save lives anyway. Would you accept your doctor saying, "this cyst will kill you if left untreated, but you will die from something else anyway, so we won't treat it"? Now who is the cynic here?
 
  • Like
Likes ISamson
  • #72
jack action said:
It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right?
That's the problem, isn't it?
 
  • #73
Greg Bernhardt said:
A mother and her 11 month old child were t-boned and killed at an intersection a couple blocks from me yesterday.
Orodruin said:
The obvious point being made was that it would not have happened if the car was autonomous.
The obvious point being made is not obvious at all...
Unless, what you actually meant was...

It would not have happened if the mother's car was autonomous, and could have recognized the fact...
Greg Bernhardt said:
It was a drunk driver going 70 through a red in a 35mph road.
... and would have stopped her car, before the intersection collision.

Yes...
Greg Bernhardt said:
Would a machine allow that?
Both... a drunk driver going 70 through a red in a 35mph road, and stopping another car before an intersection collision, even if a light were green...

Everybody seems to be obsessed with the drunk driver here, but remember... it's a "two-way street".
 
  • Like
Likes Greg Bernhardt
  • #74
Brilliant... :thumbup:
DaveC426913 said:
Star Trek - Season 4 Ep 5 "Remember Me"

Tesla, what was that noise?
Explosive decomposition of passenger compartment and passenger.
Cause?
Design Flaw: there appears to be no structure above the mirrors.
 
  • Like
Likes Stavros Kiri
  • #75
OCR said:
Everybody seems to be obsessed with the drunk driver here, but remember... it's a "two-way street".
I believe you are wrong. It would have been enough for the drunk's car to be autonomous. Replacing that car alone would have been sufficient to avoid the accident. Of course, if you replaced all cars with autonomous ones, the mother's would have been too.

What would happen if you just replaced the mother's car is a different question. Would the car be able to avoid the collision? I believe the answer to that is that it would be more likely to avoid the collision than a human driver. Autonomous cars are taught to recognise and act on threats that are out of the ordinary (such as a driver running a red light) as well as obeying traffic rules themselves. The advantage in favour of the autonomous car comes from its ability to perceive the entire traffic situation, not just what the driver happens to be looking at.
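For illustration only, here is a toy sketch of the kind of continuous, all-directions check an automated system can run over every tracked object. The object model, numbers and the 2-second threshold are assumptions made up for this sketch, not any vendor's actual logic.

```python
from dataclasses import dataclass

# Toy illustration of why "seeing the whole traffic situation" matters; the
# track representation and threshold are invented for this example.
@dataclass
class Track:
    distance_m: float         # distance to the predicted conflict point
    closing_speed_mps: float  # speed at which the object approaches that point

def time_to_collision(track: Track) -> float:
    """Seconds until paths cross; infinite if the object is not closing in."""
    if track.closing_speed_mps <= 0:
        return float("inf")
    return track.distance_m / track.closing_speed_mps

def should_emergency_brake(tracks, threshold_s: float = 2.0) -> bool:
    # Evaluated continuously for *every* tracked object, including a red-light
    # runner approaching from the side that no single human glance would catch.
    return any(time_to_collision(t) < threshold_s for t in tracks)

# Example: cross traffic 40 m away closing at 30 m/s (~70 mph) -> TTC ~1.3 s
print(should_emergency_brake([Track(40.0, 30.0), Track(120.0, -5.0)]))  # True
```

The point is not this particular rule, but that such a check runs on all tracks at once, while a human can only attend to one direction at a time.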
 
  • Like
Likes Stavros Kiri and russ_watters
  • #76
I think some here are in danger of committing the Nirvana fallacy - throwing out the whole solution because it is not perfect.

There are surely some specious scenarios that an automated system won't be able to handle. Some of these scenarios won't be handled much better by humans.

But the key is that the vast majority of accidents that have been occurring in real life situations are avoidable by an automated system better than by a human.

OK, granted that has yet to be borne out by statistics, but still, let's not make the mistake of being penny-wise but pound-foolish when it comes to accidents averted and lives saved.
 
  • Like
Likes Stavros Kiri, russ_watters and Orodruin
  • #77
I'd also argue that it is not so much about people making "mistakes"; it is more that the accepted pace of driving has exceeded our reaction times.

As everyone is aware, at 70 mph a car covers about 100 feet every second. Add to that perception delay, and decision and reaction delay.

There's no way humans can avoid fatal accidents at this pace - even if they make no mistakes. AIs can.
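A quick back-of-envelope check of that point. The human and automated delay figures below are assumed for illustration (typical textbook-style values), not measurements.

```python
# Back-of-envelope check of the "100 feet every second" figure; the 1.5 s
# perception-plus-reaction delay and 0.1 s automated latency are assumptions.
speed_mph = 70
speed_fps = speed_mph * 5280 / 3600      # ~102.7 ft/s
human_delay_s = 1.5                      # assumed perception + reaction time
automated_delay_s = 0.1                  # assumed sensor-to-brake latency

print(f"{speed_mph} mph = {speed_fps:.1f} ft/s")
print(f"Human:     {speed_fps * human_delay_s:.0f} ft travelled before braking starts")
print(f"Automated: {speed_fps * automated_delay_s:.0f} ft travelled before braking starts")
```

On these assumed numbers a human covers roughly 150 feet before even touching the brake, against roughly 10 feet for the automated system.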
 
  • #78
I voted "yes". I would like very much to vote "no", if there was a single chance that people would eventually start to respect the driving rules and take the right decisions at the right time when driving and so not let the technology take the power off our hands. But as we all know, this is impossible. Emotions are a great thing that we all humans have but - at least in most cases, not when it comes to driving. Talking specifically about my country, I think that there is a whole bunch of things to be done in order for the driving habits to get at a decent level. I'm a motorcyclist and I really have seen many poor decisions on the road for over thirty five years. Of course, I'm not perfect too - nobody is, but I can frankly say that I respect the rules and as I am in a way more vulnerable position comparing to a car, I always try to foresee what the next car's driver or the one behind me is intending to do and a whole lot of other "what if" things. So, I think that self driving cars would be a really important thing but - as has been already mentioned several times in the thread, there's a lot of work to be done in order to make me feel safer. But I believe that this time will definitely come.
 
  • Like
Likes Stavros Kiri
  • #79
QuantumQuest said:
... if there was a single chance that people would eventually start to respect the driving rules and take the right decisions at the right time when driving...
Problem is, it's worse than that.

The skill required for driving will only go up, as speeds increase, traffic gets ever heavier, and more roads are laid.
So, even if drivers suddenly started operating at required levels today, it's a losing battle, as it's a moving target.
It is already exceeding our current abilities, and it's only going to require more with each passing year.

I'm a die-hard driver. So I'm a dinosaur. But I do see a day when traffic will ease, as more and more personal cars are replaced on our roads with a combination of automated cars, bicycles and better mass transit solutions (such as taxi-like services).
 
  • Like
Likes QuantumQuest
  • #80
russ_watters said:
No, I haven't. That's just silly. Why would I choose to do something more boring if there was something less boring and more enjoyable I could do? I don't go thinking to myself: "Hmm...should I watch a movie right now or not turn on the TV and stare at the blank screen?"
I think he* probably means that it is an inclination (tendency) from within us (or something like that): if we are kind of saturated, we find more and more things boring ...

* I am referring to @jack action

(
jack action said:
Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
)
 
Last edited:
  • Like
Likes jack action
  • #81
russ_watters said:
Really. As @Greg Bernhardt said earlier, I can't wait for the time when I don't have to do the mind-numbing task of driving to and from work in traffic for an hour+ a day. I'd much rather be taking a nap, watching TV, reading a book, using PF, etc.
russ_watters said:
... using PF ...
:thumbup::smile:

Plus, time is valuable; driving is good too, but traffic jams etc. are just a waste of time! ...
Autonomous cars are perfect for overlapping time: taking care of tasks and business, as well as science and PF, while driving ... We'll have more PF posts then! ...
 
  • #83
Ivan Samsonov said:
https://phys.org/news/2017-10-self-driving-cars-future-mobility-disabled.html

However driverless might be good for disabled people.
The way humans make mistakes and cause accidents, we all act as if disabled sometimes! ... (e.g. when very tired or exhausted) [Of course humans and humanity keep improving! - we are a good, promising species! ...]
So good for everyone too(?) ...
 
  • #85
Good thing the company isn't Wham-O!
 
  • Like
Likes Greg Bernhardt
  • #86
Sorry if this is OT and a new post could be made, but is safety the only issue one should consider re self-driving cars? No one seems to want to bring up the ugly side that comes with the glamour of "disrupting": the people who end up displaced. What about taxi drivers, bus drivers, etc.?
 
  • #87
It's the truck driver's fault by law, but if a human driver did what the driverless shuttle did, we would call him a brainless deer-in-the-headlights idiot for not listening to the passengers screaming for him to avoid the backing truck.

https://www.huffingtonpost.com/entry/driverless-shuttle-hit-by-truck_us_5a0371bfe4b03deac08af3db
Passengers onboard the vehicle confirmed that the shuttle did stop ― but in the path of the truck.

“The shuttle just stayed still and we were like, ‘Oh my gosh, it’s going to hit us, it’s going to hit us!’ and then, it hit us!” passenger Jenny Wong told KSNV. “And the shuttle didn’t have the ability to move back, either. Like, the shuttle just stayed still.”
 
  • #88
WWGD said:
No one seems to want to bring up the ugly side that comes with the glamour of "disrupting": the people who end up displaced. What about taxi drivers, bus drivers, etc.?
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?
 
  • Like
Likes Stavros Kiri and russ_watters
  • #89
Greg Bernhardt said:
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?

Yes, but now jobs are being disrupted at an ever-increasing rate. This will not be the same scale of job disruption as in the past. I see the potential for a serious crap storm in the near future if steps are not taken.
 
  • #90
WWGD said:
Sorry if this is OT and a new post could be made, but is safety the only issue one should consider re self-driving cars? No one seems to want to bring up the ugly side that comes with the glamour of "disrupting": the people who end up displaced. What about taxi drivers, bus drivers, etc.?
The fact that a taxi driver won't be able to drive a taxi anymore only means he will have more time to find a cure for cancer or increase the efficiency of solar panels.

If a person lives, they must create demand for something. If there is demand, there is work. That is why technology or immigration will never create a job shortage. Ever.

But I'll admit that it can be difficult to change old habits and break a well established routine.
 
  • Like
Likes Stavros Kiri
