Do you feel safer with self-driving cars on the road?

Summary
The discussion centers on the safety perceptions of self-driving cars compared to human drivers. Participants express skepticism about the current capabilities of AI in anticipating complex driving situations, emphasizing that while self-driving cars may statistically reduce accidents, they are not yet widespread enough to enhance overall safety. Concerns are raised about the limitations of sensors and the unpredictability of human behavior, which can lead to accidents that AI may not effectively manage. Some participants look forward to future advancements in self-driving technology, believing that with time, these vehicles could significantly improve road safety. Ultimately, the consensus leans towards cautious optimism, with many agreeing that while self-driving cars may be safer in theory, they do not yet feel comfortable relying on them.

Do you feel safer with self-driving cars on the road?

  • Yes

    Votes: 31 41.3%
  • No

    Votes: 37 49.3%
  • No opinion

    Votes: 7 9.3%

  • Total voters
    75
  • #31
Frenemy90210 said:
About thinking ahead: Humans can recognize drunk people attempting to cross the road; the machine can't.
I think you'd be surprised what machines can be taught to recognize.
 
  • Like
Likes ISamson and Stavros Kiri
  • #32
What seems to be currently happening is an increase in driver assists. Warnings about objects nearby when changing lanes or backing up. Cars that warn a driver and/or automatically apply the brakes to avoid collisions. Smart cruise control that can slow down to a stop and continue (usually resume is needed if actually stopped).

My wife's car has most of these features. One issue is that the lane-change warning can be triggered by construction, such as repaved sections of road in different colors that don't follow the actual lanes.
 
  • #33
Stavros Kiri said:
You've never been a passenger?
When I was a kid, I was one all the time and couldn't grow up fast enough to be in the driver seat.

Stavros Kiri said:
You pilot your own plane too?
If I had to use one, I wish I would pilot it!

Stavros Kiri said:
I like all kinds of travelling. Driving is only part ... for most of us anyway. See my point?
I know that I seem to become less and less part of «most of us». I'm questioning how good it is to live in a society built on the fear of «most of us». If my neighbors think that what I do (or don't do) is unsafe and I don't, should I always have to comply with their fear? I'm more afraid of that than of having a car accident right now.

rcgldr said:
What seems to be currently happening is an increase in driver assists.
That's more acceptable than driverless, IMHO. Although I don't mind people having driverless vehicles if they want one, I just wish that we won't reach a point where it is our only choice.
 
  • Like
Likes atyy and Stavros Kiri
  • #34
Frenemy90210 said:
About thinking ahead: Humans can recognize drunk people attempting to cross the road; the machine can't.

Yes a human is very good at guessing whether it's quite safe to drive 50 km/h past pedestrians standing 1 m from the driveway.

My point is that it's not safe, but humans are doing it all the time. Of course pedestrians are getting killed all the time too.
 
  • #35
jartsa said:
Yes a human is very good at guessing whether it's quite safe to drive 50 km/h past pedestrians standing 1 m from the driveway.

My point is that it's not safe, but humans are doing it all the time. Of course pedestrians are getting killed all the time too.
But I think the standard is: is it safer when humans do it? No method will be completely safe.
 
  • Like
Likes Orodruin
  • #36
jack action said:
Am I the only one who likes driving?
No! I love it too, and I agree with many of the things you're saying (see below), but the main issue is which one is safer, or how we feel. I personally feel safer with technology (if the programming is right), and I rely on it all the time, than with humans (including myself). Humans make mistakes more often than machines err (and there are also programs to predict and fix machine errors) (IMO); they always have and they always will. Our behaviour and efficiency are affected by emotions, mood, and health factors (sometimes unforeseeable ones, e.g. sudden dizziness, a heart attack, etc.); there is also the big issue of subjective judgement. I am not saying these are necessarily bad or negative, but they can become very unsafe at times. That's basically why I voted "Yes" in our poll.
But I see your logic, with which I partially agree, and I liked many of your arguments.
The issue of freedom, initiative and control bothers (concerns) me too, besides the fear issue that you mention in your other post ...
Thus regarding
jack action said:
I know that I seem to become less and less part of «most of us».
Not at all! Don't see it that way. (I don't.) The current poll is well in your favour anyway, as we speak! ...
jack action said:
When I was a kid, I was one all the time and couldn't grow up fast enough to be in the driver seat.
I don't disagree. Driving is creative. But what I meant was that every day we have to rely on many types of machines (cars, buses, trains, boats, ships, planes, etc.) for our transportation and safety, and on other humans too, whom we do not control. In other words, we can't control everything! Driving is the least of it.
And as far as piloting goes, although I too would love to become a pilot one day, right now I am not one, and thus when I fly I am just a
jack action said:
piece of meat that gets to be moved around
(similar in other types of public transportation [buses, trains etc.], especially if you live in a metropolitan area)
jack action said:
Although I don't mind people having driverless vehicles if they want one. I just wish that we won't reach a point where that it is our only choice.
Well put! I agree. That is my fear and concern as well (despite my 'possible future projected post' earlier above [#24]). But I doubt that it will ever happen exactly that way (just like with cell phones, for example - you can avoid having one if you do not wish to, while most people have one ...). However, nobody can foresee the future exactly. Only the people who create it can have a better idea! ...

But in any case the main issue here is about safety and our poll (what we think) ...
 
  • Like
Likes jack action
  • #37
So to add another thing to the discussion: I voted "no opinion", mainly because of how the question was phrased - in the present tense. At the current time, I do not think there are enough self-driving cars to noticeably affect safety. In the future, I would assume that they will not be allowed on the road en masse unless they work at least as well as the average human driver (a low bar, I know), which I think there is a reasonable chance of achieving. This is a matter of regulation - just as it is a matter of regulation which humans we allow to drive on the roads. To be honest, I think any self-driving car model should have to go through significantly harder testing than the driving test you take to get your licence.
 
  • Like
Likes Stavros Kiri
  • #38
I voted yes because, as there are more self-driving cars (which seem to be statistically safer than the average human driver), driving should become safer.
However, I am not interested in using one right now.

However, as @Greg Bernhardt mentioned, the first few versions of self-driving cars would give me mixed feelings, because (as I understand it) they want you to be ready to take over when some weirdness arises that the computer can't handle. This would make me want to stay aware of all the usual driving issues, which would take away what I see as the primary benefit of having a self-driving car: not having to pay attention to driving, so you can take a nap, read something, or whatever (similar to being on an airplane or train).
These different human tendencies will be in conflict until later versions make it less relevant.

A real benefit I see of self-driving cars would be a much greater awareness of things in blind spots to be avoided.

I also like driving (unless I'm sleepy or want to do something else).
I prefer a stick shift, which forces you to be more involved with the functioning of the car. More fun.
This might be lost, but presumably there would be a manual version available for use when desired.

Another issue that we discussed at the Portland meet-up a few weeks ago was what if you wanted to go faster than the speed limit for some reason (almost everyone does this on I-5 in my area).
Would the car let you?
If it did, would it modify its behavior if it saw a cop car ahead (like a person would)?
(Why would the cops let this info out?)
(Who would get the ticket? Presumably the human.)

Turns out, my son already has an app on his phone that tells you (fairly accurately) when you are coming up on cops on the road.
It uses crowd-sourced information from other drivers. We road-tested it going to the eclipse.
However, in my opinion, it takes too much attention for a properly involved driver to use (unless maybe if you can just talk to it).
 
  • #39
UWouldKnow said:
...the only crashes I've been in could have been avoided with a self driving car
While I expect self-driving cars to be safer, we do need to be careful about flawed data analysis when it comes to their safety. The types of accidents people and self-driving cars get into differ in at least some cases, so it is possible that 100% of human-caused accidents would have been avoided while we still have no idea whether the self-driving car is safer, because we don't know what types of accidents the self-driving car is susceptible to until we have data on it. The fatal Tesla accident with the truck is such an example.

Similarly, it's nice the Google cars have apparently been safe, but does their experience really translate? City driving causes a lot of minor accidents but almost no deaths because the speeds are so low. How does a Google car do on a highway at 70mph when suddenly losing a lane marker? Unfortunately, the only way to find out what types of accidents they are susceptible to is for them to get into tough situations and potentially get in accidents.
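To make the point concrete, here is a toy numeric sketch (all rates are made up for illustration, not real accident data): even if an autonomous vehicle avoids 100% of the accident types humans cause, its overall safety still depends on an unknown rate of machine-specific accident types.

```python
# Hypothetical illustration: avoiding every human-style accident tells us
# nothing about overall safety until the machine-specific rate is known.
human_style_rate = 5.0   # hypothetical human-caused accidents per million miles
human_total = human_style_rate  # humans assumed not to fail in machine-specific ways

av_human_style = 0.0  # the AV avoids every human-style accident
for assumed_machine_rate in (1.0, 5.0, 10.0):
    av_total = av_human_style + assumed_machine_rate
    verdict = "safer" if av_total < human_total else "not necessarily safer"
    print(f"assumed machine-specific rate {assumed_machine_rate}/M miles: AV is {verdict}")
```

With a low machine-specific rate the AV wins; with a high one it loses, and the "100% of human accidents avoided" statistic is the same in both cases.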
 
  • #40
russ_watters said:
Unfortunately, the only way to find out what types of accidents they are susceptible to is for them to get into tough situations and potentially get in accidents.
Simulations (Physical or virtual)?
 
  • #41
WWGD said:
Simulations (Physical or virtual)?
Yes, I'm sure they are doing extensive simulations. But the problem is that you only simulate what you know you should simulate. The types of problems I am most worried about are where the car doesn't know it is in trouble and as a result, there is nothing to simulate (or the simulation produces no result). The Tesla crash I mentioned above is such an incident. My understanding is that the car never recognized there was a hazard, which is why it never took action, much less notified the driver that it was unable to deal with the situation. If the driver had taken control and avoided the accident, there likely would have been nothing to flag Tesla that the software had failed to recognize a hazard and that they should work on fixing the software glitch (my understanding is that the cars are constantly collecting data and reporting it back to Tesla to use in such simulations). We can be sure by now they simulated it, because someone died and therefore they had to investigate. But heck, I bet the first few times the engineers ran simulations of the accident after the fact, the computer reported to them that no accident happened. It was a literal and figurative blind spot.

Now, these features are still in development, and I previously criticized Tesla for using their customers as beta-test subjects of something that could kill them. Hopefully by now better controls are in place to avoid that, but I'm still not convinced that this accident could have been avoided by the driver. Most of these types of features have warnings for the driver if the computer loses control, and legalese in their owner's manuals to protect the car company by saying the computer never has final control, but that legalese won't protect the driver. The driver in the Tesla crash would have had to think about whether or not the computer saw the truck and decide, accurately and in time, that it didn't and what action to take. He very well might have been thinking "oh, there's no way it doesn't see this truck" until it was past the point of being able to avoid it. Ultimately, though, we want true driverless cars, so that's a little out of bounds.

What's needed is that the engineers designing these things have had enough time and money to put enough effort into designing the simulations that they've thought of every realistic hazard to throw at the car. And since these systems are almost certainly all proprietary, that's a lot of different companies doing an enormous number of simulations.
 
  • #43
All is well until glitches happen. Systems, no matter how well built, will always have glitches, bugs, or whatever you call them - and those glitches can be the system's fault or caused by external factors.
 
  • Like
Likes ISamson
  • #44
xblaze said:
All is well until glitches happen. Systems, no matter how well built, will always have glitches, bugs, or whatever you call them - and those glitches can be the system's fault or caused by external factors.
This is true for both proposed modes of vehicle operation in this thread. The question is which one will have the fewest and least severe glitches.
 
  • Like
Likes NFuller
  • #45
WWGD said:
But I think the standard is: is it safer when humans do it? No method will be completely safe.

But I think there are some key features of a drunk person that can be programmed into a computer.
 
  • #46
Orodruin said:
This is true for both proposed modes of vehicle operation in this thread. The question is which one will have the fewest and least severe glitches.
With that in mind, we could just hope for the best. There are cons to each mode of operation, as there are pros. Humans err mostly because they act on their emotions; machines, such as the ones at hand, because of the way they're built or programmed.

Here's some additional reading, a company blog I came across when I was looking for companies related to robotics: http://www.powerjackmotion.com/make-way-smart-robots/ (It's Time to Make Way for Smart Robots in Your Industry!). One of the topics there is self-driving, but the discussion is quite introductory, and you have to dig deeper to learn more about the subject.
 
  • #47
A mother and her 11-month-old child were T-boned and killed at an intersection a couple blocks from me yesterday. Bring on autonomous cars ASAP!
 
  • Like
Likes russ_watters and Stavros Kiri
  • #48
Greg Bernhardt said:
A mother and her 11-month-old child were T-boned and killed at an intersection a couple blocks from me yesterday. Bring on autonomous cars ASAP!
It would have been better if their deaths were the result of a driverless machine? Less guilt maybe?
 
  • Like
Likes 256bits
  • #49
jack action said:
It would have been better if their deaths were the result of a driverless machine? Less guilt maybe?
It was a drunk driver doing 70 through a red light on a 35 mph road. Would a machine allow that?
 
  • #50
jack action said:
It would have been better if their deaths were the result of a driverless machine? Less guilt maybe?
The obvious point being made was that it would not have happened if the car was autonomous.
 
  • #51
Orodruin said:
The obvious point being made was that it would not have happened if the car was autonomous.
My point is that people will still die in horrible accidents, autonomous vehicle or not.
 
  • Like
Likes 256bits
  • #52
jack action said:
My point is that people will still die in horrible accidents, autonomous vehicle or not.
Which is a moot point unless you consider the rate at which it happens. Women still die in childbirth today. Does that mean that the medical care in relation to childbirth today is on the same level as 500 years ago?
 
  • Like
Likes StoneTemplePython
  • #53
jack action said:
It would have been better if their deaths were the result of a driverless machine? Less guilt maybe?

I would claim it would be better, since the likelihood of such events would be much less for a decent autonomous vehicle than for a human driven one.
In the case of an autonomous vehicle, such an event would not be a popularized indicator (through a single evocative story that people can relate to) of a much larger body of similar occurrences (which fade into a reduced concern about statistical facts, psychologically speaking) as it would be in the case of the human driver.

Sure it is a terrible thing when such accidents occur, and in that sense they are equivalent.
However, in a larger (more statistical) view of things, they are not equivalent.

It's a trees-and-forest point-of-view thing. Which point of view do you use when making a value judgement?
You can't really use both (perform an analysis based on both points of view and assume they will both lead to the same conclusion).
One's a view at an individual level, the other is a more global point of view.
 
  • Like
Likes 256bits
  • #54
@Greg Bernhardt , @Orodruin , @BillTre :

Greg's post evoked in me a feeling that I expressed before on this forum and I just realized it was in this particular thread, 3 weeks ago, in post #29.

With the excitement Greg had while presenting his opinion, it sounds to me like the best way to make the best decision is not to make one at all and to count on a more «knowledgeable» someone or, in this case, something.

I'm not even debating the fact that machines can make better decisions in a statistical sense. For the sake of argument, let's accept they do.

What are the consequences of a human being not making any decision? What is the point of living? Let's even consider the case of the drunk driver who had an accident. He made a bad decision, true. Why? What brought this person to that point? Is removing this person's entitlement to make decisions helping him or her? Is this person just supposed to say: «I don't have to do anything; anyway, I'm not good enough, let the machine do it»?

I know the subject of this thread is self-driving cars, but it seems nobody wants to make decisions anymore. It's a way to disempower oneself and others. Sure, it looks nice when you look at people making bad decisions. But how are we supposed to learn to differentiate good from bad if we make fewer and fewer decisions? How will we know if the machine made the right decision if we don't even develop our own judgement? Are we going to doubt ourselves all the time? Should AI decide who will run the country? After all, it will probably make a better decision than the average voter, right? When will we end this journey where we say: «People shouldn't be allowed to do that»?

It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right? It works both ways.

I like the concept of machines assisting humans, but I don't like when humans are removed from the decision process. It is a very important one, not only for the action of the moment, but for the development of the individual as well.

So to relate all of this to this thread - Do you feel safer with self-driving cars on the road? - I'm more afraid of people losing their ability to make good decisions in general than of the death toll caused by car accidents due to bad decisions.
 
  • #55
jack action said:
So to relate all of this to this thread - Do you feel safer with self-driving cars on the road? - I'm more afraid of people losing their ability to make good decisions in general than of the death toll caused by car accidents due to bad decisions.
So when I am out late drinking, should I call an Uber or drive home? The good decision is to call the Uber or start up your own autonomous car.

Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right? It works both ways.

Agree to disagree, I think people in general are terrible drivers. Each time I drive, I see people driving crazy and dangerously. I see people blowing through red lights every single day.
 
  • Like
Likes Orodruin
  • #56
If I am not mistaken, even with the one Tesla fatality the death rate was about 1 in 110 million miles driven, versus 1 in 90 million miles for human drivers. As with early product faults generally, reliability typically increases 10- to 20-fold after the failures are identified and accounted for, so I would not be surprised to see the final rate represent better than a 90% reduction. Given this, IMO, they are already safer than human drivers, by a considerable margin.
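A quick back-of-envelope check on the figures quoted above (these are the poster's numbers, not verified data; the 10x maturation factor is the low end of the claimed 10-20x range):

```python
# Fatality-rate comparison using the post's quoted figures (unverified).
autopilot_miles_per_fatality = 110e6   # claimed: 1 death per 110M miles
human_miles_per_fatality = 90e6        # claimed: 1 death per 90M miles

autopilot_rate = 1 / autopilot_miles_per_fatality   # fatalities per mile
human_rate = 1 / human_miles_per_fatality

print(f"Current advantage: ~{human_rate / autopilot_rate:.2f}x the miles per fatality")

# If reliability improves 10x after early faults are fixed (low end of 10-20x):
matured_rate = autopilot_rate / 10
print(f"Reduction vs. human drivers: {100 * (1 - matured_rate / human_rate):.0f}%")
```

On these assumptions the matured rate comes out a bit above a 90% reduction, which is consistent with the claim in the post.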

Not to mention - the technology can be applied most heavily to the highest risk drivers, teens that do not care about driving themselves, fatigued drivers, drunk and the elderly.

The human-intervention model is a fool's errand to make the public feel better. If you are not engaged in the act of driving, there is very little likelihood you can instantly and effectively take over control and be aware of the entire situation - like when your boss calls you out in a meeting for not paying attention, you're screwed. That is just not human nature.

Then, beyond full autonomy, there is the amount of experience the vehicles already have from basic augmentation - the amount of situational learning and the remarkably (to me) few failures need to be considered. The number and types of sensors and the quantity of data being used are dramatically more than a human uses. The learned experience is cumulative and hard-coded in. Humans only learn what they specifically have been taught; we do not get the collective experience of the other drivers.

The vast majority of accidents are not caused by an unusual situation - they are caused by human fallibility: inattentiveness, fatigue, anger, arrogance (thinking you are better than you are), etc. These are exactly the same factors that cause general safety issues. To me, it is about removing the least reliable element.

So clearly I was a Yes.

The more interesting debate is how to deal with the disruption to the general economy.
 
  • Like
Likes BillTre
  • #57
Greg Bernhardt said:
Agree to disagree, I think people in general are terrible drivers. Each time I drive, I see people driving crazy and dangerously. I see people blowing through red lights every single day.
I very much agree with this. Contrary to jack's comment, I believe all humans make bad decisions when driving, and I will go so far as to say that I am sure I make them myself. Not to the extent of something possibly life-threatening every time I am behind the wheel, but they will definitely happen, and if they happen at the wrong moment they may cost me or someone else their life. I do think that I am a good enough driver for the expected number of deaths due to my driving to be significantly smaller than one, but if you have enough people like me driving, statistics will get someone in the end, and that someone's life will be ruined or lost.
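The "statistics will get someone in the end" point can be sketched numerically. All the numbers below are hypothetical, chosen only to show how a tiny per-trip risk compounds across many drivers and trips:

```python
# Toy illustration: a minuscule per-trip risk, aggregated over a population.
p_fatal_per_trip = 1e-6    # hypothetical: 1-in-a-million chance of a fatal
                           # mistake on any given trip by a "good" driver
drivers = 100_000          # hypothetical commuter population
trips_per_year = 500       # roughly two trips per working day

total_trips = drivers * trips_per_year
# Probability that at least one fatal mistake happens somewhere in the year:
p_at_least_one = 1 - (1 - p_fatal_per_trip) ** total_trips
print(f"P(at least one fatality among {drivers} good drivers) = {p_at_least_one:.4f}")
```

With these (made-up) numbers the probability is essentially 1: individually negligible risks become a near certainty at the population level.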

In fact, I do not see any reason except vanity why everyone should feel it a "right" to drive. In cities where public transport is well developed, there is already very little need for every person to be able to drive. When I take the commuter train tomorrow morning, I will be one among a thousand people on that train out of which 999 will not be driving it. What does one more matter in that respect?
 
  • #58
Orodruin said:
I very much agree with this. To the contrary of jack's comment, I believe all humans take bad decisions when driving and I will go so far as saying that I am sure I do them myself. Not to the extent of something possibly life-threatening every time I am behind the wheel, but they will definitely happen and if they happen at the wrong moment they may cost me or someone else their life.
Sure, I am biased and I think I'm a good driver, but I can certainly think of at least a handful of mistakes I've made in the past that, if conditions had been a little different, could have caused a significant accident.
 
  • Like
Likes Stavros Kiri
  • #59
Teleportation is the answer.
Just set the co-ordinates and then arrive where you want to be in a few seconds.
I voted in favor of automated transport systems.
 
  • Like
Likes Stavros Kiri
  • #60
Greg Bernhardt said:
I think people in general are terrible drivers.
This is a very pessimistic view based on irrational fear. I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever - something in the order of 99.99...% or more. To me that most likely means that - statistically - people make good decisions, regardless of one's opinion on «what could've happened». The reality is that it doesn't.

I often hear people say, «He was lucky, it could have been worse.» But to me, it seems the reality is more often «He was unlucky, it usually doesn't end this way.» Looking at life that way gives you a more optimistic (realistic?) view of the world.
Orodruin said:
I believe all humans take bad decisions when driving and I will go so far as saying that I am sure I do them myself.
Again, this little faith in humankind is what fascinates me. It's as if being a human being were some sort of disease that needed to be cured.

You are also introducing that self-doubt I was talking about. Regardless of what seems to be a good driving record for yourself, you seem to still prefer not making decisions, leaving it to a train operator or an AI. What else do you think others can do better than you? Where will you stop? Because I can assure you that there is always someone who can do things better than you, probably even in your field of expertise.

It might sound cliché, but the only way you know there are good decisions is because there are bad decisions. There is no way around it. Wanting to rid the world of bad decisions IS a bad decision.

Again, I'm OK with making better machines. But do it for the right reasons. Do it for the fun of it, not to save the human race from itself.
 
  • Like
Likes Stavros Kiri
