Do you feel safer with self-driving cars on the road?

  • Yes

    Votes: 31 40.8%
  • No

    Votes: 38 50.0%
  • No opinion

    Votes: 7 9.2%

  • Total voters
    76
  • #51
jack action
Science Advisor
Insights Author
Gold Member
1,986
2,162
The obvious point being made was that it would not have happened if the car was autonomous.
My point is that people will still die in horrible accidents, autonomous vehicle or not.
 
  • Like
Likes 256bits
  • #52
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,829
6,643
My point is that people will still die in horrible accidents, autonomous vehicle or not.
Which is a moot point unless you consider the rate at which it happens. Women still die in childbirth today. Does that mean that the medical care in relation to childbirth today is on the same level as 500 years ago?
 
  • Like
Likes StoneTemplePython
  • #53
BillTre
Science Advisor
Gold Member
1,540
3,408
It would have been better if their deaths were the result of a driverless machine? Less guilt maybe?
I would claim it would be better, since the likelihood of such events would be much less for a decent autonomous vehicle than for a human-driven one.
In the case of an autonomous vehicle, such an event would not be a popularized indicator (through a single evocative story that people can relate to) of a much larger body of similar occurrences (which fade into a reduced concern about statistical facts, psychologically speaking) as it would be in the case of the human driver.

Sure it is a terrible thing when such accidents occur, and in that sense they are equivalent.
However, in a larger (more statistical) view of things, they are not equivalent.

It's a trees-and-forest point of view thing. Which point of view do you use when making some value judgement?
You can't really use both (perform an analysis based on both points of view and assume they will both lead to the same conclusion).
One is a view at an individual level; the other is a more global point of view.
 
  • Like
Likes 256bits
  • #54
jack action
Science Advisor
Insights Author
Gold Member
1,986
2,162
@Greg Bernhardt , @Orodruin , @BillTre :

Greg's post evoked in me a feeling that I expressed before on this forum and I just realized it was in this particular thread, 3 weeks ago, in post #29.

With the excitement Greg had while presenting his opinion, it sounds to me like the best way to make the best decision is to not make one at all and count on a more «knowledgeable» someone or, in this case, something.

I'm not even debating the fact that machines can make better decisions in a statistical sense. For the sake of argument, let's accept they do.

What are the consequences of a human being not making any decision? What is the point of living? Let's even consider the case of the drunk driver who had an accident. He made a bad decision, true. Why? What brought this person to that point? Is removing this person's entitlement to make decisions helping him or her? Is this person just supposed to say: «I don't have to do anything, anyway I'm not good enough, let the machine do it»?

I know the subject of this thread is self-driving cars, but it seems nobody wants to make decisions anymore. A way to disempower oneself and others. Sure, it looks nice when you look at people making bad decisions. But how are we supposed to learn how to differentiate good from bad, if we make fewer and fewer decisions? How will we know if the machine made the right decision, if we don't even develop our own judgement? Are we going to doubt ourselves all the time? Should AI decide who will run the country? After all, it will probably make a better decision than the average voter, right? When will we end this journey where we say: «People shouldn't be allowed to do that»?

It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right? It works both ways.

I like the concept of machines assisting humans, but I don't like when humans are removed from the decision process. It is a very important one, not only for the action of the moment, but for the development of the individual as well.

So to relate all of this to this thread - Do you feel safer with self-driving cars on the road? - I'm more afraid of people losing their ability to make good decisions in general than of the death toll caused by car accidents due to bad decisions.
 
  • #55
18,242
7,865
So to relate all of this to this thread - Do you feel safer with self-driving cars on the road? - I'm more afraid of people losing their ability to make good decisions in general than of the death toll caused by car accidents due to bad decisions.
So when I am out late drinking, should I call an Uber or drive home? The good decision is to call an Uber or start up your own autonomous car.

Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right? It works both ways.
Agree to disagree, I think people in general are terrible drivers. Each time I drive, I see people driving crazy and dangerously. I see people blowing through red lights every single day.
 
  • Like
Likes Orodruin
  • #56
1,377
349
If I am not mistaken, even with the one Tesla fatality the death rate was about 1 in 110M miles driven, vs. 1 in 90M miles for human drivers. As an early product fault, reliability typically increases 10- to 20-fold after the failures are identified and accounted for, so I would not be surprised to see the final rate be better than a 90% reduction. Given this, IMO, they are already safer than human drivers, by a considerable margin.
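The arithmetic above can be sketched in a few lines; this takes the quoted figures (1 fatality per 110M miles for the early autonomous fleet, 1 per 90M for human drivers) at face value rather than verifying them:

```python
# Rough sanity check of the rates quoted above (figures as stated in the
# post, not independently verified).
human_rate = 1 / 90e6      # fatalities per mile, human drivers
tesla_rate = 1 / 110e6     # fatalities per mile, early autonomous fleet

reduction_now = 1 - tesla_rate / human_rate
print(f"current reduction vs. human drivers: {reduction_now:.1%}")  # ~18.2%

# If early-product failures are fixed and reliability improves 10- to 20-fold:
for factor in (10, 20):
    matured_rate = tesla_rate / factor
    print(f"{factor}x improvement -> {1 - matured_rate / human_rate:.1%} reduction")
```

Even a 10-fold improvement on the quoted starting rate would already put the reduction above the 90% mark claimed.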

Not to mention, the technology can be applied most heavily to the highest-risk drivers: teens who do not care about driving themselves, fatigued drivers, the drunk, and the elderly.

The human-intervention model is a fool's errand to make the public feel better. If you are not engaged in the act of driving, there is very little likelihood you can instantly and effectively take over control and be aware of the entire situation - like when your boss calls you out in a meeting for not paying attention, you're screwed. That is just not human nature.

Then beyond full autonomy there is the amount of experience the vehicles gain with basic augmentation - the amount of situational learning and the remarkably (to me) small number of failures need to be considered. The number and types of sensors and the quantity of data being used are dramatically more than a human uses. The amount of learned experience is cumulative, and hard-coded in. Humans only learn what they specifically have been taught; we do not get the collective experience of the other drivers.

The vast majority of accidents are not caused by an unusual situation - they are caused by human fallibility: inattentiveness, fatigue, anger, arrogance (thinking you are better than you are), etc. These are exactly the same factors that cause general safety issues. To me, it is about removing the least reliable element.

So clearly I was a Yes.

The more interesting debate/discussion is how to deal with the disruption to the general economy.
 
  • Like
Likes BillTre
  • #57
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,829
6,643
Agree to disagree, I think people in general are terrible drivers. Each time I drive, I see people driving crazy and dangerously. I see people blowing through red lights every single day.
I very much agree with this. Contrary to jack's comment, I believe all humans make bad decisions when driving and I will go so far as saying that I am sure I make them myself. Not to the extent of something possibly life-threatening every time I am behind the wheel, but they will definitely happen, and if they happen at the wrong moment they may cost me or someone else their life. I do think that I am a good enough driver for the expectation value of the number of deaths due to my driving to be significantly smaller than one, but if you have enough people like me driving, statistics will get someone in the end and that someone's life will be ruined or lost.

In fact, I do not see any reason except vanity why everyone should feel it a "right" to drive. In cities where public transport is well developed, there is already very little need for every person to be able to drive. When I take the commuter train tomorrow morning, I will be one among a thousand people on that train out of which 999 will not be driving it. What does one more matter in that respect?
 
  • #58
18,242
7,865
I very much agree with this. Contrary to jack's comment, I believe all humans make bad decisions when driving and I will go so far as saying that I am sure I make them myself. Not to the extent of something possibly life-threatening every time I am behind the wheel, but they will definitely happen, and if they happen at the wrong moment they may cost me or someone else their life.
Sure, I am biased and I think I'm a good driver, but I can certainly think of at least a handful of mistakes I've made in the past that, if conditions were a little different, could have caused a significant accident.
 
  • Like
Likes Stavros Kiri
  • #59
3,379
943
Teleportation is the answer.
Just set the co-ordinates and then arrive where you want to be in a few seconds.
I voted in favor of automated transport systems.
 
  • Like
Likes Stavros Kiri
  • #60
jack action
Science Advisor
Insights Author
Gold Member
1,986
2,162
I think people in general are terrible drivers.
This is a very pessimistic view based on irrational fear. I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.

I often hear people say, «He was lucky, it could have been worse.» But to me, it seems that the reality is more often «He was unlucky, it usually doesn't end this way.» Looking at life that way gives you a more optimistic (realistic?) view of the world.
I believe all humans make bad decisions when driving and I will go so far as saying that I am sure I make them myself.
Again, this little faith in humankind is what fascinates me. It's as if being a human being were some sort of disease that needed to be cured.

You are also introducing that self-doubt I was talking about. Regardless of what seems to be a good driving record for yourself, you seem to still prefer not making decisions, leaving it to a train operator or AI. What else do you think others can do better than you? Where will you stop? Because I can assure you that there is always someone who can do things better than you, probably even in your field of expertise.

It might sound cliché, but the only way you know there are good decisions is because there are bad decisions. There is no way around it. Wanting to rid the world of bad decisions IS a bad decision.

Again, I'm OK with making better machines. But do it for the right reasons. Do it for the fun of it, not to save the human race from itself.
 
  • Like
Likes Stavros Kiri
  • #61
Spinnor
Gold Member
2,150
354
Something in the order of 99.99...% and more.
More than 30,000 deaths and probably hundreds of thousands of injuries last year.

[Chart: Motor vehicle deaths in the US]
That is a lot of pain and suffering.
 
  • Like
Likes ISamson, russ_watters, Stavros Kiri and 1 other person
  • #62
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,829
6,643
To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.
That is an illusion based on most potentially fatal mistakes actually not being fatal. That you arrive safely is correlated with, but not equivalent to, you driving safely and making good decisions. For example, failing to pay proper attention when driving across a seldom-crossed zebra crossing. This is a mistake that is going to go by completely unnoticed until it doesn't. It does not make the mistake any less of a mistake.

You are also introducing that self-doubt I was talking about. Regardless of what seems to be a good driving record for yourself, you seem to still prefer not making decisions, leaving it to a train operator or AI. What else do you think others can do better than you? Where will you stop? Because I can assure you that there is always someone who can do things better than you, probably even in your field of expertise.
What you describe I would describe as severe hubris. There is also a clear difference in drawing lines. It makes sense for me to continue doing what I do because I add (at least in some part) to the research in my field. Are there people better than me? Sure, but they cannot do everything themselves. This is clearly not the case with autonomous vehicles. Or with cars in general - you do not need more than one driver (or zero in the case of the autonomous car). More drivers will not make the car safer or accomplish its task better.

When it comes to governing, I believe the last few years have clearly shown that a main flaw in democracy is that people are easy to influence with feelings and emotions based on false or invented facts. If it were sufficiently advanced and benevolent, I would be prepared to hand government over to an AI.

You are talking about people handing over their decisions to a machine, removing their own decisions, but the truth is that many people already hand many of their decisions over to others. The only difference being that these others happen to be other humans (most of the time).

It might sound cliché, but the only way you know there are good decisions is because there are bad decisions.
I think this is nonsense, to be honest. The way that you weigh decisions is by predicting and weighing outcomes against each other. You do not need to do it to know that hitting on 20 when the dealer shows a 6 is a bad decision.
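The blackjack example can be checked from the deck composition alone - a minimal sketch, simplified to a fresh 52-card deck and ignoring the cards already dealt:

```python
# Holding 20, the only draw that does not bust is an ace (counted as 1),
# so "hit on 20" can be judged from the draw probabilities alone, without
# ever trying it at a table.
# Ranks: A=1, 2..10, plus three more ten-valued ranks (J, Q, K); 4 suits each.
ranks = list(range(1, 11)) + [10, 10, 10]
p_no_bust = sum(4 for r in ranks if 20 + r <= 21) / 52
print(f"chance a hit on 20 does not bust: {p_no_bust:.1%}")  # ~7.7% (only the ace)
```

With roughly a 92% chance of busting a hand that is already very likely to win, the outcome weighing settles the question without any experiment.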

But do it for the right reasons.
I would argue that saving human lives is a good reason. In particular if the only cost is relieving humans of monotonous, error-prone decisions. The car is not deciding where you should go. It is removing a monotonous task that most people's brains struggle with.
 
  • Like
Likes russ_watters and Stavros Kiri
  • #63
3,379
943
Elevators/lifts are generally considered to be better than walking up a ten-story building.
 
  • #64
791
602
It was a drunk driver going 70 through a red in a 35mph road. Would a machine allow that?
Only if it malfunctioned. Could it happen? Yes (and it will happen), but people "malfunction" a lot more often! ... [if not every day, all the time - some people, at least]
Stop sign and red light violations are almost more common than the non-violations! Then there is speeding and drunk driving ...
[I had some good video links, showing statistics - if I can find them]

Machines normally wouldn't do any of that. The code is explicit.
[That alone drops down the chances for accidents at least by 90%, I think ...]
 
  • Like
Likes Orodruin, ISamson and rootone
  • #65
3,379
943
Code malfunctions can be corrected; drunk drivers, not so easily.
 
  • Like
Likes ISamson and Stavros Kiri
  • #66
791
602
This is a very pessimistic view based on irrational fear. I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.
You have a good point there, and in that whole post of yours! I think it's a very wise post, although I vote for self-driving cars, for various reasons.
But I think one has to also see the statistics for accidents that do happen (on a given day) versus the number of mistakes and violations (huge, on that same day) that did or did not cause an accident. You in fact want to eliminate all of that, regardless of the 99.99...% that you perhaps correctly refer to. And machines almost do that.
 
  • #67
russ_watters
Mentor
19,781
6,178
@Greg Bernhardt , @Orodruin , @BillTre :

Greg's post evoked in me a feeling that I expressed before on this forum and I just realized it was in this particular thread, 3 weeks ago, in post #29.

What are the consequences of a human being not making any decision? What is the point of living?

I know the subject of this thread is self-driving cars, but it seems nobody wants to make decisions anymore. A way to disempower oneself and others. Sure, it looks nice when you look at people making bad decisions. But how are we supposed to learn how to differentiate good from bad, if we make fewer and fewer decisions? How will we know if the machine made the right decision, if we don't even develop our own judgement? Are we going to doubt ourselves all the time? Should AI decide who will run the country? After all, it will probably make a better decision than the average voter, right? When will we end this journey where we say: «People shouldn't be allowed to do that»?
I really think you are overthinking this. Mostly what we want machines to do for us are the things that are too hard or boring or dangerous. It doesn't get in the way of us living, it frees us to do the living we really want to do.
It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right?
No, actually it really is a very significant cause of death for humans in developed countries like the US. It's higher than 1% overall and depending on your demographic, can be very much higher than 1%.
https://www.cdc.gov/injury/wisqars/overview/key_data.html

Cancer and heart disease are far and away the most significant risks of death, but since they almost exclusively happen to old people, for every other age group except newborns, "unintentional injury" is the leading cause of death, with car accidents making up the largest fraction of that (from above link).
https://www.cdc.gov/injury/images/lc-charts/leading_causes_of_death_age_group_2015_1050w740h.gif

However, narrowly there is a potential salient point here:
I'm more afraid of people losing their ability to make good decisions in general than of the death toll caused by car accidents due to bad decisions.
This is indeed a potential downside and does happen due to too much reliance on automation. Many plane crashes (example: Air France 447) happen because over-reliance on automation causes pilots to lose their skills or misperceive what the computers are telling them. At the same time, one can imagine the increase in self-driving cars correlating with an increase in alcohol abuse and alcoholism, since removing the need to drive home removes one incentive to behave responsibly. These unintended consequences may be hard to identify, but that's largely because they are much less common/significant than the primary consequence (the increased safety). So whereas automation failures cause a much more significant fraction of plane crashes today than they used to, overall there are far fewer plane crashes and fewer resulting deaths. The same positive trade-off will almost certainly be true of self-driving cars.
I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened».
This is an improper way to look at the statistics: you are ignoring how often you play the game. Your chances of winning the lottery might be 1 in a million, but if you buy half a million lottery tickets, your chances of winning are 50%. In other words, your individual odds of dying on any particular car ride are very small, but you take a lot of car rides, so your annual or lifetime risk is fairly significant.
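That compounding over repeated trials can be sketched in a few lines; the per-trip probability below is purely illustrative, not a measured figure:

```python
# A tiny per-trip risk compounds over repeated trips: the chance of at least
# one bad outcome in n independent trips is 1 - (1 - p)^n.
p_per_trip = 1e-6            # hypothetical risk of a serious crash per trip
trips_per_year = 2 * 250     # two commute legs, ~250 working days a year

for years in (1, 10, 40):
    n = trips_per_year * years
    risk = 1 - (1 - p_per_trip) ** n
    print(f"{years:>2} years ({n:>5} trips): {risk:.3%}")
```

Under these illustrative numbers, a one-in-a-million trip becomes roughly a 2% lifetime chance over 40 years of commuting - small per ride, far from negligible in aggregate.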
It might sound cliché, but the only way you know there are good decisions is because there are bad decisions.
Well that's just silly. You don't need to be a genius to know that running a red light is dumb/dangerous and you don't need to test it either. I don't need to actually [chooses random object in field of view] pull a curtain rod off my wall and stab myself with it to know that would be a dumb thing to do. Humans are plenty smart enough to weigh decisions they have never taken.
 
Last edited:
  • Like
Likes Stavros Kiri
  • #68
jack action
Science Advisor
Insights Author
Gold Member
1,986
2,162
That is an illusion based on most potentially fatal mistakes actually not being fatal. That you arrive safely is correlated, but not equivalent to you driving safely and taking good decisions. For example, failing to pay proper attention when driving across a seldomly crossed zebra crossing. This is a mistake that is going to go by completely unnoticed until it doesn’t. It does not make the mistake any less of a mistake.
I live in a city where there is a zoo. About 50 years ago, a lion escaped and spread terror to the point where it was shot to death. There is still a zoo today, they still have lions. Is it a mistake on my part to not check for lions on my porch before getting out of my house?

Making decisions is all about probability and, yes, not paying attention when driving across a seldom-crossed zebra crossing is NOT a mistake, from my point of view. The proof lies in the results. But the chances are always there and the hit is inevitable given time. That is why I say about that accident, «That person was unlucky, it usually doesn't happen», and not to the thousands of other people who did not have an accident, «You were lucky, you could've hit a zebra!» I can assure you that this is how AI would make its decisions as well.
When it comes to governing, I believe the last few years have clearly shown that a main flaw in democracy is that people are easy to influence with feelings and emotions based on false or invented facts. If it were sufficiently advanced and benevolent, I would be prepared to hand government over to an AI.
That is scary. The solution to that problem is to raise people who can make better decisions, not to replace them with machines. Is your solution to poorly educated people to replace them with machines that have better knowledge? Humans are NOT a lost cause. Otherwise there is no point keeping humans alive.
You are talking about people handing over their decisions to a machine, removing their own decisions, but the truth is that many people already hand many of their decisions over to others. The only difference being that these others happen to be other humans (most of the time).
That is my point: we are on a dangerous path. One where the common man is seen as an unfit animal, unable to care for itself. I don't believe that. I always felt that we should move towards having more people able to make decisions in all aspects of their lives and thus contributing to society in general, not just waiting for someone (or something) else to decide. That is what democracy is.
The way that you weigh decisions is by predicting and weighing outcomes against each other. You do not need to do it to know that hitting on 20 when the dealer shows a 6 is a bad decision.
Making decisions is often way more complicated than that. Ask the people of Florida whether they should evacuate when a hurricane is announced. Not an easy decision to make. How many times are you going to evacuate the entire state «for nothing» before you won't? And when you won't do it, it may be the time you should have. Welcome to life. Can AI do better? I don't think so. The way nature does it is through diversity: some go, some stay, at least one group survives. The «good» decision is unpredictable.
I would argue that saving human lives is a good reason.
No lives are ever saved. The best you can do is extend one. In the greater scheme of things, I still fail to see what improvement it brings to a form of life, the human race or any other. I guess it is these «feelings and emotions» that you were talking about that influence you. I wonder if you would appreciate a machine making decisions for you with that cold and objective attitude? After all, I'm a human being and I already have those thoughts. You'd better hope I won't be the programmer behind the next generation of AI.
Mostly what we want machines to do for us are the things that are too hard or boring or dangerous. It doesn't get in the way of us living, it frees us to do the living we really want to do.
Oh! I love this one! What is it we really want to do? What if what I really want to do is driving? Will I be allowed or will I be forbidden to do it because it is considered too dangerous by many who are too afraid I will kill them? Will I be considered abnormal for thinking driving is fun and not boring? What am I supposed to really want to do?

Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
 
  • #69
DaveC426913
Gold Member
18,895
2,399
But heck, I bet the first few times the engineers ran simulations of the accident after the fact, the computer reported to them that no accident happened.
Star Trek - Season 4 Ep 5 "Remember Me"

Tesla, what was that noise?
Explosive decompression of passenger compartment and passenger.
Cause?
Design Flaw: there appears to be no structure above the mirrors.
 
  • Like
Likes Stavros Kiri and russ_watters
  • #70
russ_watters
Mentor
19,781
6,178
Oh! I love this one! What is it we really want to do? What if what I really want to do is driving?
Is that true or are you just asking for the sake of argument? We're not discussing an abstract hypothetical here, we're discussing a likely near-term reality about a daily task people do that most would rather not. Really. As @Greg Bernhardt said earlier, I can't wait for the time when I don't have to do the mind-numbing task of driving to and from work in traffic for an hour+ a day. I'd much rather be taking a nap, watching TV, reading a book, using PF, etc.
Will I be allowed or will I be forbidden to do it because it is considered too dangerous by many who are too afraid I will kill them?
1. There is a wide gulf between "possible" and "mandatory". For a very long time - perhaps until after we're both dead and perhaps forever - self-driving will be a feature we can choose to buy and use or not. Like a washing machine. I would bet money that most who can afford it will choose to buy it (and get rewarded with lower insurance rates). So I don't think the scenario you present (of self-driving being mandatory) is a reflection of any potential reality that is on the table.
2. I like driving too. Sometimes. In certain contexts. So perhaps instead of commuting to work (boring, stressful), I'll go to a test track on weekends and scratch the itch that way.
Will I be considered abnormal for thinking driving is fun and not boring? What am I supposed to really want to do?
That's meaningless and irrelevant here. Mature, free-thinking adults don't have to ask if what they want to do is considered "abnormal". The question is simply what you and most people want.
Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
No, I haven't. That's just silly. Why would I choose to do something more boring if there were something less boring and more enjoyable I could do? I don't go thinking to myself: "Hmm... should I watch a movie right now, or not turn on the TV and stare at the blank screen?"
 
  • Like
Likes Orodruin
  • #71
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,829
6,643
Making decisions is all about probability and, yes, not paying attention when driving across a seldom-crossed zebra crossing is NOT a mistake, from my point of view.
This is just silly. Lions do not cause tens of thousands of deaths in the US every year. Besides, regardless of whether you consider it a mistake or not, it is actions like that that cause accidents, and autonomous cars will not make them to the same extent as human drivers. (Add to that the limited cognitive abilities of humans, such as only being able to see in one direction at a time.)

No lives are ever saved. The best you can do is extend one. On the greater scheme of things, I still fail to see what improvement it does to a form of life, human race or any other.
Cute. Let's stop using medicine and surgery, it does not save lives anyway. Would you accept your doctor saying, "this cyst will kill you if left untreated, but you will die from something else otherwise, so we won't treat it"? Now who is the cynic here?
 
  • Like
Likes ISamson
  • #72
256bits
Gold Member
3,213
1,226
It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right?
That's the problem, isn't it?
 
  • #73
OCR
873
732
A mother and her 11 month old child were t-boned and killed at an intersection a couple blocks from me yesterday.
The obvious point being made was that it would not have happened if the car was autonomous.
The obvious point being made is not obvious at all...
Unless what you actually meant was...

It would not have happened if the mother's car was autonomous, and could have recognized the fact...
It was a drunk driver going 70 through a red in a 35mph road.
... and would have stopped her car, before the intersection collision.

Yes...
Would a machine allow that?
Both... a drunk driver going 70 through a red in a 35mph road, and stopping another car before an intersection collision, even if a light were green...

Everybody seems to be obsessed with the drunk driver here, but remember... it's a "two-way street".
 
  • Like
Likes Greg Bernhardt
  • #74
OCR
873
732
Brilliant... . :thumbup:
Star Trek - Season 4 Ep 5 "Remember Me"

Tesla, what was that noise?
Explosive decompression of passenger compartment and passenger.
Cause?
Design Flaw: there appears to be no structure above the mirrors.
 
  • Like
Likes Stavros Kiri
  • #75
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,829
6,643
Everybody seems to be obsessed with the drunk driver here, but remember... it's a "two-way street".
I believe you are wrong. It would have been sufficient for the drunk driver's car to be autonomous; replacing that car alone would have avoided the accident. Of course, if you replace all cars with autonomous ones, the mother's would have been too.

What would happen if you just replaced the mother's car is a different question. Would the car be able to avoid the collision? I believe the answer is that it would be more likely to avoid the collision than a human driver. Autonomous cars are taught to recognise and act on threats that are out of the ordinary (such as a driver running a red light), as well as obeying traffic rules themselves. The advantage in favour of the autonomous car comes from its ability to perceive the entire traffic situation, not just what it happens to be looking at.
 
  • Like
Likes Stavros Kiri and russ_watters
