Do you feel safer with self-driving cars on the road?

AI Thread Summary
The discussion centers on the safety perceptions of self-driving cars compared to human drivers. Participants express skepticism about the current capabilities of AI in anticipating complex driving situations, emphasizing that while self-driving cars may statistically reduce accidents, they are not yet widespread enough to enhance overall safety. Concerns are raised about the limitations of sensors and the unpredictability of human behavior, which can lead to accidents that AI may not effectively manage. Some participants look forward to future advancements in self-driving technology, believing that with time, these vehicles could significantly improve road safety. Ultimately, the consensus leans towards cautious optimism, with many agreeing that while self-driving cars may be safer in theory, they do not yet feel comfortable relying on them.

Do you feel safer with self-driving cars on the road?

  • Yes: 31 votes (41.3%)
  • No: 37 votes (49.3%)
  • No opinion: 7 votes (9.3%)
  • Total voters: 75
  • #51
Orodruin said:
The obvious point being made was that it would not have happened if the car was autonomous.
My point is that people will still die in horrible accidents, autonomous vehicle or not.
 
  • Like
Likes 256bits
  • #52
jack action said:
My point is that people will still die in horrible accidents, autonomous vehicle or not.
Which is a moot point unless you consider the rate at which it happens. Women still die in childbirth today. Does that mean that the medical care in relation to childbirth today is on the same level as 500 years ago?
 
  • Like
Likes StoneTemplePython
  • #53
jack action said:
It would have been better if their deaths were the result of a driverless machine? Less guilt maybe?

I would claim it would be better, since the likelihood of such events would be much less for a decent autonomous vehicle than for a human driven one.
In the case of an autonomous vehicle, such an event would not become a popularized indicator (a single evocative story that people can relate to) of a much larger body of similar occurrences (which, psychologically speaking, fade into a reduced concern about statistical facts), as it would in the case of a human driver.

Sure it is a terrible thing when such accidents occur, and in that sense they are equivalent.
However, in a larger (more statistical) view of things, they are not equivalent.

It's a trees-and-forest point of view thing. Which point of view do you use when making some value judgement?
You can't really use both (perform an analysis based on both points of view and assume they will both lead to the same conclusion).
One's a view at an individual level, the other is a more global point of view.
 
  • Like
Likes 256bits
  • #54
@Greg Bernhardt , @Orodruin , @BillTre :

Greg's post evoked in me a feeling that I expressed before on this forum and I just realized it was in this particular thread, 3 weeks ago, in post #29.

With the excitement Greg had while presenting his opinion, it sounds to me like the best way to make the best decision is to not make one at all and count on a more «knowledgeable» someone or, in this case, something.

I'm not even debating the fact that machines can make better decisions in a statistical sense. For the sake of argument, let's accept they do.

What are the consequences of a human being not making any decision? What is the point of living? Let's even consider the case of the drunk driver who had an accident. He made a bad decision, true. Why? What brought this person to that point? Is removing this person's entitlement to make decisions helping him or her? Is this person just supposed to say: «I don't have to do anything, anyway I'm not good enough, let the machine do it»?

I know the subject of this thread is self-driving cars, but it seems nobody wants to make decisions anymore. A way to disempower oneself and others. Sure, it looks nice when you look at people making bad decisions. But how are we supposed to learn how to differentiate good from bad, if we make fewer and fewer decisions? How will we know if the machine made the right decision, if we don't even develop our own judgement? Are we going to doubt ourselves all the time? Should AI decide who will run the country? After all, it will probably make a better decision than the average voter, right? When will we end this journey where we say: «People shouldn't be allowed to do that»?

It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right? It works both ways.

I like the concept of machines assisting humans, but I don't like when humans are removed from the decision process. It is a very important one, not only for the action of the moment, but for the development of the individual as well.

So to relate all of this to this thread - Do you feel safer with self-driving cars on the road? - I'm more afraid of people losing their ability to make good decisions in general than the death toll caused by car accidents due to bad decisions.
 
  • #55
jack action said:
So to relate all of this to this thread - Do you feel safer with self-driving cars on the road? - I'm more afraid of people losing their ability to make good decisions in general than the death toll caused by car accidents due to bad decisions.
So when I am out late drinking, should I call an Uber or drive home? The good decision is to call an Uber or start up your own autonomous car.

Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right? It works both ways.

Agree to disagree, I think people in general are terrible drivers. Each time I drive, I see people driving crazy and dangerously. I see people blowing through red lights every single day.
 
  • Like
Likes Orodruin
  • #56
If I am not mistaken, even with the one Tesla fatality, the death rate was about 1 in 110M miles driven, vs. 1 in 90M miles for human drivers. Since early product faults are typically identified and accounted for, reliability often increases 10 to 20 fold, so I would not be surprised to see the final rate be better than a 90% reduction. Given this, IMO, they are already safer than human drivers, by a considerable margin.
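As a back-of-envelope check, here is a quick sketch of what those quoted figures imply (the per-mile rates and the 10-20x maturation factor are taken from the post above, not verified statistics):

```python
# Per-mile fatality rates quoted above (rough early figures, unverified)
human_rate = 1 / 90e6       # fatalities per mile, human drivers
autopilot_rate = 1 / 110e6  # fatalities per mile, early Autopilot figure

# Relative risk reduction implied by the figures as quoted
reduction = 1 - autopilot_rate / human_rate
print(f"implied reduction so far: {reduction:.1%}")  # ~18.2%

# The post speculates a further 10-20x reliability gain as early faults
# are fixed; at the conservative 10x end the rate would be:
matured_rate = autopilot_rate / 10
print(f"matured: 1 fatality per {1 / matured_rate:,.0f} miles")
```

Taken at face value, the quoted figures imply only about an 18% reduction so far; the 90%+ claim rests entirely on the assumed maturation factor.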

Not to mention, the technology can be applied most heavily to the highest-risk drivers: teens who do not care about driving themselves, fatigued drivers, the drunk, and the elderly.

The human-intervention model is a fool's errand meant to make the public feel better. If you are not engaged in the act of driving, there is very little likelihood you can instantly and effectively take over control and be aware of the entire situation - like when your boss calls you out in a meeting for not paying attention, you're screwed. It is just not human nature.

Then beyond full autonomy, there is the amount of experience the vehicles have with basic augmentation - the amount of situational learning, and the remarkably (to me) few failures, need to be considered. The number of sensors, their types, and the quantity of data being used are dramatically more than a human uses. The amount of learned experience is cumulative, and hard-coded in. Humans only learn what they specifically have been taught; we do not get the collective experience of the other drivers.

The vast majority of accidents are not caused by an unusual situation - they are caused by human fallibility: inattentiveness, fatigue, anger, arrogance (thinking you are better than you are), etc. These are exactly the same factors that cause general safety issues; to me, it is about removing the least reliable element.

So clearly I was a Yes.

The more interesting debate is how to deal with the disruption to the general economy.
 
  • Like
Likes BillTre
  • #57
Greg Bernhardt said:
Agree to disagree, I think people in general are terrible drivers. Each time I drive, I see people driving crazy and dangerously. I see people blowing through red lights every single day.
I very much agree with this. To the contrary of jack's comment, I believe all humans take bad decisions when driving and I will go so far as saying that I am sure I do them myself. Not to the extent of something possibly life-threatening every time I am behind the wheel, but they will definitely happen and if they happen at the wrong moment they may cost me or someone else their life. I do think that I am a good enough driver for the expected number of deaths due to my driving to be significantly smaller than one, but if you have enough people like me driving, statistics will get someone in the end and that someone's life will be ruined or lost.

In fact, I do not see any reason except vanity why everyone should feel it a "right" to drive. In cities where public transport is well developed, there is already very little need for every person to be able to drive. When I take the commuter train tomorrow morning, I will be one among a thousand people on that train out of which 999 will not be driving it. What does one more matter in that respect?
 
  • #58
Orodruin said:
I very much agree with this. To the contrary of jack's comment, I believe all humans take bad decisions when driving and I will go so far as saying that I am sure I do them myself. Not to the extent of something possibly life-threatening every time I am behind the wheel, but they will definitely happen and if they happen at the wrong moment they may cost me or someone else their life.
Sure, I am biased and I think I'm a good driver, but certainly I can think of at least a handful of mistakes I've made in the past that, if conditions were a little different, could have caused a significant accident.
 
  • Like
Likes Stavros Kiri
  • #59
Teleportation is the answer.
Just set the co-ordinates and then arrive where you want to be in a few seconds.
I voted in favor of automated transport systems.
 
  • Like
Likes Stavros Kiri
  • #60
Greg Bernhardt said:
I think people in general are terrible drivers.
This is a very pessimistic view based on irrational fear. I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.

I often hear people say «He was lucky, it could have been worse.» But to me, it seems that the reality is more often «He was unlucky, it usually doesn't end this way.» Looking at life that way gives you a more optimistic (realistic?) view of the world.
Orodruin said:
I believe all humans take bad decisions when driving and I will go so far as saying that I am sure I do them myself.
Again, this little faith in humankind is what fascinates me. It's as if being a human being were some sort of disease that needed to be cured.

You are also introducing that self-doubt I was talking about. Regardless of what seems to be a good driving record for yourself, you seem to still prefer not making decisions, leaving it to a train operator or AI. What else do you think others can do better than you? Where will you stop? Because I can assure you that there is always someone that can do things better than you, probably even in your field of expertise.

It might sound cliché, but the only way you know there are good decisions is because there are bad decisions. There is no way around it. Wanting to eradicate the world of bad decisions IS a bad decision.

Again, I'm OK with making better machines. But do it for the right reasons. Do it for the fun of it, not to save the human race from itself.
 
  • Like
Likes Stavros Kiri
  • #61
jack action said:
Something in the order of 99.99...% and more.

More than 30,000 deaths and probably hundreds of thousands of injuries last year.

[Chart: Motor vehicle deaths in the US]
That is a lot of pain and suffering.
 
  • Like
Likes ISamson, russ_watters, Stavros Kiri and 1 other person
  • #62
jack action said:
To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.
That is an illusion based on most potentially fatal mistakes actually not being fatal. That you arrive safely is correlated with, but not equivalent to, driving safely and making good decisions. For example, failing to pay proper attention when driving across a seldom-used zebra crossing. This is a mistake that is going to go by completely unnoticed until it doesn't. It does not make the mistake any less of a mistake.

jack action said:
You are also introducing that self-doubt I was talking about. Regardless of what seems to be a good driving record for yourself, you seem to still prefer not making decisions, leaving it to a train operator or AI. What else do you think others can do better than you? Where will you stop? Because I can assure you that there is always someone that can do things better than you, probably even in your field of expertise.
What you describe I would describe as severe hubris. There is also a clear difference in drawing lines. It makes sense for me to continue doing what I do because I add (at least in some part) to the research in my field. Are there people better than me? Sure, but they cannot do everything themselves. This is clearly not the case with autonomous vehicles. Or with cars in general - you do not need more than one driver (or zero in the case of the autonomous car). More drivers will not make the car safer or accomplish its task better.

When it comes to governing, I believe the last few years have clearly shown that a main flaw in democracy is that people are easy to influence with feelings and emotions based on false or invented facts. If sufficiently advanced and benevolent, I would be prepared to hand government over to an AI.

You are talking about people handing over their decisions to a machine, removing their own decisions, but the truth is that many people already hand many of their decisions over to others. The only difference being that these others happen to be other humans (most of the time).

jack action said:
It might sound cliché, but the only way you know there are good decisions is because there are bad decisions.
I think this is nonsense to be honest. The way that you weight decisions is by predicting and weighting outcomes against each other. You do not need to do it to know that hitting on 20 when the dealer shows a 6 is a bad decision.

jack action said:
But do it for the right reasons.
I would argue that saving human lives is a good reason. In particular if it only comes at the expense of humans taking monotonous decisions prone to error. The car is not deciding where you should go. It is removing a monotonous task that most people's brains struggle with.
 
  • Like
Likes russ_watters and Stavros Kiri
  • #63
Elevators / lifts, generally are considered to be better than walking up a ten story building.
 
  • #64
Greg Bernhardt said:
It was a drunk driver going 70 through a red in a 35mph road. Would a machine allow that?
Only if it malfunctioned. Could it happen? Yes (and it will happen), but people "malfunction" a lot more often! ... [if not every day, all the time - some people, at least]
Stop-sign and red-light violations are almost as common as the non-violations! Then there is speeding and drunk driving ...
[I had some good video links, showing statistics - if I can find them]

Machines normally wouldn't do any of that. Code is explicit.
[That alone drops down the chances for accidents at least by 90%, I think ...]
 
  • Like
Likes Orodruin, ISamson and rootone
  • #65
Code malfunctions can be corrected; drunk drivers, not so easily.
 
  • Like
Likes ISamson and Stavros Kiri
  • #66
jack action said:
This is a very pessimistic view based on irrational fear. I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened». The reality is that it doesn't.
You have a good point there, and in that whole post of yours! I think it's a very wise post, although I vote for self-driving cars, for various reasons.
But I think one also has to look at the statistics for the accidents that do happen on a given day versus the number of mistakes and violations (huge!) on that same day that did or did not cause an accident. You in fact want to eliminate all of that, regardless of the 99.99...% that you perhaps correctly refer to. And machines almost do that.
 
  • #67
jack action said:
@Greg Bernhardt , @Orodruin , @BillTre :

Greg's post evoked in me a feeling that I expressed before on this forum and I just realized it was in this particular thread, 3 weeks ago, in post #29.

What are the consequences of a human being not making any decision? What is the point of living?

I know the subject of this thread is self-driving cars, but it seems nobody wants to make decisions anymore. A way to disempower oneself and others. Sure, it looks nice when you look at people making bad decisions. But how are we supposed to learn how to differentiate good from bad, if we make fewer and fewer decisions? How will we know if the machine made the right decision, if we don't even develop our own judgement? Are we going to doubt ourselves all the time? Should AI decide who will run the country? After all, it will probably make a better decision than the average voter, right? When will we end this journey where we say: «People shouldn't be allowed to do that»?
I really think you are overthinking this. Mostly what we want machines to do for us are the things that are too hard or boring or dangerous. It doesn't get in the way of us living, it frees us to do the living we really want to do.
It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right?
No, actually it really is a very significant cause of death for humans in developed countries like the US. It's higher than 1% overall and depending on your demographic, can be very much higher than 1%.
https://www.cdc.gov/injury/wisqars/overview/key_data.html

Cancer and heart disease are far and away the most significant risks of death, but since they almost exclusively happen to old people, for every other age group except newborns, "unintentional injury" is the leading cause of death, with car accidents making up the largest fraction of that (from above link).
https://www.cdc.gov/injury/images/lc-charts/leading_causes_of_death_age_group_2015_1050w740h.gif

However, narrowly there is a potential salient point here:
I'm more afraid of people losing their ability to make good decisions in general than the death toll caused by car accidents due to bad decisions.
This is indeed a potential downside and does happen due to too much reliance on automation. Many plane crashes (example: Air France 447) happen because over-reliance on automation causes pilots to lose their skills or mis-perceive what the computers are telling them. At the same time, one can imagine the increase in self-driving cars correlating with an increase in alcohol abuse and alcoholism, since removing the need to drive home removes one incentive to behave responsibly. These unintended consequences may be hard to identify, but that's largely because they are much less common/significant than the primary consequence (the increased safety). So whereas automation failures today cause a much more significant fraction of plane crashes than they used to, overall there are far fewer plane crashes and fewer resulting deaths. The same positive trade-off will almost certainly be true of self-driving cars.
I'm pretty sure that most people who take the road on a single day arrive at their destination without any bad event whatsoever. Something in the order of 99.99...% and more. To me that most likely means that - statistically - people make good decisions regardless of one's opinion on «what could've happened».
This is an improper way to look at the statistics: you are ignoring how often you play the game. Your chances of winning the lottery might be 1 in a million, but if you buy half a million lottery tickets, your chance of winning at least once is close to 40%. In other words, your individual odds of dying on any particular car ride are very small, but you take a lot of car rides, so your annual or lifetime risk is fairly significant.
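The compounding is easy to make concrete. A minimal sketch, where the per-trip risk and trip counts are purely illustrative assumptions, not real crash statistics:

```python
# Small per-trip risk compounding over many trips ("how often you play")
per_trip_risk = 1e-6   # assumed chance of a serious event on any one trip

# Half a million plays of the game, as in the lottery analogy:
# P(at least one event) = 1 - P(no event on every single trip)
p = 1 - (1 - per_trip_risk) ** 500_000
print(f"{p:.1%}")  # ~39.3%: not quite 1/2, but far from negligible

# A driving-scale example: two trips a day for 50 years
trips = 2 * 365 * 50
p_lifetime = 1 - (1 - per_trip_risk) ** trips
print(f"lifetime: {p_lifetime:.2%}")  # ~3.6% with these assumed numbers
```

The exact percentages depend entirely on the assumed per-trip risk; the point is only that a tiny per-trip number multiplied over a lifetime of trips stops being tiny.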
It might sound cliché, but the only way you know there are good decisions is because there are bad decisions.
Well that's just silly. You don't need to be a genius to know that running a red light is dumb/dangerous and you don't need to test it either. I don't need to actually [chooses random object in field of view] pull a curtain rod off my wall and stab myself with it to know that would be a dumb thing to do. Humans are plenty smart enough to weigh decisions they have never taken.
 
Last edited:
  • Like
Likes Stavros Kiri
  • #68
Orodruin said:
That is an illusion based on most potentially fatal mistakes actually not being fatal. That you arrive safely is correlated with, but not equivalent to, driving safely and making good decisions. For example, failing to pay proper attention when driving across a seldom-used zebra crossing. This is a mistake that is going to go by completely unnoticed until it doesn't. It does not make the mistake any less of a mistake.
I live in a city where there is a zoo. About 50 years ago, a lion escaped and spread terror to the point where it was shot to death. There is still a zoo today, they still have lions. Is it a mistake on my part to not check for lions on my porch before getting out of my house?

Making decisions is all about probability and, yes, not paying attention when driving across a seldom-used zebra crossing is NOT a mistake, from my point of view. The proof lies in the results. But the chances are always there and the hit is inevitable given time. That is why I say about that accident «That person was unlucky, it usually doesn't happen» and not to the thousands of other people who did not have an accident «You were lucky, you could've hit a zebra!» I can assure you that this is how AI would make its decisions as well.
Orodruin said:
When it comes to governing, I believe the last few years have clearly shown that a main flaw in democracy is that people are easy to influence with feelings and emotions based on false or invented facts. If sufficiently advanced and benevolent, I would be prepared to hand government over to an AI.
That is scary. The solution to that problem is to raise people who can make better decisions, not to replace them with machines. Is your solution to poorly educated people to replace them with machines that have better knowledge? Humans are NOT a lost cause. Otherwise there is no point keeping humans alive.
Orodruin said:
You are talking about people handing over their decisions to a machine, removing their own decisions, but the truth is that many people already hand many of their decisions over to others. The only difference being that these others happen to be other humans (most of the time).
That is my point, we are on a dangerous path. One where the common man is seen as an unfit animal, unable to care for itself. I don't believe that. I always felt that we should go towards having more people being able to make decisions in all their life aspects and thus contributing to the society in general, not just waiting for someone (or something) else to decide. That is what democracy is.
Orodruin said:
The way that you weight decisions is by predicting and weighting outcomes against each other. You do not need to do it to know that hitting on 20 when the dealer shows a 6 is a bad decision.
Making decisions is often way more complicated than that. Ask the people of Florida if they should evacuate or not when a hurricane is announced. Not an easy decision to make. How many times are you going to evacuate the entire state «for nothing» before you won't? And when you won't do it, it may be the time you should have. Welcome to life. Can AI do better? I don't think so. The way nature does it is by diversity: some go, some stay, at least one group survives. The «good» decision is unpredictable.
Orodruin said:
I would argue that saving human lives is a good reason.
No lives are ever saved. The best you can do is extend one. In the greater scheme of things, I still fail to see what improvement it makes to a form of life, human race or any other. I guess it is these «feelings and emotions» that you were talking about that influence you. I wonder if you would appreciate a machine making decisions for you with that cold and objective attitude? After all, I'm a human being and I already have those thoughts. You better hope I won't be the programmer behind the next generation of AI.
russ_watters said:
Mostly what we want machines to do for us are the things that are too hard or boring or dangerous. It doesn't get in the way of us living, it frees us to do the living we really want to do.
Oh! I love this one! What is it we really want to do? What if what I really want to do is driving? Will I be allowed or will I be forbidden to do it because it is considered too dangerous by many, too afraid I will kill them? Will I be considered abnormal for thinking driving is fun and not boring? What am I supposed to really want to do?

Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
 
  • #69
russ_watters said:
But heck, I bet the first few times the engineers ran simulations of the accident after the fact, the computer reported to them that no accident happened.

Star Trek - Season 4 Ep 5 "Remember Me"

Tesla, what was that noise?
Explosive decompression of passenger compartment and passenger.
Cause?
Design Flaw: there appears to be no structure above the mirrors.
 
  • Like
Likes Stavros Kiri and russ_watters
  • #70
jack action said:
Oh! I love this one! What is it we really want to do? What if what I really want to do is driving?
Is that true or are you just asking for the sake of argument? We're not discussing an abstract hypothetical here, we're discussing a likely near-term reality about a daily task people do that most would rather not. Really. As @Greg Bernhardt said earlier, I can't wait for the time when I don't have to do the mind-numbing task of driving to and from work in traffic for an hour+ a day. I'd much rather be taking a nap, watching TV, reading a book, using PF, etc.
Will I be allowed or will I be forbidden to do it because it is considered too dangerous by many, too afraid I will kill them?
1. There is a wide gulf between "possible" and "mandatory". For a very long time - perhaps until after we're both dead and perhaps forever - self-driving will be a feature we can choose to buy and use or not. Like a washing machine. I would bet money that most who can afford it will choose to buy it (and get rewarded with lower insurance rates). So I don't think the scenario you present (of self-driving being mandatory) is a reflection of any potential reality that is on the table.
2. I like driving too. Sometimes. In certain contexts. So perhaps instead of commuting to work (boring, stressful), I'll go to a test track on weekends and scratch the itch that way.
Will I be considered abnormal for thinking driving is fun and not boring? What am I supposed to really want to do?
That's meaningless and irrelevant here. Mature, free thinking adults don't have to ask if what they want to do is considered "abnormal". The question is simply what do you and most people want.
Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
No, I haven't. That's just silly. Why would I choose to do something more boring if there was something less boring and more enjoyable I could do? I don't go thinking to myself: "Hmm...should I watch a movie right now or not turn on the TV and stare at the blank screen?"
 
  • Like
Likes Orodruin
  • #71
jack action said:
Making decisions is all about probability and, yes, not paying attention when driving across a seldom-used zebra crossing is NOT a mistake, from my point of view.
This is just silly. Lions do not cause tens of thousands of deaths in the US every year. Besides, regardless of whether you consider it a mistake or not, it is actions like that that cause accidents, and autonomous cars will not make them to the same extent as human drivers. (Add to that the limited cognitive abilities of humans, such as only being able to see in one direction at a time.)

jack action said:
No lives are ever saved. The best you can do is extend one. On the greater scheme of things, I still fail to see what improvement it does to a form of life, human race or any other.
Cute. Let's stop using medicine and surgery, it does not save lives anyway. Would you accept your doctor saying "this cyst will kill you if left untreated, but you will die from something else otherwise, so we won't treat it"? Now who is the cynic here?
 
  • Like
Likes ISamson
  • #72
jack action said:
It's really the «Bring on autonomous cars ASAP!» comment that bothers me. I don't believe it is the Holy Grail. I'm not even sure the problem it's supposed to solve is that big of a problem. Most people don't make bad decisions when behind the wheel. It is even far from being a major cause of death for the human race. Statistics, right?
That's the problem isn't it.
 
  • #73
Greg Bernhardt said:
A mother and her 11 month old child were t-boned and killed at an intersection a couple blocks from me yesterday.
Orodruin said:
The obvious point being made was that it would not have happened if the car was autonomous.
The obvious point being made is not obvious at all...
Unless, what you actually meant was...

It would not have happened if the mother's car was autonomous, and could have recognized the fact...
Greg Bernhardt said:
It was a drunk driver going 70 through a red in a 35mph road.
... and would have stopped her car, before the intersection collision.

Yes...
Greg Bernhardt said:
Would a machine allow that?
Both... a drunk driver going 70 through a red in a 35mph road, and stopping another car before an intersection collision, even if a light were green...

Everybody seems to be obsessed with the drunk driver here, but remember... it's a "two-way street".
 
  • Like
Likes Greg Bernhardt
  • #74
Brilliant... . :thumbup:
DaveC426913 said:
Star Trek - Season 4 Ep 5 "Remember Me"

Tesla, what was that noise?
Explosive decompression of passenger compartment and passenger.
Cause?
Design Flaw: there appears to be no structure above the mirrors.
 
  • Like
Likes Stavros Kiri
  • #75
OCR said:
Everybody seems to be obsessed with the drunk driver here, but remember... it's a "two-way street".
I believe you are wrong. It would have been sufficient for the drunk's car to be autonomous. Replacing that car would have been sufficient to avoid the accident. Of course, if you replace all cars by autonomous ones, also the mother's would have been.

What would happen if you just replaced the mother's car is a different question. Would the car be able to avoid the collision? I believe the answer to that is that it would be more likely to avoid the collision than a human driver. Autonomous cars are taught to recognise and act on threats that are out of the ordinary (such as a driver running a red light) as well as obeying traffic rules themselves. The advantage in favour of the autonomous car comes from its ability to perceive the entire traffic situation, not just what they happen to be looking at.
 
  • Like
Likes Stavros Kiri and russ_watters
  • #76
I think some here are in danger of committing the Nirvana fallacy - throwing out the whole solution because it is not perfect.

There are surely some edge-case scenarios that an automated system won't be able to handle. Some of those scenarios won't be handled much better by humans, either.

But the key is that the vast majority of accidents occurring in real-life situations are better avoided by an automated system than by a human.

OK, granted, that has yet to be borne out by statistics, but still, let's not make the mistake of being penny-wise but pound-foolish when it comes to accidents averted and lives saved.
 
  • Like
Likes Stavros Kiri, russ_watters and Orodruin
  • #77
I'd also argue that it is not so much about people making "mistakes"; it is more that the accepted pace of driving has exceeded our reaction times.

As everyone is aware, at 70 mph a car covers just over 100 feet every second. Add to that perception delay, then decision and reaction delay.

There's no way humans can avoid fatal accidents at this pace - even if they make no mistakes. AIs can.
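The arithmetic above can be sketched quickly. This is a back-of-the-envelope illustration only: the 1.5 s human delay and 0.1 s machine latency are assumed round figures for the sake of comparison, not measured values.

```python
# Distance covered before braking even begins, at highway speed.
MPH_TO_FTPS = 5280 / 3600  # feet per second per mph

speed_mph = 70
speed_ftps = speed_mph * MPH_TO_FTPS  # ~102.7 ft/s

reaction_time_s = 1.5    # assumed human perception + decision + reaction delay
machine_latency_s = 0.1  # assumed sensor-to-actuator latency for an automated system

human_gap_ft = speed_ftps * reaction_time_s
machine_gap_ft = speed_ftps * machine_latency_s

print(f"{speed_mph} mph = {speed_ftps:.1f} ft/s")
print(f"Human reaction gap:   {human_gap_ft:.0f} ft")
print(f"Machine reaction gap: {machine_gap_ft:.0f} ft")
```

Under these assumptions the human covers roughly 150 feet (about ten car lengths) before touching the brake, while the machine covers about a tenth of that.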
 
  • #78
I voted "yes". I would very much like to vote "no", if there were a single chance that people would eventually start respecting the driving rules and making the right decisions at the right time, so that we wouldn't have to let technology take the power out of our hands. But as we all know, this is impossible. Emotions are a great thing that all humans have, but not, in most cases, when it comes to driving.

Talking specifically about my country, I think there is a whole bunch of things to be done for driving habits to reach a decent level. I'm a motorcyclist and I really have seen many poor decisions on the road over thirty-five years. Of course, I'm not perfect either; nobody is. But I can frankly say that I respect the rules, and since I am in a far more vulnerable position compared to a car, I always try to foresee what the driver of the next car, or the one behind me, intends to do, along with a whole lot of other "what if" things.

So I think self-driving cars would be a really important thing, but, as has already been mentioned several times in the thread, there is a lot of work to be done before I feel safer. I believe that time will definitely come.
 
  • Like
Likes Stavros Kiri
  • #79
QuantumQuest said:
... if there was a single chance that people would eventually start to respect the driving rules and take the right decisions at the right time when driving...
Problem is, it's worse than that.

The skill required for driving will only go up as speeds increase, traffic gets ever heavier, and more roads are laid.
So, even if drivers suddenly started operating at the required level today, it's a losing battle: the target is moving.
Driving already exceeds our current abilities, and it's only going to demand more with each passing year.

I'm a die-hard driver, so I'm a dinosaur. But I do see a day when traffic will ease, as more and more personal cars on our roads are replaced with a combination of automated cars, bicycles and better mass-transit solutions (such as taxi-like services).
 
  • Like
Likes QuantumQuest
  • #80
russ_watters said:
No, I haven't. That's just silly. Why would I choose to do something more boring if there was something less boring and more enjoyable I could do? I don't go thinking to myself: "Hmm... should I watch a movie right now, or not turn on the TV and stare at the blank screen?"
I think he* probably means that it is an inclination (tendency) that comes from within us, if we are kind of saturated and find more and more things boring ...

* I am referring to @jack action

(
jack action said:
Have you noticed that the more we find things boring, the more we find the things we replaced them with, boring as well?
)
 
Last edited:
  • Like
Likes jack action
  • #81
russ_watters said:
Really. As @Greg Bernhardt said earlier, I can't wait for the time when I don't have to do the mind-numbing task of driving to and from work in traffic for an hour+ a day. I'd much rather be taking a nap, watching TV, reading a book, using PF, etc.
russ_watters said:
... using PF ...
:thumbup::smile:

+ Time is valuable. Driving is good too, but traffic jams etc. are just a waste of time! ...
Autonomous cars are perfect for overlapping time: taking care of tasks and business, as well as science and PF, while driving ... We'll have more PF posts then! ...
 
  • #83
Ivan Samsonov said:
https://phys.org/news/2017-10-self-driving-cars-future-mobility-disabled.html

However, driverless cars might be good for disabled people.
The way humans make mistakes and cause accidents, we all act like we're disabled sometimes! ... (e.g. when very tired or exhausted). [Of course, humans and humanity keep improving - we are a good, promising species! ...]
So good for everyone, too(?) ...
 
  • #85
Good thing the company isn't Wham-O!
 
  • Like
Likes Greg Bernhardt
  • #86
Sorry if this is OT and a new post could be made, but, is safety the only issue one should consider re self-driving cars? No one seems to want to bring the ugly side that comes with the glamour of "disrupting" : the people who end up displaced. What about taxi drivers, bus drivers, etc?
 
  • #87
It's the truck driver's fault by law, but if a human driver did what the driverless shuttle did, we would call him a brainless deer-in-the-headlights idiot for not listening to the passengers screaming for him to avoid the backing truck.

https://www.huffingtonpost.com/entry/driverless-shuttle-hit-by-truck_us_5a0371bfe4b03deac08af3db
Passengers onboard the vehicle confirmed that the shuttle did stop ― but in the path of the truck.

“The shuttle just stayed still and we were like, ‘Oh my gosh, it’s going to hit us, it’s going to hit us!’ and then, it hit us!” passenger Jenny Wong told KSNV. “And the shuttle didn’t have the ability to move back, either. Like, the shuttle just stayed still.”
 
  • #88
WWGD said:
No one seems to want to bring the ugly side that comes with the glamour of "disrupting" : the people who end up displaced. What about taxi drivers, bus drivers, etc?
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?
 
  • Like
Likes Stavros Kiri and russ_watters
  • #89
Greg Bernhardt said:
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?

Yes, but jobs are now being disrupted at an ever-increasing rate. This will not be the same scale of job disruption as in the past. I see the potential for a serious crap storm in the near future if steps are not taken.
 
  • #90
WWGD said:
Sorry if this is OT and a new post could be made, but, is safety the only issue one should consider re self-driving cars? No one seems to want to bring the ugly side that comes with the glamour of "disrupting" : the people who end up displaced. What about taxi drivers, bus drivers, etc?
The fact that a taxi driver won't be able to drive a taxi anymore only means he will have more time to find a cure for cancer or increase the efficiency of solar panels.

If a person lives, they create demand for something. If there is demand, there is work. That is why technology or immigration will never create a job shortage. Ever.

But I'll admit that it can be difficult to change old habits and break a well established routine.
 
  • Like
Likes Stavros Kiri
  • #91
Greg Bernhardt said:
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?
Agreed. I think the fact that the job losses are specific and complete adds a visibility that makes the downside seem worse -- as opposed to, say, the loss of secretarial jobs to PCs, which was a fraction of a larger pool. But I don't consider the disruption worse if it eliminates 10 million out of 10 million jobs vs 10 million out of 50 million (made-up numbers for illustration). It's still 10 million people who need to find new jobs.

...The one caveat I'd put on that though is if it is 10 out of 50, you may have a chance to keep your job via good performance, whereas if it is 10 out of 10, you will lose your job no matter how good you are at it. But when it comes to unskilled labor, there isn't really such a thing as "being good at it".

There is an ongoing debate in the US about job skills: in an open thread right now, a user is arguing we need more higher education, including a degree above a PhD. On the other end is Mike Rowe, who in effect argues that fewer people should go to college and more should take up skilled blue-collar work. I think there is room for a nuanced view of both (more blue-collar work and more usable bachelor's degrees).

As a society, the USA tends to look at the issue backwards for some reason. What the USA needs fewer of is non-skilled jobs like taxi drivers, burger flippers and WalMart greeters. We shouldn't bemoan the loss of these jobs; we should celebrate it! The real problem is that these jobs are "needed" at all: there are better-skilled jobs available for the taking, but there are 25 million(!) adults who lack even a high school diploma to go after something better. That's the real problem we should be focusing on (or not? Whose fault is that, anyway?).

Holding back progress in order to provide unskilled work to people who haven't held up their end of the bargain isn't something I favor, and I think more automation will help by providing a kick to those who need it.
 
  • #92
Spinnor said:
Yes, but now jobs are being disrupted at an ever increasing rate. This will not be the same scale of job disruption as in the past. I see the potential for a serious crap storm in the near future if steps are not taken.
Do you have any references/statistics on the current state of disruption? I hear a lot of people predicting increasing future disruption (as they have - incorrectly - since the start of the industrial revolution), but I don't think I've ever seen evidence of a current problem. An awful lot of people came out of the "Great Recession" pessimistic, believing "this one will be different", but despite a slow recovery we're now pretty much back to where we were during the over-inflated '90s, with a lower unemployment rate than at any other time in the 2000s (since 2000 itself). There are some caveats to that (part-time workers and demographics shifting toward retiring baby boomers), but I don't see anything in the data that suggests an automation-caused unemployment problem.
 
  • #93
Greg Bernhardt said:
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?
It might be that the final goal is eventually to replace humans in difficult, risky, heavy or routine jobs with machines, robots and A.I. ... and to allow humans (in a better future society) to enjoy the benefits and life as it is, or to have time to pursue other, more advanced and higher-level quests ...
I'm OK with that! ...

People losing jobs is an issue of course (in the transitional stages of society), but no transition is easy ...
Plus more jobs are always created, as others have said.
 
  • Like
Likes russ_watters
  • #94
Greg Bernhardt said:
I am always confused by this sentiment. Every new innovation threatens the old way. Do we stop progressing?
There is a difference between incremental progress and its extreme version of disruption. Plenty of room in-between.
 
  • #95
WWGD said:
There is a difference between incremental progress and its extreme version of disruption. Plenty of room in-between.
So you are advocating that US regulators should somehow step in and slow US autonomous car research in public/private companies to give taxi drivers time to learn new skills?
 
  • Like
Likes russ_watters
  • #96
Greg Bernhardt said:
So you are advocating that US regulators should somehow step in and slow US autonomous car research in public/private companies to give taxi drivers time to learn new skills?
No, nothing nearly that radical. Just that the effects of disruption be considered, and not just the glamorous aspects of it.
It is relatively easy to adapt to incremental changes, but not so easy to change careers when disruption happens. That's all (folks).
 
  • #97
WWGD said:
No, nothing nearly that radical. Just that the effects of disruption be considered, and not just the glamorous aspects of it.
It is relatively easy to adapt to incremental changes, but not so easy to change careers when disruption happens. That's all (folks).
I hear you, and I think they are being considered; a Google search reveals that. The problem is that the market does not accept "unnecessary" incremental steps: it wants the best and the cutting edge whenever possible. Thus regulators would be needed to enforce incrementalism, and the problem with that is that the US market suffers when other countries don't follow suit and advance past it.
 
  • #98
jack action said:
The fact that a taxi driver won't be able to drive a taxi anymore only means he will have more time to find a cure for cancer or increase the efficiency of solar panels.

If a person lives, it must create a demand for something. If there is a demand, there is work. That is why technology or immigration will never create a job shortage. Ever.

But I'll admit that it can be difficult to change old habits and break a well established routine.
How many taxi drivers have degrees and/or advanced knowledge of solar engineering or biotech? EDIT: I am not saying it is impossible to make the transition, but something must be done to facilitate it; it is not likely to happen without special programs.
 
  • #99
WWGD said:
How many taxi drivers have degrees and/or advanced knowledge of solar engineering or biotech? EDIT: I am not saying it is impossible to make the transition, but something must be done to facilitate it; it is not likely to happen without special programs.
Most have some knowledge of how to repair or maintain cars. E.g. they could contribute to self-driving car maintenance, an important issue, IMO.
 
  • #100
Stavros Kiri said:
Most have some knowledge of how to repair or maintain cars. E.g. they could contribute to self-driving car maintenance, an important issue, IMO.
True, good point, although many modern cars have become too complicated and computerized, requiring very specialized knowledge that older cars did not. Besides, today staying up to date and improving one's education is much easier with just an internet connection.
 
  • Like
Likes Stavros Kiri