Do you feel safer with self-driving cars on the road?

AI Thread Summary
The discussion centers on the safety perceptions of self-driving cars compared to human drivers. Participants express skepticism about the current capabilities of AI in anticipating complex driving situations, emphasizing that while self-driving cars may statistically reduce accidents, they are not yet widespread enough to enhance overall safety. Concerns are raised about the limitations of sensors and the unpredictability of human behavior, which can lead to accidents that AI may not effectively manage. Some participants look forward to future advancements in self-driving technology, believing that with time, these vehicles could significantly improve road safety. Ultimately, the consensus leans towards cautious optimism, with many agreeing that while self-driving cars may be safer in theory, they do not yet feel comfortable relying on them.

Do you feel safer with self-driving cars on the road?

  • Yes: 31 votes (41.3%)
  • No: 37 votes (49.3%)
  • No opinion: 7 votes (9.3%)
  • Total voters: 75
  • #151
jack action said:
I understand your fear about a human driving a car. But what makes you think it is better with AI?
An AI respects traffic laws. It has quicker and better judgement than any human. Its face recognition could determine the age of anyone nearby and, if it were involved in a huge accident mess, make better judgements about whom to save, i.e. let the younger people live, and possibly the female ones. The life of a 13-year-old boy being worth more than that of a 113-year-old lady, at least according to the AI, would make sense to me. I know this is highly debatable, and I'm just giving my subjective opinion here. To me this is much better than a human who doesn't have time to think about whom to save and whom to sacrifice, simply for lack of time to think.

jack action said:
If you had almost been hit intentionally by a car driven by AI two days ago, or if your aunt had been in an accident involving an AI malfunction instead of an adolescent "malfunction", would it make you feel better?
If this implies the AI did that on purpose in order to save others' lives, which according to its algorithms had a higher priority, then yes. I would accept the result (unless I'm dead, of course), but these events should be extremely rare.

jack action said:
No matter what, a car will always be a "weapon", and the fact that "humans are pretty bad at using it" (IMHO, with the many millions of trips made each day around the world, I think they have a pretty good record) will never mean AI is pretty good at using it.
I agree with the sentence, but I do not agree with the conclusion: that over a million people worldwide die each year in car accidents is not acceptable. This is a pretty bad record. Most of these lives could have been saved.

jack action said:
One thing's for sure: AI will NEVER care about other's lives.
Why do you think so? Is caring about people's lives something that cannot be coded/programmed?
 
Last edited by a moderator:
  • #152
fluidistic said:
The AI respect of the traffic laws. A ...
Value judgements about whose life is more important lead nowhere except into a rat's nest of ifs and buts. It is certainly not as simplistic as in the movie I, Robot ("Save the girl!"). If one wants to give a particular AI unit the "power" to decide on terminating or extending one person's life out of several individuals in a precarious situation, then that unit will also have to be accountable for the decision made. For human individuals (and other entities granted legal status), that accountability comes through legal means.
Granting AI units legal status as persons respecting the laws of the land would also have to extend to prosecuting them under the law, be it criminal or civil. I doubt we are there yet, or ready for that. AI certainly is not.

Traffic laws are put in place by human decision making, and as such are not a panacea.
Humans have decided, whether correctly or not, that the time-factor risk in their lives matters (not saying equally) just as much as the fatality risk, along with many other risks.

Even without AI, we could reduce fatalities right now:
  • Reduce maximum speed limits to such an extent that any collision is just a fender bender.
  • Put stop signs at every intersection.
  • Put a traffic cop at every intersection.
  • Put a helmet on every person in the car.
  • ... etc.
We don't do that simply because we value time and money, and somewhat overlook the chances of being hurt in a car or on the road.
( Bicyclists, generally, are probably the worst at devaluing the risk of being on the road, being at the losing end of any incident, whether with a vehicle or an obstacle on the road. )

To make the commute and the pleasure drive safer, vehicles have generally gained features such as seat belts, air bags, and child seats for occupant protection. The vehicles themselves have gained better tires, crumple zones, better lighting, and all sorts of other things for handling, visibility, and endurance. An AI vehicle is an extension of these safety features, not a replacement. ( Or is it? If vehicles become so collision-avoidant with an AI calling all the shots, some of the other features could be compromised to lower cost, or even eliminated entirely: why pay for redundant headlights if the AI doesn't need them to see at night? Or taillights? Every smart AI should know the car ahead is slowing down through its lidar and act accordingly. Minimally, a few running lights to indicate presence. ) All one has to do is ensure that all systems on an AI vehicle are maintained and operate properly for the vehicle to be able to join the rest of the traffic.

Since maintenance and proper operation of electrical circuits are suspect for vehicles, and for all things electrical and mechanical at present, can the reliability of an AI vehicle be ensured with a 100% guarantee that things will not go wrong? Is 99% OK?
98%? How about 80%? Well, if it's a 7-year-old car, 60% should be OK. After all, things do get old and wear out.

Since I don't know where this is all heading or where society will place controls, I voted No. While I might be a passenger in a self-driving car in good operating condition, can the same be said of the one next to me? Or maybe my car will be the lemon among the others. Who is to tell?
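The percentages above invite a quick back-of-the-envelope check: even a high per-trip reliability compounds over a year of trips. A minimal sketch, with entirely hypothetical numbers:

```python
# Hypothetical illustration of how per-trip reliability compounds over a
# year of driving. None of these figures are measured failure rates.

def prob_no_failure(per_trip_reliability: float, trips: int) -> float:
    """Probability of zero failures over `trips` independent trips."""
    return per_trip_reliability ** trips

# One car making two trips a day for a year (~730 trips):
for r in (0.99, 0.999, 0.9999):
    print(f"per-trip reliability {r}: "
          f"P(no failure in 730 trips) = {prob_no_failure(r, 730):.4f}")
```

At 99% per-trip reliability, at least one failure within the year is a near certainty; even 99.99% leaves roughly a 7% chance of one, which is why "Is 99% OK?" is not a rhetorical question.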
 
  • Like
Likes jack action
  • #153
256bits said:
Even without AI, we could reduce fatalities right now:
  • Reduce maximum speed limits to such an extent that any collision is just a fender bender.
  • Put stop signs at every intersection.
  • Put a traffic cop at every intersection.
  • Put a helmet on every person in the car.
  • ... etc.
The problem is that some humans do not respect the traffic laws. If you put a camera on a busy road, it won't take much time before an unpunished infraction is committed. So even though it's correct that, even without AI, we could reduce fatalities using such severe methods, an AI would outperform us. And ultimately that is what we care about. Is it better to let humans handle weapons they are bad at using, or to let a very sophisticated AI do that job?
 
  • #154
fluidistic said:
If you put a camera on a busy road, it won't take much time before an unpunished infraction is committed.
Committing an infraction does not mean you did something dangerous. The facts show that most infractions committed (anyway, the kind you refer to) don't cause any accident. If the contrary were true, nobody would commit them. People go over the speed limit because you don't have an accident every time you do. People don't drive on the wrong side of the road because a terrible accident is almost guaranteed in that case.

And that is what I don't understand about this way of thinking: the punishment for such actions is having an accident that results in death or injury. So why punish someone when nothing happens? Nobody in their right mind will willingly do something to hurt themselves. But there are risks with every decision. Anyone should be entitled to assess the risks that go with their actions. And, yes, our actions have an impact on our surroundings. But people around us should adapt their decisions accordingly. As a society, we should educate kids about how to evaluate those risks, and the best method is leading by example. Once they've learned a way, you have to accept it and deal with it, even if it wasn't what you expected of them. At worst, you should consider yourself a bad teacher, not them bad students.

I don't like having my life guided by the fears of others. You want a car driven by AI because you think it makes better decisions than you? I respect that decision and I might follow your path. You want your neighbors (whom you might not even know) to use driverless cars because you don't trust that they can drive a car correctly? I think you are overstepping your bounds. Maybe you're right and one of them could involve you in a terrible accident, but, from my point of view, your only option is to thank God you have a super smart car that can identify human-driven cars and avoid crossing their path for your protection.

If I'm afraid my neighbor will steal from me, I get an alarm system, I hire a security guard, I buy a dog. What I don't do is ask everyone to wear an ankle bracelet that will give them an electric shock if they come to my house without my consent; you know, because there are thieves everywhere, so why take a chance? If they don't come to my house, they won't get shocked, so what's the harm? Do you see the difference between controlling what you do and controlling what others do? In a society where everyone is equal, you cannot act as if you are better than others.

You should have the right to act as you see fit with your surroundings, not to choose how others act around you.

Oh yeah, and if you also think that there is such a thing as an infallible law, especially one made by humans, you will be disappointed sooner or later.
 
  • Like
Likes OCR
  • #155
jack action said:
Committing an infraction does not mean you did something dangerous. The facts show that most infractions committed (anyway, the kind you refer to) don't cause any accident.
I agree with you.

jack action said:
If the contrary were true, nobody would commit them. People go over the speed limit because you don't have an accident every time you do. People don't drive on the wrong side of the road because a terrible accident is almost guaranteed in that case.
I do not agree with you. From an AI's point of view, people do cross over to the wrong side of the road, and terrible accidents that could have been prevented do happen. They are rare but not unheard of. An AI would not do that.
jack action said:
So why punish someone when nothing happens?
Because if the law is not enforced, it is useless. You would want to punish a man who opens fire in a crowded street but, by chance, harms nobody, right? That's a good example of why punishing someone who breaks the law without harming anybody makes sense.

jack action said:
Nobody in their right mind will willingly do something to hurt themselves.
The problem is, not everybody is in their right mind. All AIs could be.

jack action said:
I don't like having my life guided by the fears of others. You want a car driven by AI because you think it makes better decisions than you? I respect that decision and I might follow your path. You want your neighbors (whom you might not even know) to use driverless cars because you don't trust that they can drive a car correctly? I think you are overstepping your bounds. Maybe you're right and one of them could involve you in a terrible accident, but, from my point of view, your only option is to thank God you have a super smart car that can identify human-driven cars and avoid crossing their path for your protection.
No. I want a self-driving car to drive me because it is safer. Whether it makes better and quicker decisions than me, I do not care. I want my neighbors to do the same because they are a threat to society. Just as with anti-vaxxers who refuse to vaccinate their children: it is well known that this harms people who cannot be vaccinated or who have very weak defenses against illness. These anti-vaxxers are a threat to society, and it is a good thing that the law obligates them to vaccinate their children.
If cars were only a threat to those who drive them, I would have zero problem with human drivers. But again, they are a threat to innocent people who aren't even in the car; that is the big problem that AI could tackle better than humans.

jack action said:
Do you see the difference between controlling what you do and controlling what others do? In a society where everyone is equal, you cannot act as if you are better than others.
I do not understand the "you are better than others" part. You live in a society; there are rules against harming others, and these are the only "restrictions" you're obligated to follow. That does not go very far into "controlling what others do."
jack action said:
Oh yeah, and if you also think that there is such a thing as an infallible law, especially one made by humans, you will be disappointed sooner or later.
I do not think there are infallible laws. Nor do I think that AI could reduce the number of deadly accidents to 0. I do believe, however, that they could do a better job than humans at reducing car accidents, which is the topic of this thread.
 
  • #156
fluidistic said:
I do not think there are infallible laws. Nor do I think that AI could reduce the number of deadly accidents to 0. I do believe, however, that they could do a better job than humans at reducing car accidents, which is the topic of this thread.

That's not exactly the topic of this thread. The thread's question is in the present tense, not about some hypothetical future possibility.

The distinction is important, because between the present and the future is a long transition.

My wife and I earn our living in consulting. Since the downturn in DoD funding in 2012, most of that consulting work has been in legal cases involving injury, and most of those injuries come from vehicle accidents. Two trends stand out to us: 1. Most serious accidents are due to inattentive or impaired drivers. 2. Most seven-figure legal cases involve older vehicles, 7-15 years old. So even if self-driving cars solve the problem of inattentive or impaired drivers (for those who use them), they won't be a great benefit to general auto safety until all (or the great majority of) those older, non-self-driving vehicles are off the road.

One issue that comes to mind relative to the long transition is the cost. Compared with the range of costs associated with owning and driving traditional vehicles, what is the range of costs associated with owning and driving self-driving vehicles? Requiring every driver to pay these increased costs is going to significantly reduce the number of people who can drive - it will reward the rich and punish the poor.

There are likely better paths to improved safety that don't force millions of drivers off the road. Since insurance is already a universal requirement, insurers can use relatively inexpensive technologies (a hundred dollars or so) to record driving habits and raise rates for inattentive, unsafe, and impaired drivers. Higher insurance rates will have the effect of positive behavior modification, which will achieve improved safety. Right now the lead times are too long, with insurance companies waiting until a driver has tickets and accidents to raise their rates or cancel their policy. These technologies could be made universal in 2-5 years, whereas you're looking at at least 20 years before self-driving cars would be universal in a place like the US.
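As a sketch of how such insurer monitoring could turn recorded driving events into a rate decision: the device logs discrete events, and the insurer computes a per-mile risk score. The event names, weights, and threshold below are all invented for illustration, not any real actuarial model.

```python
from collections import Counter

# Hypothetical event weights; a real insurer's model would differ.
EVENT_WEIGHTS = {
    "hard_brake": 1.0,
    "speeding": 2.0,
    "phone_use": 3.0,      # proxy for inattentive driving
    "late_night_trip": 0.5,
}

def risk_score(events, miles_driven):
    """Weighted event count per 1,000 miles driven."""
    counts = Counter(events)
    total = sum(EVENT_WEIGHTS.get(name, 0.0) * n for name, n in counts.items())
    return 1000.0 * total / miles_driven

def premium_multiplier(score, threshold=5.0):
    """Leave low-risk drivers alone; raise rates past a (made-up) threshold."""
    return 1.0 if score < threshold else 1.0 + 0.1 * (score - threshold)

events = ["speeding"] * 4 + ["hard_brake"] * 6 + ["phone_use"] * 2
score = risk_score(events, miles_driven=2500)
print(score, premium_multiplier(score))
```

The point of the sketch is the short feedback loop: the premium responds to recorded behavior within a billing cycle, instead of waiting years for tickets and accidents to accumulate.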
 
  • #157
What would happen if there were a malfunction in a self-driving car?
 
  • #158
Dr. Courtney said:
One issue that comes to mind relative to the long transition is the cost.

You should factor in that the price of the technology will likely come down big time. Tens of thousands of lives saved: what is the value of that? Hundreds of thousands fewer wrecked cars and injured people: again, what is the value of that? Reduced insurance premiums, too. And for a while they will keep making dumb cars for those who can't afford self-driving cars, just as they keep making flip phones. I am guessing many poor people have smart phones.
 
  • #159
fluidistic said:
From an AI's point of view, people do cross over to the wrong side of the road, and terrible accidents that could have been prevented do happen. They are rare but not unheard of. An AI would not do that.
You portray humans as full of flaws and AI as some model of perfection. From the manufacturer's point of view:
http://www.bbc.com/news/technology-36783345 said:
The latest crash, near Cardwell, Montana, saw a Model X car swerve to hit wooden rails next to a two-lane road.

"This vehicle was being driven along an undivided mountain road shortly after midnight with autosteer enabled," a spokeswoman told the BBC, referring to autopilot's steering function.

"The data suggests that the driver's hands were not on the steering wheel, as no force was detected on the steering wheel for over two minutes after autosteer was engaged - even a very small amount of force, such as one hand resting on the wheel, will be detected.

"This is contrary to the terms of use that are agreed to when enabling the feature and the notification presented in the instrument cluster each time it is activated.

"As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel.

"He did not do so, and shortly thereafter the vehicle collided with a post on the edge of the roadway.

"Autosteer... is best suited for highways with a centre divider.

"We specifically advise against its use at high speeds on undivided roads."
Even the manufacturer doesn't trust its vehicles on undivided roads and prefers the full attention of a human being instead. AI is not as magical as you tend to present it.
fluidistic said:
I want my neighbors to do the same because they are a threat to society.
fluidistic said:
These anti-vaxxers are a threat to society
They are not threats to our society; they are members of our society. And when members of our society need help (drug or alcohol problems, trust or self-confidence issues, mental problems, etc.), we shouldn't punish them or impose more obligations on them (like owning a driverless vehicle); we should help them. Otherwise, the whole concept of society doesn't make sense; nobody would join. Yes, their actions may cause harm to society, but we should see this as our problem, not as them vs. us. Self-driving vehicles will not resolve the drinking problem of an alcoholic, which has impacts in areas other than driving. Solving (or preventing) the drinking problem solves every problem.
 
  • #160
jack action said:
You portray humans as full of flaws and AI as some model of perfection

Humans are as good as they are going to get; AI will just keep getting better. I don't see how that can fail to happen.
 
  • #161
Spinnor said:
Humans are as good as they are going to get; AI will just keep getting better. I don't see how that can fail to happen.
That is a gratuitous and unfounded statement for both humans and AI.

What is the fate of the human being: to create AI to replace humans? To what end would humans do that? Especially if we assume humans have no respect for life and their surroundings. Why would they care?
 
  • Like
Likes russ_watters
  • #162
Spinnor said:
You should factor in that the price of the technology will likely come down big time. Tens of thousands of lives saved: what is the value of that? Hundreds of thousands fewer wrecked cars and injured people: again, what is the value of that? Reduced insurance premiums, too. And for a while they will keep making dumb cars for those who can't afford self-driving cars, just as they keep making flip phones. I am guessing many poor people have smart phones.

Extraordinary claims require extraordinary proof. Sometimes the price comes down and systems become more reliable. Sometimes not so much.

"Tens of thousands of lives saved." I've seen too much software and too many bugs, especially in real-time systems, to accept this without any real proof. It certainly is a possibility, but there are other possibilities too. And there is zero proof that self-driving vehicles offer more long-term safety than other, much more affordable technological approaches, such as insurers using technology to more closely monitor and raise rates on (or cancel the policies of) inattentive, impaired, and unsafe drivers.
 
  • #163
jack action said:
That is a gratuitous and unfounded statement for both humans and AI.

It seems obvious. Humans evolve on a scale of, what, tens of thousands of years, and technology evolves every year. And are we evolving into better drivers? I doubt it. We could spend more time teaching people to drive better, but many humans are selfish, which leads to many accidents. And on the few occasions when I have had to drive to and in New York City, what a stressful nightmare. I hate it. My brain gets overloaded.
 
Last edited:
  • #164
Dr. Courtney said:
Extraordinary claims require extraordinary proof.

I am sorry, but I don't think my claims are extraordinary.

Dr. Courtney said:
I've seen too much software and too many bugs
We will just have to agree to disagree. It seems that in every task where we pit humans against computers and robots, the machines eventually better us. And when it comes to self-driving cars, we are obviously in the infancy of the technology. I don't see an obstacle, in the near term of say 10 to 20 years, to self-driving cars bettering me, an average driver (below average at night). There is just too much money to be made with this technology.
 
Last edited:
  • #165
Way too many posts on this topic.
Who or what should be penalized when a self-driving vehicle violates a traffic or safety rule?
 
  • Like
Likes Dr. Courtney
  • #166
Spinnor said:
I don't see an obstacle...
It doesn't look like that Model X did either...
BBC News said:
The latest crash, near Cardwell, Montana, saw a Model X car swerve to hit wooden rails next to a two-lane road.

You made a good pun, though... :biggrin:
 
  • Like
Likes Dr. Courtney
  • #167
symbolipoint said:
Way too many posts on this topic.
BBC News said:
...the vehicle collided with a post...
You made a good pun, too... :biggrin:
 
  • Like
Likes Dr. Courtney
  • #168
Spinnor said:
It seems obvious. Humans evolve on a scale of, what, tens of thousands of years, and technology evolves every year.
Humans have been evolving for millions of years; technology is only a few decades old. Everything goes fast once a «key» has been found, but it stabilizes rapidly. Compare an airplane from 1905 with one from 1930: they are nothing alike. Then compare airplanes from today with ones from 25 years ago: not many differences.
Spinnor said:
It seems that in every task where we pit humans against computers and robots, the machines eventually better us.
Are there so many computers that make better decisions than humans right now? Sure, they are tools built by humans to give themselves leverage in executing certain tasks, like a hammer that multiplies the impact force any human arm can produce, or an engine that can produce more work than any human who "asks" it to. Computers are only another man-made machine, one that calculates very fast when a human "asks" it to, nothing more. To my knowledge, nobody thinks an engine is better than a human being.

A human can build wonderful things with the powerful machines he has conceived, like bulldozers and such. But with a wrong decision, he can destroy in incredible ways as well. With computers, he can make good decisions faster by analyzing already stored decisions or protocols, but he will also be able to make bad decisions faster. Computers don't make decisions and have no morals; the people who build and use them do.
Spinnor said:
There is just too much money to be made with this technology.
Amen to that.
symbolipoint said:
Way too many posts on this topic.
Who or what should be penalized when a self-driving vehicle violates a traffic or safety rule?
Well, just read a few posts before yours and you'll find that, apparently, AI is the only human-made machine that will be perfect, won't have bugs, and will answer correctly every moral dilemma humankind could face. Therefore your question is irrelevant. :wink:
 
  • #169
jack action told me:
Well, just read a few posts before yours and you'll find that, apparently, AI is the only human-made machine that will be perfect, won't have bugs, and will answer correctly every moral dilemma humankind could face. Therefore your question is irrelevant. :wink:
NO. Not irrelevant! You remember the H.A.L. 9000? You really believe human engineering will make such a perfect machine?
 
  • #170
symbolipoint said:
You really believe human engineering will make such a perfect machine?
Well... HAL did. :wink:
 
  • #171
symbolipoint said:
NO. Not irrelevant! You remember the H.A.L. 9000? You really believe human engineering will make such a perfect machine?
Tongue in cheek it was that response think I.
 
  • Like
Likes jack action
  • #172
OCR said:
You made a good pun, though... :biggrin:

What is important is accidents per thousand miles driven; self-driving cars can only improve. Humans are stuck at sucky. Self-driving cars in accidents will get a lot of press. The hundreds of people who died in auto accidents over the past week won't.

I went for a delightful bike ride at sunset. Part of the ride took me on a new bike path built next to a new bypass highway. In someone's infinite stupidity, they erected a wooden post-and-rail fence instead of a guard rail next to the road. It is quite a joke, and it reminds me every time I drive the road how bad drivers are. In the 8-mile stretch of the road there must be 50 or more places where humans could not keep their car on the road and took out sections of the fence. Self-driving cars are not a matter of if but when.
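The per-mile metric in the first paragraph is easy to state precisely. The figures below are made up purely to show the comparison; any real comparison would also have to match road types, weather, and disengagement policies.

```python
# Made-up figures, purely to illustrate the metric mentioned above.

def accidents_per_million_miles(accidents: int, miles: float) -> float:
    """Normalize accident counts by exposure (miles driven)."""
    return 1e6 * accidents / miles

human_rate = accidents_per_million_miles(accidents=2, miles=1_000_000)
av_rate = accidents_per_million_miles(accidents=1, miles=2_000_000)
print(human_rate, av_rate)   # 2.0 0.5
```

Normalizing by miles driven is what makes a fleet of few, heavily driven autonomous cars comparable to millions of lightly driven human ones; raw accident counts alone say nothing.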
 
  • #173
jack action said:
Compare an airplane from 1905 with one from 1930: they are nothing alike. Then compare airplanes from today with ones from 25 years ago: not many differences.

Yes, self-driving cars at first will suck, just like the early airplanes did. In time they will be nearly perfected. But I don't think humans are getting any better at driving on average; machines, on the other hand, will get better. How good do self-driving cars have to get to make everyone here feel good about them? Would you rather have some drunk flying down the road towards you, or a self-driving car? Some here sound like they would rather take their chances with the drunk.
 
  • #174
Spinnor said:
What is important is accidents per thousand miles driven; self-driving cars can only improve. ...
Spinnor said:
Yes, self-driving cars at first will suck, just like the early airplanes did. In time they will be nearly perfected. ...
Too many people are already employed to drive. Driverless vehicles will mean many unemployed truck and taxi drivers.
 
  • #175
Spinnor said:
Yes, self-driving cars at first will suck, just like the early airplanes did. In time they will be nearly perfected.

Perhaps, but the question is in the present tense, and I don't see self-driving cars that everyone can afford getting where they need to be for 20-50 years. Lots of poor folks drive cars worth $500-$2000. Are self-driving cars ever going to be in that price range (or the inflation-adjusted equivalent)?

Spinnor said:
But I don't think humans are getting any better at driving on average

The issue in safety is the lower tail of the distribution, not the average. If insurance companies can cut off the lower tail of the distribution or force them to get better through monitoring and higher premiums, there can be significantly fewer accidents.
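The lower-tail point can be made concrete with a toy distribution: if a small group of drivers has a far higher accident rate, removing just that group cuts total accidents disproportionately. All numbers below are invented for illustration.

```python
# Invented numbers: 1,000 drivers split into three risk groups.
# (label, number of drivers, expected accidents per driver per year)
groups = [
    ("typical", 900, 0.01),
    ("risky",    90, 0.10),
    ("worst",    10, 1.00),
]

def expected_accidents(gs):
    """Total expected accidents per year across the given groups."""
    return sum(n * rate for _, n, rate in gs)

total = expected_accidents(groups)              # 9 + 9 + 10 = 28
without_worst = expected_accidents(groups[:-1]) # drop the worst 1% of drivers
print(f"removing the worst 1% of drivers cuts accidents by "
      f"{1 - without_worst / total:.0%}")
```

In this toy setup, taking the worst 1% of drivers off the road removes more than a third of expected accidents, which is the sense in which targeting the tail beats targeting the average.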

Spinnor said:
Machines, on the other hand, will get better. How good do self-driving cars have to get to make everyone here feel good about them?

Why not have the machine improvements advising the good drivers and reporting the bad drivers to authorities to get them off the road? Why do the machines need control over the vehicles?

Spinnor said:
Would you rather have some drunk flying down the road towards you, or a self-driving car? Some here sound like they would rather take their chances with the drunk.

This would only work if self-driving cars are mandated by law so that all the careless drivers have them. But wait, if the careless drivers obeyed the law, there would be no drunk drivers. You want a new law to solve the problem that people are not obeying an existing law.

When traditional cars are outlawed, only outlaws will have traditional cars.
 
  • Like
Likes symbolipoint
  • #176
symbolipoint said:
Too many people are already employed to drive. Driverless vehicles will mean many unemployed truck and taxi drivers.

I already posted about this; I worry there is great potential for civil strife as a rapidly increasing number of people lose their jobs to technology. One reason I am sure self-driving cars will soon be a reality is that there is much money to be made by the companies that can produce and use this technology. Just as with the threat of global warming, we need to think about the future implications of this technology now.
 
  • #177
Dr. Courtney said:
Perhaps, but the question is in the present tense, and I don't see self-driving cars that everyone can afford getting where they need to be for 20-50 years.

When I entered college only 26 years ago, I got one of the first PCs, a Sinclair something or other. In the short span of 26 years, we have come to talking about self-driving cars, supercomputers that can predict weather many days in advance, planes that can land themselves, and an IBM computer that beat a human at Jeopardy.

Dr. Courtney said:
The issue in safety is the lower tail of the distribution, not the average. If insurance companies can cut off the lower tail of the distribution or force them to get better through monitoring and higher premiums, there can be significantly fewer accidents.

I think the average driver makes one or more claims in their lifetime. I am sure it is the average driver that causes the bulk of insurance claims.

Dr. Courtney said:
Why do the machines need control over the vehicles?

I see them as assisting us, as for example the cars now advertised that will brake or steer if you get distracted. Cars that assist us are already here! They will just get better. So on a trip to NYC, when I get near the city I will ask the car to take over, because I don't like city driving.
 
  • #178
Dr. Courtney said:
This would only work if self-driving cars are mandated by law so that all the careless drivers have them. But wait, if the careless drivers obeyed the law, there would be no drunk drivers. You want a new law to solve the problem that people are not obeying an existing law.

When traditional cars are outlawed, only outlaws will have traditional cars.

At some point self-driving cars will be deemed safe enough that those who can afford them will buy them. In time, self-driving technology may be required on all cars, but that will come much later. Just before the manufacture of smart cars is mandated, there may be a rush to buy dumb cars, and their value might go up significantly. That way the outlaws can have their dumb cars. Dumb cars will be around for a long time; just look at the people who collect Model T Fords.
 
  • #179
Spinnor said:
Would you rather have some drunk flying down the road towards you or a self driving car.
I really don't care, as I've never noticed crossing paths with either, even though I'm sure I have. If I've crossed paths with drunk drivers (and I'm sure I did), there was no noticeable erratic driving. My point is not that it never happens, just that driving impaired (just like not following the law to the letter) doesn't mean an automatic accident, as some people want us to think. The human laws are not based on anything scientific (physics); at best, on a bunch of statistical compilations analyzed by scared humans on a mission.

Although it seems a lot of people in this thread risk their lives every day on the road, I never felt like that. I've traveled on public roads in cars, small trucks, on motorcycles, on bicycles, on foot, and by public transit, and I was warned to watch for the inattentive or irresponsible people who apparently need special attention (when in a car, it's cyclists & pedestrians; when on a motorcycle or as a pedestrian, it's cars & trucks, etc.), and I never noticed anything special and never had to take extra care for anyone. People are people, I know what to expect, I learned to deal with the not-so-unexpected, and I act with confidence (which makes a huge difference in your relationship with others on the road).

I've been with drivers who pointed out what they thought was bad behavior - "Did you see, he cut us off!" - and I was always baffled, as I never saw anything that was unexpected or required applying the brakes. This is what I mean by having one's life driven by fear. I'm afraid (read: terrified) of some stuff too, just not this. And I certainly don't want to live my life according to other people's fears. If driving is such a terrible experience for you, just don't do it (please don't tell me you have to; nobody does).

The goal of anyone should be to get rid of those fears that set you in panic mode, not to nourish them.
 
  • #181
jack action said:
If driving is such a terrible experience for you

I'm not in fear of driving. :smile: I drive slowly, try to be careful, and avoid driving at night or in the city if I can. Knock on wood, I have not had an accident since I was a teenager. I have said that I worry that, while blinded by oncoming traffic, I will not see someone dressed in dark clothes walking too close to the road. My work requires me to drive a work truck around, and since I work for myself, yes, I have to drive. I am only fearful for my children's future.
 
  • #182
Dr. Courtney said:
When traditional cars are outlawed, only outlaws will have traditional cars.
:DD
 
  • #183
Honestly, I would feel safer. Not that I trust them, but I would trust them more than people. I live on a highway, and when it snows I shovel my sidewalk. I've almost been killed multiple times by people veering off onto the side of the road because they're looking down at the cell phones in their laps. Whenever I'm driving, I'll sometimes glance at the person next to me, and probably 8 out of 10 times they're messing with their cell phones. It's a real dirtbag thing to do because it puts others at risk.
 
  • #184
Spinnor said:
Knock on wood, have not had an accident since I was a teenager.

I did, though, have some close calls in city driving. My luck may be running out.
 
  • #185
Spinnor said:
Brace yourselves, it's here.

"Uber to buy 24,000 specially-adapted Volvos in bid to develop fleet of driverless cars"

24,000 cars at over $48,000 each for paid transportation services may not be the first step toward widespread adoption. In any case, for me it's there, not "here," because I am unlikely to drive or ride in any of the cities served.

Uber is interested in this technology because it is cheaper for a company not to pay human drivers, not because it is safer. We're still a long, long way from this uncertain and expensive technology being widely adopted by the vast majority of vehicle owners, who do not pay a driver and who prefer not to spend (or cannot spend) $48,000 on a car.
 
  • Like
Likes jack action
  • #186
Dr. Courtney said:
We're still a long, long way

$50 says that in 7 years self-driving cars will have a better accident rate per thousand miles traveled than legal American drivers taken as an average.

Will you take me up on the bet, assuming I have made it precise enough for your liking? If you want, take drunks out of the equation; they should not be driving.
 
  • #187
Spinnor said:
$50 says that in 7 years self-driving cars will have a better accident rate per thousand miles traveled than legal American drivers taken as an average.

Will you take me up on the bet, assuming I have made it precise enough for your liking? If you want, take drunks out of the equation; they should not be driving.

Comparing with the American average (in anything) is a pathetically low bar.

$100 says the accident rate of self-driving cars will be higher than the rate for my wife and me.

$100 that the average price of self-driving cars will be at least $5000 higher than for traditional, driver-controlled cars.

$100 that the installed base will be less than 10% of private passenger cars in the US.
 
  • Like
Likes jack action
  • #188
Dr. Courtney said:
Comparing with the American average (in anything) is a pathetically low bar.

I don't think that is true. If self-driving cars have a better overall averaged safety record than all American drivers, then with self-driving cars on the road there will be fewer accidents.

Dr. Courtney said:
$100 the accident rate of self-driving cars is higher than the rate for my wife and I.

The above bet won't work. Most people go many years, or even their entire lives, without an accident, so we need the statistics of many people to make meaningful comparisons. And women, I'm guessing, are statistically safer drivers.
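To put numbers on why a single driver's (or couple's) record is too noisy to bet on, here is a minimal sketch using a toy Poisson model. The accident rate below is an assumed, illustrative figure, not an official statistic: the point is only that even a driver with exactly the average rate has close to a coin-flip chance of an accident-free decade, so a clean individual record tells you very little.

```python
import math

# Assumed, illustrative rate: one accident per 150,000 miles on average.
# (Not an official statistic; chosen only to show the shape of the argument.)
RATE_PER_MILE = 1 / 150_000

def p_accident_free(miles, rate=RATE_PER_MILE):
    """Probability of zero accidents over `miles`, modeling accidents
    as a Poisson process with the given per-mile rate: P(k=0) = e^(-rate*miles)."""
    return math.exp(-rate * miles)

# A decade of typical driving at 12,000 miles/year:
decade = p_accident_free(10 * 12_000)
print(f"P(accident-free decade at the average rate): {decade:.2f}")  # → 0.45
```

So under these assumed numbers, nearly half of perfectly average drivers would show a spotless ten-year record, which is why comparisons only become meaningful when aggregated over many drivers and many miles.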

Dr. Courtney said:
$100 that the average price for self-driving cars is at least $5000 higher than for traditional, driver-controlled cars.

A quick Google search shows the above bet would be a bad one to make, depending on our timeframe. On 11/21/2024, 7 years from now, I bet you $100 that the extra cost to make a car self-driving will have dropped below $5000.

Dr. Courtney said:
$100 that the installed base is less than 10% of private passenger cars in the US.

That is a hard one, and I think I will agree with your assessment above: less than 10 percent adoption of self-driving cars in 7 years.
 
  • #189
Spinnor said:
I don't think that is true. If self-driving cars have a better overall averaged safety record than all American drivers, then with self-driving cars on the road there will be fewer accidents.

The above bet won't work. Most people go many years, or even their entire lives, without an accident, so we need the statistics of many people to make meaningful comparisons. And women, I'm guessing, are statistically safer drivers.
You are missing his point. If self-driving cars are above average but below the best human drivers, why would the best drivers want to pay extra to be driven by one?

Then, once everybody realizes that, guess how many people will think they are among the best drivers ...
Spinnor said:
A quick Google search shows the above bet would be a bad one to make, depending on our timeframe. On 11/21/2024, 7 years from now, I bet you $100 that the extra cost to make a car self-driving will have dropped below $5000.
Maybe I can help you save some money. Here is what a quick Google search reveals:
https://www.fastcompany.com/3025722/will-you-ever-be-able-to-afford-a-self-driving-car said:
IHS Automotive forecasts that the price for the self-driving technology will add between $7,000 and $10,000 to a car’s sticker price in 2025, a figure that will drop to around $5,000 in 2030 and about $3,000 in 2035
Spinnor said:
That is a hard one, and I think I will agree with your assessment above: less than 10 percent adoption of self-driving cars in 7 years.
Not to burst your bubble about the future popularity of autonomous cars, but:
https://www.fastcompany.com/3025722/will-you-ever-be-able-to-afford-a-self-driving-car said:
IHS predicts that annual sales between 2025 and 2035 will jump from 230,000 to 11.8 million. That’s about 9% of all the world’s auto sales in 2035. Seven million of those 11.8 million vehicles will rely on a mix of driver input and autonomous control, with the remaining 4.8 million vehicles relying entirely on computers to get around. Combined with vehicles from previous model years, IHS also forecasts that there will be 54 million autonomous vehicles on the road by 2035. When will sales of autonomous cars outnumber those of conventional cars? IHS expects this tipping point to occur by 2050. By then, IHS says the majority of vehicles sold and those in use are likely to be autonomous, with conventional vehicles becoming increasingly rare.
So we are talking 20-30 years instead of 7 years (and they will not all be fully autonomous; most will still require driver input).
 
  • #190
jack action said:
You are missing his point. If the self driving cars are above average but below the best of human drivers, why would the best drivers want to pay extra to be driven by a self driving car?

Then, once everybody realizes that, guess how many people will think they are among the best drivers ...

Yes, everyone thinks they are above average except me. It is her point, by the way. Once they are better than the average driver, there should be no problem with them being on the road. I will offer you the same bet I made to the Dr.: in 7 years, self-driving cars will have a better driving record as a whole than humans. Take the bet?
 
  • Like
Likes russ_watters
  • #191
Yes and no, but I voted no.

If the cars are in bracketed lanes (like a train on a track), where they have to follow a certain path, then I'd feel safer with that.

But if the car has to make decisions in an "open" driving environment, then I wouldn't feel safe. The cars would need artificial vision (an area my friend is doing his Ph.D. work on), and there are so many kinks that have to be worked out for that to be safe. I don't believe A.I. vision could adequately identify all threats and non-threats and make correct decisions.
 
  • Like
Likes symbolipoint
  • #192
jack action said:
Not to burst your bubble about the future popularity of autonomous cars, but:

I agreed that the adoption will likely be slow for regular passenger cars. I wrote,

"That is a hard one and I think I will agree with your assessment above, less than 10 percent adoption of self driving cars in 7 years."
 
  • #193
kyphysics said:
there are so many kinks that have to be worked out for that to be safe

It helps that, worldwide, thousands (or is it tens of thousands?) of engineers are working on the problem of making self-driving cars safe.

A Google search says I am probably way off:

"Moore, who previously spent eight years at Google and ran the company's Pittsburgh office, estimates that there are 1,000 to 2,000 people in the city working on autonomous driving." And that is just Pittsburgh.

From, https://www.cnbc.com/2017/09/16/pit...s-200000-pay-packages-for-robotics-grads.html

From, https://www.google.com/search?q=how...ome..69i57.13855j0j8&sourceid=chrome&ie=UTF-8
 
  • #194
At the present time, since there are still very few self-driving cars (SDCs), not enough experience has accumulated for currently hidden dangerous bugs to be found and fixed. Therefore, I feel very slightly less safe. As the number of SDCs grows, I think it will become more dangerous for a while, until many of the hidden dangerous bugs are found and fixed. Then it will gradually become safer, and eventually it will be safer than driving is today.
 
  • #195
I got my driver's license, but I didn't dare drive on the highway, especially when the traffic was heavy.
 
  • #196
Qamerash said:
I got my driver's license, but I didn't dare drive on the highway, especially when the traffic was heavy.

Off topic: you are driving a car and you come to a stop at a T-intersection in the U.S.A., where we drive on the right-hand side of the road. You want to make a left turn. You look left and right to make sure no traffic is coming. Your last look for oncoming traffic before entering the intersection should be to the

1, left
2, right?

Why?
 
Last edited:
  • Like
Likes Stavros Kiri
  • #197
Spinnor said:
Off topic, you are driving a car and you come to a stop at a T-intersection in the U.S.A. where we drive on the righthand side of the road. You want to make a left turn. You look left and right to make sure no traffic is coming. Your last look for oncoming traffic before entering the intersection should be to the,

1, left
2, right?

Why?
Left [again], to minimize time ... (and avoid a possible accident).
(Sometimes I first look right, then left. Or three times [left (to make the initial decision), right, left (again, for the final decision)].)
Plus, after having entered the intersection and safely passed the left side, check the right side again one last time, to avoid high-speed incomers ...

A machine would have sensors covering both directions at the same time, I assume, for better decision making and to minimize time, helping avoid accidents at intersections, etc.
 
Last edited:
  • #198
Qamerash said:
I got my driver's license, but I didn't dare drive on the highway, especially when the traffic was heavy.
Welcome to PF and to the driving world.
I would try to take the "risk" of the highway ... before considering, once and for all, settling for a self-driving car. Highways are not that bad if you have a safe vehicle, and they tend to be necessary ... And in any case, improving your driving is necessary too.
 
  • #199
Stavros Kiri said:
(Sometimes, I first look right, then left. Or 3 times [left (to make initial decision), right, left (again, for final decision)].)
+ after have entered the intersection and have passed safely left side, check right side again for last time, to avoid high speed incomers ...

My head swivels a bit as well! Right, left, right, left, ...

"Look left last and live" is my motto, well, kind of. A driver-side impact by another car puts your body at the most risk. Now, if you have a passenger you have to recalculate, but I still think it makes sense to look left last. When my wife is in the car I always ask for help: while I take my last look left, she updates me on whether I can go, "good? ...go, go". There are caveats to the "look left last and live" rule: if, for example, you can see very far to the left, but to the right the road curves out of view and may hide oncoming traffic, then you would want to look right last.
 
  • Like
Likes Stavros Kiri
  • #200
Spinnor said:
My head swivels a bit as well! Right, left, right, left,...

Look left last and live is my motto, well kind of. A driver side impact by another car will put your body at most risk. Now if you have a passenger you have to recalculate but I still think it makes sense to look left last. When my wife is in the car I always ask for help, While I make my last look left she is updating me if I can go, "good?, ...go, go". There are caveats to "Look left last and live" rule, if for example you can see very far to the left but to the right the road curves out of view and so may hide oncoming traffic then you would want to look right last.
You should logically look left last every time before entering, as the left lane is the first lane you will enter. Once in the left lane, take a last look at the right lane before entering that lane. If there is no traffic in the left lane, it is easy to stop safely for any unexpected car coming in the right lane. If there is traffic in both lanes once you are committed (mistakes happen), you usually should step on the throttle to get out of the way as fast as possible. People coming at you can obviously see you and slow down. If you stop in the middle of the road, people have to stop (as opposed to slow down), and if they don't hit you, they might be rear-ended. To do this kind of emergency maneuver, you need to have confidence in other drivers (not assuming all other drivers are incompetent is a good start); otherwise you might freeze and do what you shouldn't.

In short, your last look is at the traffic coming in the lane you are about to cross, just before entering that lane.
 
Last edited:
  • Like
Likes Stavros Kiri
Back
Top