Do you feel safer with self-driving cars on the road?

SUMMARY

The forum discussion centers on the safety perceptions of self-driving cars. Participants express skepticism about the current capabilities of AI in driving, emphasizing that human drivers often fail to anticipate complex situations, a skill that AI has yet to master. While some argue that self-driving cars could statistically reduce accidents, others highlight the limitations of current technology and the unpredictability of human behavior. The consensus leans towards a cautious optimism for future advancements, particularly with the expectation that widespread adoption will improve safety over time.

PREREQUISITES
  • Understanding of AI programming and its limitations in real-world scenarios.
  • Familiarity with the statistical safety records of self-driving cars versus human drivers.
  • Knowledge of human behavioral patterns in driving contexts.
  • Awareness of the technological advancements in self-driving systems and their current limitations.
NEXT STEPS
  • Research the latest advancements in AI algorithms for autonomous driving.
  • Explore statistical analyses comparing accident rates of self-driving cars and human drivers.
  • Investigate the impact of human behavior on driving safety and accident prevention.
  • Learn about the regulatory landscape affecting the deployment of self-driving vehicles.
USEFUL FOR

Individuals interested in automotive technology, AI developers, safety regulators, and anyone involved in the future of transportation and autonomous vehicle systems.

Do you feel safer with self-driving cars on the road?

  • Yes: 31 votes (41.3%)
  • No: 37 votes (49.3%)
  • No opinion: 7 votes (9.3%)

Total voters: 75
  • #241
FactChecker said:
I should have known that Canada would take the issue of self-driving cars on snow and ice very seriously:

Not much there to consider as to whether self-driving cars can function in winter.
I think they have a great deal more testing to do before the confidence level is impressive.
A 3D map? Complete updates minute to minute, hour to hour, day to day, month to month; the landscape can change (a rough sketch of such a staleness check follows at the end of this post).
The landscape maps most likely do not have mountains of pushed and dumped snow 15 feet high either.
Or the 2-to-4-foot-high plowed bank by the side of the road.
Does, or will, a self-driving car know how to rock itself out of a parking spot after sitting there overnight? The ones without a steering wheel and pedals may have a problem. Just yesterday ...
Will it have to clean the snow off the hood, headlights, and tail lights, or will that be the passengers' responsibility?
Windshield washer fluid check.
Wiper blade freeze-up. It may not need these two, but passengers do like to gawk at all the other drivers.
I am sure (?) that the shovelling and pushing will be a thing of the past after a foot of snowfall with a self-driving car.
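On the 3D-map point above, here is a rough, purely hypothetical sketch of what a map-staleness check might look like (the function name, grid, and threshold are invented for illustration; this is not code from any real autonomy stack). It compares a stored height map against a live scan on the same grid and flags cells that no longer match, e.g. where a plowed snowbank now sits on what the map calls flat shoulder.

```python
import numpy as np

def flag_stale_cells(stored_heights, live_heights, threshold_m=0.5):
    """Flag grid cells where the live scan disagrees with the stored 3D map.

    Both arguments are 2D arrays of surface heights (metres) on the same
    grid. Cells whose height differs by more than `threshold_m` (say, a
    1.2 m plowed snowbank where the map says 'flat shoulder') are marked
    as stale and should be treated as obstacles, not trusted map data.
    """
    return np.abs(live_heights - stored_heights) > threshold_m

# Toy example: a flat mapped shoulder vs. a live scan with a snowbank.
stored = np.zeros((4, 4))        # the map: flat, drivable shoulder
live = stored.copy()
live[:, 3] = 1.2                 # live scan: 1.2 m snowbank along one edge
changed = flag_stale_cells(stored, live)
print(changed.sum(), "of", changed.size, "cells no longer match the map")
```

The hard part, of course, is not the comparison but keeping the stored map current in the first place, which is exactly the objection raised above.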
 
  • #242
Will your self-driving car decide to kill you if its algorithms are forced to choose between, say, driving off a cliff or into a crowd of people?
 
  • #243
BWV said:
Will your self-driving car decide to kill you if its algorithms are forced to choose between, say, driving off a cliff or into a crowd of people?
Depends on the program
 
  • #244
Stavros Kiri said:
Depends on the program
The program called "Thelma & Louise" is especially bad that way.
 
Likes: Stavros Kiri
  • #245
BWV said:
Will your self-driving car decide to kill you if its algorithms are forced to choose between, say, driving off a cliff or into a crowd of people?
The best way to answer this question is answering this one:

While driving a car, would you decide to kill yourself if you were forced to choose between, say, driving off a cliff or into a crowd of people?
 
Likes: Stavros Kiri
  • #246
jack action said:
The best way to answer this question is answering this one:

While driving a car, would you decide to kill yourself if you were forced to choose between, say, driving off a cliff or into a crowd of people?

The law does not expect you to sacrifice yourself to save others, but it might require your car to

http://science.sciencemag.org/content/352/6293/1573
 
Likes: Stavros Kiri
  • #247
FactChecker said:
The program called "Thelma & Louise" is especially bad that way.
Here is another question, one that might challenge the option of a manual override that I spoke in favour of earlier:
Will a human be allowed to manually override an autonomous vehicle to "their death"?

[As a first response, I can't see why not. (Suicide may generally be immoral, but it is not illegal ...)
But what about ... to other people's death?]
 
  • #248
BWV said:
The law does not expect you to sacrifice yourself to save others, but it might require your car to

http://science.sciencemag.org/content/352/6293/1573
I don't think it will ever happen. The machine is there to make the decision for the driver (now only a passenger); thus, if a human driver's reaction is deemed acceptable, the same decision made by an AI 'working' for the passenger should be acceptable too.

It would be terrible if human lives were just reduced to probabilities and statistics, because then humans would just become livestock. And that is why (from the abstract of your link) no one wants to be in a driverless vehicle that does not have its passengers as its number one priority.

Imagine putting your child on a school bus. Could you accept the bus driver sacrificing your child because he might save other (more valuable) people? Now replace the school bus and its driver with a driverless bus. The moral dilemma fades quickly.
 
  • #250
jack action said:
I don't think it will ever happen. The machine is there to make the decision for the driver (now only a passenger); thus, if a human driver's reaction is deemed acceptable, the same decision made by an AI 'working' for the passenger should be acceptable too.

It would be terrible if human lives were just reduced to probabilities and statistics, because then humans would just become livestock. And that is why (from the abstract of your link) no one wants to be in a driverless vehicle that does not have its passengers as its number one priority.

Imagine putting your child on a school bus. Could you accept the bus driver sacrificing your child because he might save other (more valuable) people? Now replace the school bus and its driver with a driverless bus. The moral dilemma fades quickly.
The decision would be the driver's choice, and it is a matter of how a decision made in seconds plays out.

No driver can do the necessary calculations in the short time allocated, and neither could a self-driving car.
Otherwise, there would be enough time to avoid or stop and harm no one.

Then again, how often do these scenarios ever play out anyway?
But given the chance that they could, most drivers would probably try to avoid bus shelters, babies in carriages, bicycles, driving off a cliff, wedding parties, ramming into a building, or whatever. Accidents happen in such a split second that there is just not enough time to second-guess maneuvers. In the end, the casualties may be the occupants of the car, or bystanders, some of both, or neither.

At one time I used to think, ah, that's a moral dilemma to sort out, how to make the program make a moral decision or put a cost on a life, but not so much any more. If it could be done, i.e. if the program were given the responsibility of making moral decisions, the question then becomes "Whose morals?" It quickly becomes a quagmire.
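To make the "Whose morals?" point concrete, here is a deliberately crude and entirely hypothetical sketch (the maneuvers, probabilities, and weights are made up for illustration; no real planner is written this way). If harm is scored with a weighted cost function, then the ethics live entirely in the weights, and somebody, the manufacturer, a regulator, or the owner, has to pick them.

```python
# Hypothetical cost function: the 'morals' are just the weights.
# Nothing here reflects a real autonomous-driving planner.

def expected_cost(option, w_occupant, w_bystander):
    """Score a candidate maneuver by weighted expected harm (lower is better)."""
    return (w_occupant * option["p_occupant_harm"]
            + w_bystander * option["p_bystander_harm"])

options = [
    {"name": "swerve off road", "p_occupant_harm": 0.7, "p_bystander_harm": 0.0},
    {"name": "brake straight",  "p_occupant_harm": 0.2, "p_bystander_harm": 0.4},
]

# Three different 'moral' settings, three possibly different choices.
for w_occ, w_by in [(1.0, 1.0), (2.0, 1.0), (1.0, 5.0)]:
    best = min(options, key=lambda o: expected_cost(o, w_occ, w_by))
    print(f"weights occupant={w_occ}, bystander={w_by} -> choose: {best['name']}")
```

The same harm probabilities produce different "right" answers depending on who chose the weights, which is exactly the quagmire described above.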

The best way to sort it out, as is done now with human drivers, is through the legal system and payouts, if anyone is ever found negligent and/or responsible for the cause of the accident. Lightweight 'black boxes' could become the norm for cars equipped for self-driving, as a means to provide evidence.
 
  • #251
256bits said:
No driver can do the necessary calculations in the short time allocated, and neither could a self-driving car.
That is why there is always a "fail-safe" maneuver, which, in the case of humans, is to do whatever will most likely protect themselves. It is hardwired, a reflex. If you use AI, you will have to program one of these maneuvers: "In the event I can't decide what to do, what do I do?"
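A minimal, hypothetical sketch of what "program one of these maneuvers" could mean in practice (the names, time budget, and scoring are invented; this is not any real AV framework): the planner gets a hard time budget, and if nothing acceptable has been found when the budget runs out, it falls back to a fixed default, the programmed equivalent of the hardwired reflex.

```python
import time

DEFAULT_MANEUVER = "brake hard, hold lane"   # the pre-programmed 'reflex'

def choose_maneuver(candidate_generator, time_budget_s=0.05):
    """Pick the best maneuver found within the time budget.

    `candidate_generator` yields (score, maneuver) pairs, lower score is
    better. If the budget expires before anything turns up, return the
    hardwired fail-safe instead of continuing to deliberate.
    """
    deadline = time.monotonic() + time_budget_s
    best_score, best = float("inf"), None
    for score, maneuver in candidate_generator:
        if score < best_score:
            best_score, best = score, maneuver
        if time.monotonic() >= deadline:
            break
    return best if best is not None else DEFAULT_MANEUVER

# Toy usage: a generator that never produces a candidate in time.
print(choose_maneuver(iter([])))   # -> 'brake hard, hold lane'
```

The point of such a design is that the fallback is chosen ahead of time, calmly, rather than computed in the half second before impact.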

For my part, when I face a moral dilemma, I can always count on The Simpsons to show me the way. And here is how The Simpsons believe AI should react in extreme cases:



Save yourself! :biggrin:
 
Likes: 256bits, FactChecker and nsaspook
  • #252
Another issue with driving in snow is the problem of staying in the correct lane. That is being worked on (see ).

But this approach requires that the lane signature be mapped ahead of time and be available to the car. That doesn't seem very realistic to me. I think it would be easier if something were embedded in the pavement to indicate the lane.
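For a rough sense of how matching against a pre-mapped "lane signature" can work, here is a toy, entirely hypothetical sketch (the signatures are random numbers standing in for whatever profile a real sensor would record, e.g. a subsurface radar or reflectivity trace). The mapping run stores a signature for a few lateral positions in the lane, and the car later picks the stored position whose signature best correlates with its live reading.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend the mapping run stored a distinct 1D signature along the road
# for a few lateral positions in the lane.
stored = {
    "left of centre":  rng.normal(size=200),
    "on centre":       rng.normal(size=200),
    "right of centre": rng.normal(size=200),
}

def estimate_lateral_position(live_signal, stored_profiles):
    """Return the stored lateral position whose signature best matches the
    live measurement (normalised correlation, higher is better)."""
    def corr(a, b):
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.dot(a, b) / len(a))
    return max(stored_profiles, key=lambda k: corr(live_signal, stored_profiles[k]))

# Live reading: the 'on centre' signature plus sensor noise.
live = stored["on centre"] + 0.3 * rng.normal(size=200)
print(estimate_lateral_position(live, stored))   # expected: 'on centre'
```

This only works if the signature was mapped beforehand and has not changed, which is the objection above; markers embedded in the pavement would trade that mapping requirement for infrastructure cost.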
 
