Anyone else a bit concerned with autonomized weapons?

  • Thread starter: dipole

Discussion Overview

The discussion centers around concerns regarding the potential use and implications of autonomous weapons, particularly robotic systems equipped with firearms. Participants explore the risks associated with such technologies in densely populated areas, comparing them to existing weapon systems like drones and discussing their potential for misuse in terrorism.

Discussion Character

  • Debate/contested
  • Exploratory
  • Technical explanation

Main Points Raised

  • Some participants express concern about the potential for autonomous robots to be used in mass shootings, questioning how quickly they could operate in urban environments before being stopped by authorities.
  • Others argue that existing drone technology, which can also deliver lethal force, should be considered equally or more concerning than autonomous robots.
  • A viewpoint is raised that the autonomy of the robots presents a greater risk compared to drones, which are still controlled by humans.
  • Some participants highlight the feasibility of constructing such robots at a lower cost compared to military drones, suggesting that this could lead to wider accessibility for malicious actors.
  • There is a discussion about the effectiveness of robots versus humans in executing attacks, with some arguing that robots could be more precise and less prone to error than human shooters.
  • Concerns are raised about the implications of robots being able to operate in environments like schools or office buildings, potentially increasing the scale of violence compared to traditional weapons.
  • Participants debate the motivations of terrorists, with some questioning why they would choose to use robots instead of simpler methods like truck bombs.
  • Others counter that the aim of terrorists is to maximize casualties, which could justify the use of more sophisticated autonomous weapons.
  • There are claims that police responses to autonomous threats would differ significantly from human threats, with some arguing that police would neutralize a robot threat quickly, while others challenge this assumption.
  • Some participants draw parallels between autonomous weapons and existing autonomous devices like landmines, discussing their effectiveness and ethical implications.
  • Concerns are raised about the lack of valid counterarguments to the risks posed by autonomous weapons, suggesting that the discussion is being prematurely shut down.

Areas of Agreement / Disagreement

Participants do not reach consensus on the level of threat posed by autonomous weapons compared to existing technologies. There are competing views on the feasibility and implications of such technologies, as well as differing opinions on the motivations behind their potential use in terrorism.

Contextual Notes

Some participants express uncertainty about the practicality and sophistication of autonomous weapons being realized in the near future, while others assert that technological advancements could lead to significant developments in this area. The discussion reflects a range of assumptions about the capabilities and motivations of both potential users of such technology and law enforcement responses.

dipole
Saw this video today on facebook.



I wonder how soon it will be before one of these is fitted with a machine gun. What if one of these was built or stolen for use in terrorism? In a densely populated city, how long could one of these roam freely gunning people down before police or military could stop it?

That might seem paranoid, but in an age where human beings are already gunning people down en masse on a fairly regular basis, it seems highly plausible that if this technology were to proliferate, it will be used to accomplish the same goals, but probably with much greater effect.
 
We already have drones that shoot missiles, why does that 4 legged machine scare you more?
 
Likes: bhobba, Grands and russ_watters
Greg Bernhardt said:
We already have drones that shoot missiles, why does that 4 legged machine scare you more?

I believe the difference is that this machine is largely autonomous, drones are still under human control.
 
Robots + guns could pose a danger to society. Clearly it must be the robot part of the equation that is the greatest danger.
 
Likes: Pythagorean and BillTre
dipole said:
I wonder how soon it will be before one of these is fitted with a machine gun. What if one of these was built or stolen for use in terrorism? In a densely populated city, how long could one of these roam freely gunning people down before police or military could stop it?
What is the difference between this machine roaming freely shooting people and someone like the Parkland shooter?
 
Likes: BillTre and russ_watters
Greg Bernhardt said:
We already have drones that shoot missiles, why does that 4 legged machine scare you more?

Those drones cost hundreds of millions of dollars, require some kind of airport or large space to take off from, can't be concealed or transported...

This thing could probably be built for a few thousand dollars (at least in a couple years) and can fit in the trunk of a car. There's a huge difference here.
 
dipole said:
Those drones cost hundreds of millions of dollars, require some kind of airport or large space to take off from, can't be concealed or transported...

This thing could probably be built for a few thousand dollars (at least in a couple years) and can fit in the trunk of a car. There's a huge difference here.
I bet that costs more than a few thousand dollars, and something like that isn't even on shelves to buy. Why would someone take the time to buy it, program and arm it when they can just do the shooting themselves?
 
Likes: BillTre and russ_watters
Ygggdrasil said:
Robots + guns could pose a danger to society. Clearly it must be the robot part of the equation that is the greatest danger.

That's not the issue - don't drag this off topic. The point is that if humans+guns are a problem, then it should be obvious that robots+guns could pose an even bigger problem.
 
dipole said:
The point is that if humans+guns are a problem, then it should be obvious that robots+guns could pose an even bigger problem.
Eventually sure, but do you think this kind of sophistication and practicality is realistic in our lifetime? (outside of countries' defense budgets)
 
  • #10
Greg Bernhardt said:
I bet that costs more than a few thousand dollars, and something like that isn't even on shelves to buy. Why would someone take the time to buy it, program and arm it when they can just do the shooting themselves?

Because the aim of terrorists is to kill as many people as possible.

Your question is extremely naive... have you been following world events the past two decades or so? That's like asking "Why would terrorists learn to fly planes and crash them into buildings when they could just drive a car into a building?"

Come on... get with it.
 
  • #11
dipole said:
Because the aim of terrorists is to kill as many people as possible.
A truck bomb can kill hundreds and requires little sophistication. Why would they arm a robot dog?
 
Likes: HAYAO and BillTre
  • #12
You could argue that a land mine is a crude autonomous weapon.
 
Likes: Averagesupernova and russ_watters
  • #13
dipole said:
In a densely populated city, how long could one of these roam freely gunning people down before police or military could stop it?
Probably not very long. The police would likely just run a truck over it. The reason terrorist situations are so challenging is because there's a massive incentive to take the terrorist alive so that you can pump information out of him. With a robot, there's no such incentive. Just run it over with a truck and hand the remains to engineers for forensic info.
 
Likes: HAYAO, bhobba, BillTre and 1 other person
  • #14
A robot is mobile. A robot like that could wander the halls of an office building, school, etc., and seek people out, and could cover a much longer range than a single bomb can. A bomb, in most cases, is actually quite localized.

I could easily ask you the question of why any human at all ever went on a shooting rampage when they could have just built a bomb. The fact is shooting rampages are a real phenomenon, and in a few years robots like this could probably do them better than any human.

A human can be stopped with a single bullet; a robot like that could be armoured to render it almost immune to small arms fire. A human's aim is imprecise and affected by stress and fear. A robot can aim perfectly every time, putting nearly every bullet it fires into the head of a human.

I'm not saying this is the end of the world, but I think it's worth considering where things may be going.
 
  • #15
TeethWhitener said:
Probably not very long. The police would likely just run a truck over it. The reason terrorist situations are so challenging is because there's a massive incentive to take the terrorist alive so that you can pump information out of him. With a robot, there's no such incentive. Just run it over with a truck and hand the remains to engineers for forensic info.

Are you kidding me... don't post false information you can't back up. You're claiming that if a terrorist is killing people, police will just let him be until they can figure out a way to take him alive?

The police try to end situations as quickly as possible. Often they can't do that without putting their own lives at risk, which is what prolongs a situation. You assume a robot like this will stupidly stand in the middle of the road while a truck of police officers drives towards a live machine gun. How are police going to drive a truck into it if it's moving between buildings? What police officer is willingly going to drive towards a firing machine gun, bulletproof glass or not?
 
  • #16
dipole said:
You're claiming that if a terrorist is killing people, police will just let him be until they can figure out a way to take him alive?
No, police attempt to neutralize the threat as quickly as possible. But there is a massive incentive to capture these guys alive.

dipole said:
What police officer is willingly going to to drive towards a firing machine gun, bullet proof glass or not?
In a world...where we have robot shooter dogs but not remote-controlled cars...
 
  • #17
dipole said:
I could easily ask you the question of why any human at all ever went on a shooting rampage when they could have just built a bomb.
At least in the US, guns are much easier to obtain than even the most rudimentary explosive chemicals.
 
Likes: bhobba, Stephen Tashi, BillTre and 1 other person
  • #18
CWatters said:
You could argue that a land mine is a crude autonomous weapon.

Yes, and landmines kill thousands of people every year and are extremely effective at their intended use - area denial.

What is your point?
 
  • #19
TeethWhitener said:
At least in the US, guns are much easier to obtain than even the most rudimentary explosive chemicals.

True, so the guns already exist... that video shows two working examples of robots that already exist; it's only one small step to put them together.

Despite how quickly everyone has been to try and shut down any discussion of this topic, no one has made a single valid point that would suggest this technology doesn't come with some special concerns.
 
  • #20
dipole said:
Despite how quickly everyone has been to try and shut down any discussion of this topic, no one has made a single valid point that would suggest this technology doesn't come with some special concerns.

Some moderate disagreement does not equal shutting down discussion. Please reel in your hostility.
 
Likes: Mark44
  • #21
Greg Bernhardt said:
Eventually sure, but do you think this kind of sophistication and practicality is realistic in our lifetime? (outside of countries' defense budgets)

There are a lot of issues here lurking under "defense budgets". Suppose "separatists" in an Eastern European country got ahold of this technology during civil strife. (Whether separatists did this or it was a front for a nearby larger neighbor I'll leave to you.) I could be referring to something ongoing right now, or I could be referring to the Balkans in the 90s or something else...

Another issue: suppose you have a wobbly dictatorship and civil strife. Some dictators clamp down on protesters by ordering the army to shoot them down. Sometimes this works, sometimes it doesn't -- a lot of soldiers have qualms about slaughtering their own people. Machines (Terminators?) have no such qualms and simply execute orders. There are similar issues during coups -- failed and successful ones.

As usual, machines allow you to scale things in a way that humans don't. This should be very spooky stuff.
 
Last edited:
Likes: Monsterboy, AlexCaledin and Greg Bernhardt
  • #22
Well, it's a little frustrating when a simple suggestion, which is founded on a set of factual premises (1. guns exist, 2. people use guns to kill other people en masse all the time, 3. people are building autonomous robots which can navigate complex environments very well, 4. such technology will likely be easy to scale), is met with a flurry of poorly thought out responses and rebuttals which could easily be disregarded with a few moments' consideration...

It's not like I'm the only person to ever suggest this idea. Plenty of scholars, including people many on this forum would claim to idolize, have raised the same issue.
 
  • #23
I think it's important that we think carefully about what we ought to be concerned with. When @dipole raises the issue of an autonomous robot that can move about freely and worries what happens when a gun is attached, it's worth noting that such a robot will not necessarily have much utility in modern warfare, and terrorists are unlikely (at least in the immediate future) to obtain one, given the cost involved in building or acquiring such technology.

What we should be more concerned with are developments in sophisticated AI systems that are capable of selecting and engaging militarily with targets without human intervention -- so-called lethal autonomous weapons systems (LAWS). See the following links where Berkeley computer science professor Stuart Russell discusses the risks and ethical concerns of such LAWS:

http://news.berkeley.edu/2015/05/28/automated-killing-machines/

https://www.nature.com/news/robotic...intelligence-1.17611?WT.ec_id=NATURE-20150528
 
Likes: BillTre
  • #24
Sentry robots are already available with autonomous capability, or semi-autonomous depending upon where the "human" is placed within the loop.
https://en.wikipedia.org/wiki/Samsung_SGR-A1

Next step, as @dipole has suggested, is to provide mobility, as in post #1, along with the sentry technology, tweak it, and it can become a more aggressive component of a military, para-military, or heaven forbid, "a subversive organization" - a promotion from simple defensive capabilities.
Presently, though, a merger to produce a killer robot has shortcomings: there is no way to specify a particular target, just a random selection of targets within range, which may or may not have the desired effect. Give it time.
 
  • #25
dipole said:
Yes, and landmines kill thousands of people every year and are extremely effective at their intended use - denial of area.

What is your point?

My point is that autonomous weapons effectively exist already. It's just a matter of degree.

Incidentally, a lot of the work being done for self-driving cars will be directly applicable to autonomous weapons, for example terrain navigation and people detection and avoidance. We're also already discussing some of the moral decisions that self-driving cars may need to make. So I disagree with the view that autonomous weapons are a long way off.
 
Likes: Monsterboy and russ_watters
  • #26
The Future of Life Institute and Stuart Russell, professor of Computer Science at Berkeley, produced a video on the possibilities of autonomous weapons, in this case microbot swarms.

 
Likes: StatGuy2000 and russ_watters
  • #27
Borek said:
I believe the difference is that this machine is largely autonomous, drones are still under human control.
I agree that's a difference, but why does it make the autonomous one worse?
 
  • #28
russ_watters said:
I agree that's a difference, but why does it make the autonomous one worse?
I suppose only once AI surpasses human intelligence. Essentially a Terminator.
 
Likes: russ_watters
  • #29
dipole said:
Those drones cost hundreds of millions of dollars, require some kind of airport or large space to take off from, can't be concealed or transported...
Please be more reasonable. The *most* expensive military drones cost a few tens of millions, not hundreds of millions (Google tells me a Reaper costs $16 million). Serious attacks by amateurs are already happening and probably cost only a few thousand dollars:
https://globalriskinsights.com/2018/01/swarm-drone-attack-syria-uav/

To me, this is a much more serious threat.
dipole said:
This thing could probably be built for a few thousand dollars (at least in a couple years) and can fit in the trunk of a car. There's a huge difference here.
Yes, there is a huge difference and you're on the wrong side of it! To be frank, I think you're losing sight of just how difficult a time we're having making terrestrial robots. We are *not* close to Terminator style robots. We *already have* low-cost, autonomous drone attacks.
dipole said:
True, so the guns already exist... that video shows two working examples of robots that already exist; it's only one small step to put them together.
Ahem. A working robot that took 15 seconds, with some struggle, to open a door. It is far from ready to have a gun attached and become a killing machine. If you're running from it, you'd be out the next door before it finishes opening that one!

And people are already shooting guns from drones too:

 
Last edited:
  • #30
Greg Bernhardt said:
I suppose only once AI surpasses human intelligence

Why does it have to exceed human intelligence?
 
