Anyone else a bit concerned with autonomized weapons?

AI Thread Summary
Concerns about autonomous weapons are rising, particularly regarding their potential use in terrorist attacks and mass shootings in urban areas. The discussion highlights the difference between human-controlled drones and autonomous robots, emphasizing that the latter could operate with greater efficiency and lethality. Participants express fears about the ease of manufacturing such robots, which could be concealed and deployed more easily than traditional military drones. The conversation also touches on the ethical implications of lethal autonomous weapons systems (LAWS) and their potential to escalate violence without human oversight. Overall, the thread underscores the urgent need to consider the implications of advancing robotic and AI technologies in warfare and public safety.
dipole
Saw this video today on Facebook.



I wonder how soon it will be before one of these is fitted with a machine gun. What if one of these was built or stolen for use in terrorism? In a densely populated city, how long could one of these roam freely gunning people down before police or military could stop it?

That might seem paranoid, but in an age where human beings are already gunning people down en masse on a fairly regular basis, it seems highly plausible that if this technology were to proliferate, people will use machines to accomplish the same goals, probably to much greater effect.
 
We already have drones that shoot missiles, why does that 4 legged machine scare you more?
 
Greg Bernhardt said:
We already have drones that shoot missiles, why does that 4 legged machine scare you more?

I believe the difference is that this machine is largely autonomous, while drones are still under human control.
 
Robots + guns could pose a danger to society. Clearly it must be the robot part of the equation that is the greatest danger.
 
dipole said:
I wonder how soon it will be before one of these is fitted with a machine gun. What if one of these was built or stolen for use in terrorism? In a densely populated city, how long could one of these roam freely gunning people down before police or military could stop it?
What is the difference between this machine roaming freely shooting people and someone like the Parkland shooter?
 
Greg Bernhardt said:
We already have drones that shoot missiles, why does that 4 legged machine scare you more?

Those drones cost hundreds of millions of dollars, require some kind of airport or large space to take off from, can't be concealed or transported...

This thing could probably be built for a few thousand dollars (at least in a couple of years) and can fit in the trunk of a car. There's a huge difference here.
 
dipole said:
Those drones cost hundreds of millions of dollars, require some kind of airport or large space to take off from, can't be concealed or transported...

This thing could probably be built for a few thousand dollars (at least in a couple of years) and can fit in the trunk of a car. There's a huge difference here.
I bet that costs more than a few thousand dollars, and something like that isn't even on shelves to buy. Why would someone take the time to buy it, program it, and arm it when they can just do the shooting themselves?
 
Ygggdrasil said:
Robots + guns could pose a danger to society. Clearly it must be the robot part of the equation that is the greatest danger.

That's not the issue - don't drag this off topic. The point is that if humans+guns are a problem, then it should be obvious that robots+guns could pose an even bigger problem.
 
dipole said:
The point is that if humans+guns are a problem, then it should be obvious that robots+guns could pose an even bigger problem.
Eventually sure, but do you think this kind of sophistication and practicality is realistic in our lifetime? (outside of countries' defense budgets)
 
  • #10
Greg Bernhardt said:
I bet that costs more than a few thousand dollars, and something like that isn't even on shelves to buy. Why would someone take the time to buy it, program it, and arm it when they can just do the shooting themselves?

Because the aim of terrorists is to kill as many people as possible.

Your question is extremely naive... have you been following world events the past two decades or so? That's like asking "Why would terrorists learn to fly planes and crash them into buildings when they could just drive a car into a building?"

Come on... get with it.
 
  • #11
dipole said:
Because the aim of terrorists is to kill as many people as possible.
A truck bomb can kill hundreds and requires little sophistication. Why would they arm a robot dog?
 
  • #12
You could argue that a land mine is a crude autonomous weapon.
 
  • #13
dipole said:
In a densely populated city, how long could one of these roam freely gunning people down before police or military could stop it?
Probably not very long. The police would likely just run a truck over it. The reason terrorist situations are so challenging is because there's a massive incentive to take the terrorist alive so that you can pump information out of him. With a robot, there's no such incentive. Just run it over with a truck and hand the remains to engineers for forensic info.
 
  • #14
A robot is mobile. A robot like that could wander the halls of an office building, school etc. and seek people out, and could cover a much longer range than a single bomb can. A bomb, in most cases, is actually quite localized.

I could easily ask you the question of why any human at all ever went on a shooting rampage when they could have just built a bomb. The fact is shooting rampages are a real phenomenon, and in a few years robots like this could probably do them better than any human.

A human can be stopped with a single bullet; a robot like that could be armoured to render it almost immune to small arms fire. A human's aim is imprecise and affected by stress and fear. A robot can aim perfectly every time, putting nearly every bullet it fires into the head of a human.

I'm not saying this is the end of the world, but I think it's worth considering where things may be going.
 
  • #15
TeethWhitener said:
Probably not very long. The police would likely just run a truck over it. The reason terrorist situations are so challenging is because there's a massive incentive to take the terrorist alive so that you can pump information out of him. With a robot, there's no such incentive. Just run it over with a truck and hand the remains to engineers for forensic info.

Are you kidding me... don't post false information you can't back up. You're claiming that if a terrorist is killing people, police will just let him be until they can figure out a way to take him alive?

The police try to end situations as quickly as possible. Often they can't do that without putting their own lives at risk, which is what prolongs a situation. You assume a robot like this will stupidly stand in the middle of the road while a truck of police officers drives towards a live machine gun. How are police going to drive a truck into it if it's moving between buildings? What police officer is willingly going to drive towards a firing machine gun, bulletproof glass or not?
 
  • #16
dipole said:
You're claiming that if a terrorist is killing people, police will just let him be until they can figure out a way to take him alive?
No, police attempt to neutralize the threat as quickly as possible. But there is a massive incentive to capture these guys alive.

dipole said:
What police officer is willingly going to drive towards a firing machine gun, bulletproof glass or not?
In a world...where we have robot shooter dogs but not remote-controlled cars...
 
  • #17
dipole said:
I could easily ask you the question of why any human at all ever went on a shooting rampage when they could have just built a bomb.
At least in the US, guns are much easier to obtain than even the most rudimentary explosive chemicals.
 
  • #18
CWatters said:
You could argue that a land mine is a crude autonomous weapon.

Yes, and landmines kill thousands of people every year and are extremely effective at their intended use - denial of area.

What is your point?
 
  • #19
TeethWhitener said:
At least in the US, guns are much easier to obtain than even the most rudimentary explosive chemicals.

True, so the guns already exist... that video shows two working examples of robots that already exist; it's only one small step to put them together.

Despite how quickly everyone has been trying to shut down any discussion of this topic, no one has made a single valid point suggesting this technology doesn't come with some special concerns.
 
  • #20
dipole said:
Despite how quickly everyone has been trying to shut down any discussion of this topic, no one has made a single valid point suggesting this technology doesn't come with some special concerns.

Some moderate disagreement does not equal shutting down discussion. Please reel in your hostility.
 
  • #21
Greg Bernhardt said:
Eventually sure, but do you think this kind of sophistication and practicality is realistic in our lifetime? (outside of countries' defense budgets)

There are a lot of issues lurking under "defense budgets". Suppose "separatists" in an Eastern European country got hold of this technology during civil strife. (Whether the separatists did this themselves or were a front for a larger nearby neighbor, I'll leave to you.) I could be referring to something ongoing right now, or to the Balkans in the 90s, or something else...

Another issue: suppose you have a wobbly dictatorship and civil strife. Some dictators clamp down on protesters by ordering the army to shoot them. Sometimes this works, sometimes it doesn't -- a lot of soldiers have qualms about slaughtering their own people. Machines (Terminators?) have no such qualms and simply execute orders. There are similar issues during coups -- failed and successful ones.

As usual, machines let you scale things in a way that humans don't. This should be very spooky stuff.
 
  • #22
Well, it's a little frustrating when a simple suggestion, founded on a set of factual premises (1. guns exist, 2. people use guns to kill other people en masse all the time, 3. people are building autonomous robots which can navigate complex environments very well, 4. such technology will likely be easy to scale), is met with a flurry of poorly thought out responses and rebuttals which could easily be disregarded with a few moments' consideration...

It's not like I'm the only person to ever suggest this idea. Plenty of scholars, including people many on this forum would claim to idolize, have raised the same issue.
 
  • #23
I think it's important that we think carefully about what we ought to be concerned with. @dipole raises the issue of an autonomous robot that can move about freely and worries about what happens when a gun is attached -- but such a robot will not necessarily have much utility in modern warfare, and terrorists are unlikely (at least in the immediate future) to obtain one, given the cost involved in building and/or acquiring such technologies.

What we should be more concerned with are developments in sophisticated AI systems that are capable of selecting and engaging militarily with targets without human intervention -- so-called lethal autonomous weapons systems (LAWS). See the following links, where Berkeley computer science professor Stuart Russell discusses the risks and ethical concerns of such LAWS:

http://news.berkeley.edu/2015/05/28/automated-killing-machines/

https://www.nature.com/news/robotic...intelligence-1.17611?WT.ec_id=NATURE-20150528
 
  • #24
Sentry robots are already available with autonomous capability, or semi-autonomous depending upon where the "human" is placed within the loop.
https://en.wikipedia.org/wiki/Samsung_SGR-A1

The next step, as @dipole has suggested, is to provide mobility, as in post #1, along with the sentry technology; tweak it, and it can become a more aggressive component of a military, a para-military, or, heaven forbid, "a subversive organization" -- a promotion from simple defensive capabilities.
Presently, though, a merger to produce a killer robot has shortcomings, as there is no way to specify a particular target, just a random selection of targets within range, which may or may not have the desired effect. Give it time.
 
  • #25
dipole said:
Yes, and landmines kill thousands of people every year and are extremely effective at their intended use - denial of area.

What is your point?

My point is that autonomous weapons effectively exist already. It's just a matter of degree.

Incidentally, a lot of the work being done for self-driving cars will be directly applicable to autonomous weapons -- for example, terrain navigation and people detection and avoidance. We're also already discussing some of the moral issues that self-driving cars may need to address. So I disagree with the view that autonomous weapons are a long way off.
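
To make that last point concrete: "people detection", at least, is already commodity software. Here is a minimal sketch using OpenCV's stock HOG pedestrian detector -- a decade-old baseline, nothing weapons-specific; it assumes only that OpenCV is installed, and "street.jpg" is a placeholder for any photo you supply:

```python
# Minimal person-detection sketch using OpenCV's built-in HOG pedestrian
# detector. Purely illustrative; "street.jpg" is a hypothetical input image.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

img = cv2.imread("street.jpg")  # any photo containing pedestrians
boxes, weights = hog.detectMultiScale(img, winStride=(8, 8), scale=1.05)

# Each box is (x, y, width, height); weights are detection scores.
for (x, y, w, h), score in zip(boxes, weights.ravel()):
    print(f"person at ({x}, {y}), size {w}x{h}, score {score:.2f}")
```

Modern neural-network detectors do far better than this old HOG baseline, which only reinforces the point: the perception half of the problem is off the shelf.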
 
  • #26
The Future of Life Institute and Stuart Russell, professor of computer science at Berkeley, produced a video on the possibilities of autonomous weapons, in this case microbot swarms.

 
  • #27
Borek said:
I believe the difference is that this machine is largely autonomous, while drones are still under human control.
I agree that's a difference, but why does it make the autonomous worse?
 
  • #28
russ_watters said:
I agree that's a difference, but why does it make the autonomous worse?
I suppose only once AI surpasses human intelligence. Essentially a Terminator.
 
  • #29
dipole said:
Those drones cost hundreds of millions of dollars, require some kind of airport or large space to take off from, can't be concealed or transported...
Please be more reasonable. The *most* expensive military drones cost a few tens of millions, not hundreds of millions (Google tells me a Reaper costs $16 million). Serious attacks by amateurs are already happening and probably cost only a few thousand dollars:
https://globalriskinsights.com/2018/01/swarm-drone-attack-syria-uav/

To me, this is a much more serious threat.
dipole said:
This thing could probably be built for a few thousand dollars (at least in a couple of years) and can fit in the trunk of a car. There's a huge difference here.
Yes, there is a huge difference and you're on the wrong side of it! To be frank, I think you're losing sight of just how difficult a time we're having making terrestrial robots. We are *not* close to Terminator-style robots. We *already have* low-cost, autonomous drone attacks.
dipole said:
True, so the guns already exist... that video shows two working examples of robots that already exist; it's only one small step to put them together.
Ahem. A working robot that took 15 seconds, with some struggle, to open a door. It is far from ready to have a gun attached and become a killing machine. If you're running from it, you'd be out the next door before it finishes opening that one!

And people are already shooting guns from drones too:

 
  • #30
Greg Bernhardt said:
I suppose only once AI surpasses human intelligence

Why does it have to exceed human intelligence?
 
  • #31
gleem said:
Why does it have to exceed human intelligence?
Then you lose the advantage of using the robot. If you are smarter, just do the shooting yourself. The Parkland shooter almost got away because of his human intelligence. I suppose it comes down to goals. If the goal is pure killing numbers, I'd still think there are better, cheaper, and less sophisticated ways of killing people en masse than using a robot.
 
  • #32
Greg Bernhardt said:
Then you lose the advantage of using the robot.

What if the robot is suicidal? It just has to get to its target.
 
  • #33
gleem said:
What if the robot is suicidal? It just has to get to its target.
If the entire plan is to just bull rush the target then I agree there is no difference and a robot is a better choice since it can be replaced.
 
  • #34
256bits said:
Sentry robots are already available with autonomous capability, or semi-autonomous depending upon where the "human" is placed within the loop.
https://en.wikipedia.org/wiki/Samsung_SGR-A1
That's a good point: the Navy has been using similar systems (Phalanx CIWS) for just short of 40 years. But you might ask yourself why the Las Vegas shooter didn't use one.
256bits said:
The next step, as @dipole has suggested, is to provide mobility...
Yes, but that's a really big next step.

I also think @dipole should put some thought into how something like this would/could actually be deployed. The SGR-A1 weighs 258 lb and doesn't include a robot. A Terminator-style mobile version would probably weigh 1000 lb...

...though you could mount one on a self-driving car. Maybe this robot dog idea came up because we had a school shooting a couple of days ago, but I really don't see the advantage of it. It's just too cumbersome and that's an inherent problem, no matter how good it is.
 
  • #35
russ_watters said:
I agree that's a difference, but why does it make the autonomous worse?

I don't think it does, but on the psychological side most people prefer to think/feel they are in control (no matter what the logic/statistics say).

OTOH, there is always Fredric Brown's http://www.roma1.infn.it/~anzel/answer.html :wink:
 
  • #36
gleem said:
The Future of Life Institute and Stuart Russell, professor of computer science at Berkeley, produced a video on the possibilities of autonomous weapons, in this case microbot swarms.


Well that's just terrifying.
 
  • #37
I posted a fictitious video in post #26 depicting the use of microbot swarms for useful(?) and nefarious purposes -- eliminating undesirable elements (my characterization for those who, for whatever reason, are identified for elimination). I call your attention to this DARPA website discussing its research project OFFSET, OFFensive Swarm-Enabled Tactics. As you can see, we are on our way to developing what was depicted in the video.
 
  • #38
256bits said:
Presently, though, a merger to produce a killer robot has shortcomings, as there is no way to specify a particular target, just a random selection of targets within range, which may or may not have the desired effect. Give it time.

The military could put a robot like that in the middle of an area it knows is swarming with enemy troops, with no hostages to kill.
 
  • #39
dipole said:
I wonder how soon it will be before one of these is fitted with a machine gun.
If the R&D of such a thing is complete, and operating it costs less than soldiers doing the same job, then it is definitely in our future. Unless countries sign some kind of treaty, like the ones dealing with chemical and biological weapons, agreeing not to combine autonomous robots and guns.
 
  • #40
russ_watters said:
Please be more reasonable. The *most* expensive military drones cost a few tens of millions, not hundreds of millions (Google tells me a Reaper costs $16 million). Serious attacks by amateurs are already happening and probably cost only a few thousand dollars:
https://globalriskinsights.com/2018/01/swarm-drone-attack-syria-uav/

To me, this is a much more serious threat.

Yes, there is a huge difference and you're on the wrong side of it! To be frank, I think you're losing sight of just how difficult a time we're having making terrestrial robots. We are *not* close to Terminator-style robots. We *already have* low-cost, autonomous drone attacks.

Ahem. A working robot that took 15 seconds, with some struggle, to open a door. It is far from ready to have a gun attached and become a killing machine. If you're running from it, you'd be out the next door before it finishes opening that one!

And people are already shooting guns from drones too:



I never said this style of robotics is somehow a worse threat than drones. Yes, current drone technology already poses a big threat. Pretty much anybody could buy a drone for a few hundred dollars, rig it with some kind of explosive, fly it kamikaze-style into a target, and probably get away with it. The main limitation with small drones is that, as far as I know, they can't support large payloads. They also can't easily penetrate buildings or navigate dense environments (one bump into a wall or tree and they essentially crash).
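
A rough back-of-the-envelope supports that payload limitation. Ideal hover power follows from momentum (actuator-disk) theory; the numbers below are purely illustrative assumptions -- a hypothetical 15 kg quadcopter (say 3 kg of airframe plus a 12 kg payload), four 0.38 m rotors giving a total disk area of about 0.45 m², and air density 1.225 kg/m³:

```latex
% Ideal hover power from momentum (actuator-disk) theory.
% All numbers are illustrative assumptions, not specs of any real drone.
\[
P_{\text{ideal}} = \frac{(mg)^{3/2}}{\sqrt{2\rho A}}
= \frac{(15 \times 9.81)^{3/2}}{\sqrt{2 \times 1.225 \times 0.45}}
\approx 1.7\ \text{kW}
\]
```

Real rotors and motors are maybe 50% efficient overall, so call it roughly 3.4 kW at the battery; a 500 Wh pack would then last under ten minutes at full load. Since hover power grows as m^{3/2}, every added kilogram of payload eats flight time disproportionately.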

This ground-based quadruped locomotion is very robust. We've all seen the videos of BD trying to kick these things over, and how easily they recover. I worry about a robot with animal-level agility that can support large payloads (ammo, armour, weaponry), and can autonomously navigate environments that normally only people can, but unlike people would require military-grade weapons to stop.

Yes, it has a ways to go before it can actually do things like run around a building, but given the progress over the past five years or so (just look at the progression on Boston Dynamic's youtube page), I don't see any reason that won't be achieved in the near future.
 
  • #41
I'm more worried about the people that have this technology.
 
  • #42
dipole said:
The main limitation with small drones is that, as far as I know, they can't support large payloads. They also can't easily penetrate buildings or navigate dense environments (one bump into a wall or tree and they essentially crash).

Small drones, less than 1 m in diameter, can carry 12 kg payloads (how much does one need?), with 30-minute flight times, speeds up to 44 mph, and collision avoidance -- all right now -- plus self-mapping of their environment with no need for GPS. How about tracking cell phones? Most buildings have windows, and drones can work in packs if necessary for complex operations. Given possible drone threats, we may not see the drone delivery that Amazon or UPS wants.

Object avoidance drone video


Maybe the technology is not yet perfect, but it is getting close, and it's only a matter of time before all these elements are put together in various mission-specific drones. They can wait in ambush until their target appears, or until the target uses their cell phone. If they know where you will be, what can stop them?

Lethal Autonomous Weapon Systems (LAWS) are considered the third weapons revolution, after gunpowder and nuclear weapons.

Monsterboy said:
Unless countries sign some kind of treaty, like the ones dealing with chemical and biological weapons, agreeing not to combine autonomous robots and guns.

Apparently there is a UN conference on Certain Conventional Weapons that is looking into the possibility of drafting a treaty to outlaw LAWS. For a review of a meeting in spring 2016, and of some then-current capabilities and programs, both government and civilian, working on autonomous drones, see the somewhat lengthy report https://www.buzzfeed.com/sarahatopo...-killer-robots?utm_term=.qrK829mxr#.queWXvKzo. And remember, this was two years ago.

Grands said:
I'm more worried about the people that have this technology.

Basically everybody: the US, China, Russia, Israel, and the UK, plus five more countries as of 2016.
 
  • #43
Greg Bernhardt said:
I suppose only once AI surpasses human intelligence. Essentially a Terminator.
I don't think so. You need a robot which is much more agile and quicker than a human over terrain, and very fast and accurate in its shooting. You don't need human-level general intelligence.
 
  • #44
gleem said:
Basically everybody: the US, China, Russia, Israel, and the UK, plus five more countries as of 2016.
I was speaking about the people who run the countries that have these weapons.
 
  • #45
I'd say most technology, when it was invented, concerned people by having the potential to be used as a weapon. So this is nothing new.
 
  • #46
Grands said:
I was speaking about the people who run the countries that have these weapons.

I worded my response poorly; I was thinking of the leadership, and their advisers, of those countries.
 
  • #47
HAYAO said:
I'd say most technology, when it was invented, concerned people by having the potential to be used as a weapon. So this is nothing new.
I would say that most of the technology was invented for military use and then adapted for civilian use.
 
  • #48
gleem said:
Lethal Autonomous Weapon Systems (LAWS) are considered the third weapons revolution, after gunpowder and nuclear weapons.
...
Apparently there is a UN conference on Certain Conventional Weapons that is looking into the possibility of drafting a treaty to outlaw LAWS. For a review of a meeting in spring 2016, and of some then-current capabilities and programs, both government and civilian, working on autonomous drones, see the somewhat lengthy report https://www.buzzfeed.com/sarahatopo...-killer-robots?utm_term=.qrK829mxr#.queWXvKzo. And remember, this was two years ago.
Admittedly I only read part of the article since it is long, but I'm not seeing the logic here. How is LAWS logically different from - and worse than - a land mine, such that it represents either a revolution or a threat worthy of being banned?

Weapons have been banned in the past largely due to excessive cruelty (chemical weapons, phosphorus), not for being good at their jobs. Or, more commonly, their use has been restricted in obvious ways: you can't use them where they put civilians at risk.

Technology in warfare over the past 50 years has, primarily, brought us one thing: war is safer. Fewer people are dying, on both sides of conflicts. Heck, I can foresee a future where we send robots to fight other robots and humans aren't even put at risk. Isn't that a good thing, not a bad thing?

Simply: why is this bad and can anyone explain why it should be banned?

Note: this question applies to war only, not terrorism.
 
  • #49
russ_watters said:
Admittedly I only read part of the article since it is long, but I'm not seeing the logic here. How is LAWS logically different from - and worse than - a land mine, such that it represents either a revolution or a threat worthy of being banned?

Weapons have been banned in the past largely due to excessive cruelty (chemical weapons, phosphorus). Or, more commonly, their use has been restricted in obvious ways: you can't use them where they put civilians at risk.

Technology in warfare over the past 50 years has, primarily, brought us one thing: war is safer. Fewer people are dying, on both sides of conflicts. Heck, I can foresee a future where we send robots to fight other robots and humans aren't even put at risk. Isn't that a good thing, not a bad thing?

Simply: why is this bad and can anyone explain why it should be banned?

Note: this question applies to war only, not terrorism.

In response to your questions above, let me address the following:

1. As to the difference between LAWS and other weapon technologies (e.g. land mines) -- from what I've read (particularly from Stuart Russell, who has written extensively on this topic), the issue is whether we as humans want questions involving life and death to be decided automatically by machines, as opposed to having humans make the ultimate decisions. If we could articulate explicitly what we want, or what outcomes we want, and thus explicitly have machines learn our desires, then there would be no problem.

The crux is: we as humans don't always know what we want, nor have we always been good at explicitly outlining what we want in a way that doesn't lead to misunderstandings or unforeseen consequences. If we transfer decisions about life and death over to machines, can we be certain that the decisions made will be in the best interest of humanity? Russell, in a number of YouTube videos (search under "Stuart Russell AI"), makes the analogy of the genie in the bottle: the genie grants three wishes, and the human wants the last wish to be the ability to undo the previous two.

2. You state that technology in warfare over the past 50 years has made war safer. But can the relative lack of conflict really be laid primarily to improvements in technology? Yes, the fear of nuclear annihilation has made some nations more reluctant to use, say, nuclear weapons, but that is hardly what one would call a positive example of technology leading to lower mortality in conflict. Also, one can argue that the international institutions built after World War II (the United Nations, including the Security Council, and the conflict resolution mechanisms built upon them), as well as the prominence of the United States as the pre-eminent superpower and "world's police", might have played a much more important role in why fewer people are dying.

3. Finally, circling back to point #1, you point to a future where robots fight other robots and humans aren't put at risk. Can we really be certain that robots can understand not to put humans at risk? After all, if a robot is programmed to neutralize any threat to the nation, why not, say, kill every single human on the opposing side? This is pertinent because, in essence, robots could potentially make life and death decisions without direct human input -- can or should we trust the robots to make such decisions for us?
 
  • #50
StatGuy2000 said:
1. As to the difference between LAWS and other weapon technologies (e.g. land mines) -- from what I've read (particularly from Stuart Russell, who has written extensively on this topic), the issue is whether we as humans want questions involving life and death to be decided automatically by machines, as opposed to having humans make the ultimate decisions. If we could articulate explicitly what we want, or what outcomes we want, and thus explicitly have machines learn our desires, then there would be no problem.
Perhaps I should have omitted the first few words of the question: I'm aware that LAWS is different from a land mine in that LAWS can weigh facts and make decisions whereas a land mine is stimulus-response only. My question is: How is this worse?

It may *feel* unsettling to think about a weapon that makes decisions, and I suppose people can pass laws/make rules based on whatever they want, but I would hope we would pass laws based on real risk and logic, not feelings alone. Otherwise, we may pass laws that make things worse instead of better.

Plainly:
A robot that makes poor decisions about what to kill is still better than a land mine that kills everything it comes in contact with.
StatGuy2000 said:
2. You state that technology in warfare over the past 50 years has made war safer. But can the relative lack of conflict really be laid primarily to improvements in technology?
You responded to something other than what I said, even after correctly paraphrasing it:

1. Safer war.
2. Less war.

Both are true, but I was referring to the first:
*Actual* wars are themselves less deadly for both sides when more advanced targeting technology is used.

The fact that wars are safer due to technology can be seen in the casualty rates of recent wars. Drilling down into specific examples, probably the biggest example is the use of precision guided bombs and cruise missiles instead of "dumb" bombs. These result in fewer bombing missions, which make war safer for the pilots and more accurate bomb strikes, which make war safer for civilians in the war zone. No longer do you have to level an entire city block to destroy one building.

Now it may also be true that there are fewer wars in part because of technology, but that isn't what I was after because it is a secondary effect.
StatGuy2000 said:
3. Finally, circling back to point #1, you point to a future where robots fight other robots and humans aren't put at risk. Can we really be certain that robots can understand not to put humans at risk? After all, if a robot is programmed to neutralize any threat to the nation, why not, say, kill every single human on the opposing side?
If there are no humans in the warzone, the robots cannot kill any humans. We're already doing our half: our drone pilots are often halfway around the world from the battles they are fighting in. There is no possibility of them being killed in the battles they are fighting. We are not far from drone vs drone combat and the next step would be robot vs robot combat.

We do already have that in some sense, with LAWS-type weapons on ships attacking other autonomous weapons such as cruise missiles. There is no reason why humans couldn't be taken off the ships (unmanned ships are at least on the drawing board), and then you have robots attacking robots, with humans directing them, at least in part, from air-conditioned offices thousands of miles away.

In terms of the laws of war, indiscriminate killing of civilians is already illegal, so it seems to me that banning LAWS-type weapons is cutting off your nose to spite your face: eliminating something better because it may be misused and turned into something worse... which is already illegal.

E.g.: landmines are restricted because the potential harm is an inherent feature of the technology. LAWS would be banned because of misuse or malfunction, even though the intended use of the technology is a societal benefit.
 
