Anyone else a bit concerned with autonomized weapons?

  • Thread starter: dipole
AI Thread Summary
Concerns about autonomous weapons are rising, particularly regarding their potential use in terrorist attacks and mass shootings in urban areas. The discussion highlights the difference between human-controlled drones and autonomous robots, emphasizing that the latter could operate with greater efficiency and lethality. Participants express fears about the ease of manufacturing such robots, which could be concealed and deployed more easily than traditional military drones. The conversation also touches on the ethical implications of lethal autonomous weapons systems (LAWS) and their potential to escalate violence without human oversight. Overall, the thread underscores the urgent need to consider the implications of advancing robotic and AI technologies in warfare and public safety.
  • #51
While we sit here debating the dangers of LAWS, the US military, at least, may be divided on near-term implementation. The Navy has begun testing and deployment of the Sea Hunter class of anti-submarine autonomous unmanned surface vessels, but at the same time has reduced the mission goals of the X-47B autonomous aircraft from surveillance and strike to a refueling role, even though it has proved its capabilities. The Air Force does not appear to have much interest in autonomous aircraft programs, instead investing heavily in the manned F-35 and trying to resurrect the F-22, while DARPA is playing with drone swarms. There is some concern that the best and brightest AI experts and roboticists are working for Google, Amazon, Facebook and Apple. Commercial players may make the application breakthroughs before, or for, the military.
 
  • #52
gleem said:
While we sit here debating the dangers of LAWS, the US military, at least, may be divided on near-term implementation. The Navy has begun testing and deployment of the Sea Hunter class of anti-submarine autonomous unmanned surface vessels, but at the same time has reduced the mission goals of the X-47B autonomous aircraft from surveillance and strike to a refueling role, even though it has proved its capabilities. The Air Force does not appear to have much interest in autonomous aircraft programs, instead investing heavily in the manned F-35 and trying to resurrect the F-22, while DARPA is playing with drone swarms. There is some concern that the best and brightest AI experts and roboticists are working for Google, Amazon, Facebook and Apple. Commercial players may make the application breakthroughs before, or for, the military.
Unfortunately, military brass are still human and subject to normal human failings; namely, putting themselves above the mission. A 24-year-old quote from "[Lockheed] Skunk Works" about the Sea Shadow stealth ship:
"Our ship had a 4-man crew...By contrast, a frigate doing a similar job had more than 300 crewmen...A future commander resented having only a 4-man crew to boss around...in terms of an officer's future status and promotion prospects, it was about as glamorous as a tugboat."

...which isn't to say that all of these decisions are made by the military: some are made by Congress for reasons also having little to do with warfighting capability.
 
  • #53
Grands said:
I would say that most of the technology was invented for military use and then transformed in civil use.
This is simply not true.
 
  • #54
russ_watters said:
Technology in warfare over the past 50 years has, primarily, brought us one thing: war is safer.

Really? I thought technology has made wars between nation states almost impossible because, you know, all the nations involved risk ending up as piles of radioactive rubble.

russ_watters said:
Fewer people are dying, on both sides of conflicts.
That's because direct conflicts between large, roughly equally matched nation states have become very uncommon.

russ_watters said:
Heck, I can foresee a future where we send robots to fight other robots and humans aren't even put at risk. Isn't that a good thing, not a bad thing?
Well, as long as nuclear weapons are here, you can't attack a country with an army of robots even if that country's robots are less advanced and can easily be beaten by yours, because the attacked country can resort to the use of nuclear weapons in desperation.

If a war breaks out between your country and a country without nuclear weapons, then that country will definitely put humans on the front lines after exhausting all its robot soldiers, so you end up killing people anyway.

Lastly, if a war breaks out between countries that are equally matched in military robot technology and nuclear weapons, a robot-vs-robot war will be good entertainment for people to watch. Unfortunately, it will also be a huge waste of resources; the respective countries could avoid this by building the best chess engines they possibly can and competing against each other, which would also be good entertainment for chess players and other people in general.

russ_watters said:
Admittedly I only read part of the article since it is long, but I'm not seeing the logic here. How is LAWS logically different from, and worse than, a land mine, such that it represents either a revolution or a threat worthy of being banned?

Weapons have been banned in the past largely due to excessive cruelty (chemical weapons, phosphorus), not for being good at their jobs. Or, more commonly, their use has been restricted in obvious ways: you can't use them where they put civilians at risk.

Simply: why is this bad and can anyone explain why it should be banned?

Landmines are mostly good at stopping an invading force, i.e., they are defensive in nature. LAWS can be used offensively to capture territory.

russ_watters said:
Note: this question applies to war only, not terrorism.
In the future, the chances of a direct war between nation states are only going to decrease, so whether or not you like it, conflicts between nations and terrorist groups are the only conflicts likely to continue. If the technology proliferates, then terrorists will find it easier to attack nation states by replacing suicide bombers and shooters with agile, fast-moving robots.
 
  • #55
russ_watters said:
Perhaps I should have omitted the first few words of the question: I'm aware that LAWS is different from a land mine in that LAWS can weigh facts and make decisions whereas a land mine is stimulus-response only. My question is: How is this worse?

It may *feel* unsettling to think about a weapon that makes decisions, and I suppose people can pass laws/make rules based on whatever they want, but I would hope we would pass laws based on real risk and logic, not feelings alone. Otherwise, we may pass laws that make things worse instead of better.

If there are no humans in the warzone, the robots cannot kill any humans. We're already doing our half: our drone pilots are often halfway around the world from the battles they are fighting in. There is no possibility of them being killed in the battles they are fighting. We are not far from drone vs drone combat and the next step would be robot vs robot combat.

We do already have that in some sense, with LAWS-type weapons on ships attacking other autonomous weapons such as cruise missiles. There is no reason why humans couldn't be taken off the ships (unmanned ships are at least on the drawing board), and then you have robots attacking robots, with humans directing them at least in part from air-conditioned offices thousands of miles away.

In terms of the laws of war, indiscriminate killing of civilians is already illegal, so it seems to me that banning LAWS-type weapons is cutting off your nose to spite your face: eliminating something better because it may be misused and turned into something worse...that is already illegal.

E.g., landmines are restricted because the potential harm is an inherent feature of the technology. LAWS would be banned because of misuse or malfunction, even though the intended use of the technology is a societal benefit.

The very points you raise about the potential benefits of LAWS are addressed by Stuart Russell in his article in the World Economic Forum web article below.

https://www.weforum.org/agenda/2016/01/robots-in-war-the-next-weapons-of-mass-destruction

I will quote his responses in the above article which address your points, as he explains this far better than I can:

"We might think of war as a complete breakdown of the rule of law, but it does in fact have its own legally recognized codes of conduct. Many experts in this field, including Human Rights Watch, the International Committee of the Red Cross and the UN Special Rapporteur (http://wilpf.org/killer-robots-and-the-human-rights-council/), have questioned whether autonomous weapons could comply with these laws. Compliance requires subjective and situational judgements that are considerably more difficult than the relatively simple tasks of locating and killing – and which, with the current state of artificial intelligence, would probably be beyond a robot.

Even those countries developing fully autonomous weapons recognize these limitations. In 2012, for example, the US Department of Defense issued a directive stating that such weapons must be designed in a way that allows operators to “exercise appropriate levels of human judgement over the use of force”. The directive specifically prohibits the autonomous selection of human targets, even in defensive settings."

And for your specific claim that armies of robots will keep humans out of the loop entirely in warfare:

"But some robotics experts, such as Ron Arkin (http://www.unog.ch/80256EDD006B8954/(httpAssets)/54B1B7A616EA1D10C1257CCC00478A59/%24file/Article_Arkin_LAWS.pdf), think that lethal autonomous weapons systems could actually reduce the number of civilian wartime casualties. The argument is based on an implicit ceteris paribus assumption that, after the advent of autonomous weapons, the specific killing opportunities – numbers, times, locations, places, circumstances, victims – will be exactly the same as would have occurred with human soldiers.

This is rather like assuming cruise missiles will only be used in exactly those settings where spears would have been used in the past. Obviously, autonomous weapons are completely different from human soldiers and would be used in completely different ways – for example, as weapons of mass destruction.

Moreover, it seems unlikely that military robots will always have their “humanitarian setting” at 100%. One cannot consistently claim that the well-trained soldiers of civilized nations are so bad at following the rules of war that robots can do better, while at the same time claiming that rogue nations, dictators and terrorist groups are so good at following the rules of war that they will never use robots in ways that violate these rules."

The article also goes to some length to describe how the superiority of autonomous weapons relates to their scalability, with destructive properties similar to biological weapons, tipping the balance away from legitimate states and toward terrorists, criminals and other non-state actors. There is also the issue of strategic instability (basically, LAWS must be highly adaptable to be useful, but this very adaptability makes them highly unpredictable and difficult to control).
 
  • #56
Monsterboy said:
Really ? I thought technology has made wars between nation states almost impossible, because you know, all the nations involved risk ending up as piles of radioactive rubble.
Like Statguy, you're responding to point 2, and I was making point 1.
That's because direct conflicts between large, roughly equally matched nation states have become very uncommon.
Point #1 is true even for wars between a superpower and a non-superpower, which is what most wars are. This point is closer but still more focused on #2 than really addressing #1 directly.

The way we fought in the Gulf and Afghanistan was different from the way we fought in Vietnam -- and the transition began in Vietnam, roughly in the middle of it. This is not controversial:
Although the Norden bombsight in World War II increased the accuracy of dropped munitions, bombing had more-or-less remained the same art for the previous 50 years: a “shotgun” approach to accuracy that relied on mass and coverage. The measure of an accuracy of a bomb, known as its CEP or “circular area of probability”, went from hundreds of feet to tens of feet or less. According to the Department of Defense, the Air Force’s use of laser-guided systems would explode, because of the benefits of “fewer bombs, fewer airplanes, lower collateral damage, and increased pinpoint destruction accuracy.”
[emphasis added]
https://understandingempire.wordpress.com/2014/01/15/the-laser-age-in-the-vietnam-war/

You can count civilian deaths or military deaths vs size of forces employed and durations. It is safer to participate in a war on either side where smart weapons are used than it was 50 years ago to participate in a war on either side where smart weapons were not used.
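The CEP figures in the quote can be made concrete with a back-of-the-envelope sketch (my own illustration, not from the linked article; the model and the numbers below are assumptions): if the aiming error is circular-Gaussian, the CEP (median miss radius) fixes the single-bomb hit probability, and from that you can estimate how many independent drops a likely hit requires.

```python
import math

def hit_probability(r, cep):
    # P(miss distance <= r) for a circular-Gaussian aiming error,
    # parameterized by CEP, the median miss radius (so r == cep gives 0.5).
    return 1 - 2 ** (-(r / cep) ** 2)

def bombs_needed(target_radius, cep, confidence=0.9):
    # Independent drops required so that at least one lands
    # within target_radius with the given confidence.
    p = hit_probability(target_radius, cep)
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

# Illustrative (made-up) numbers: a 30 ft target radius with a
# WWII-style CEP of 300 ft vs. a precision-weapon CEP of 10 ft.
print(bombs_needed(30, 300))  # hundreds of bombs
print(bombs_needed(30, 10))   # a single bomb
```

With rough numbers like these, shrinking CEP from hundreds of feet to tens of feet turns a saturation raid into a single sortie, which is the mechanism behind "fewer bombs, fewer airplanes, lower collateral damage".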
Landmines are mostly good at stopping an invading force i.e they are defensive in nature. LAWS can be used offensively to capture territory.
Agreed. So what is your point? Are you saying this makes LAWS worse somehow? Why? Do you understand why landmines are considered problematic?
In the future, chances of a direct war between nation states is only going to decrease, so whether or not you like it, the conflict between nations and terrorist groups are the only conflicts that are likely to continue. If the technology proliferates then terrorists will find it easier to attack nation states by replacing suicide bombers and shooters with agile and fast moving robots.
I agree with the first part and disagree with the second, at least in the context of the OP. Terrestrial robots are complicated, technologically advanced and expensive, all things problematic for terrorists. The most successful terrorist attack in history used box cutters. Most terrorist attacks use crude improvised explosives. If sophisticated conventional weapons are outside their capabilities, then advanced weapons are just further outside their capabilities.

The Las Vegas hotel/concert shooting was one guy with a cache of small arms. It would have been much more effective if he had used a small, mounted machine gun. Much more effective still if he had used an automated LAWS system. So why didn't he?

People fear what "might" happen, and I'm asking you guys to put some real, logical thought into why, if it is so easy, it hasn't happened yet. The answer really is simple: imagining something to fear is easy. Actually doing it isn't necessarily easy.
 
  • #57
StatGuy2000 said:
The very points you raise about the potential benefits of LAWS are addressed by Stuart Russell in his article in the World Economic Forum web article below.
From the article.
Imagine turning on the TV and seeing footage from a distant urban battlefield, where robots are hunting down and killing unarmed children. It might sound like science fiction – a scene from Terminator, perhaps. But unless we take action soon, lethal autonomous weapons systems – robotic weapons that can locate, select and attack human targets without human intervention – could be a reality. [emphasis added]
[sigh] I can imagine lots of things. I can imagine bubblegum trees and lollipop fairies. But just because you can imagine something doesn't make it reasonable. Movies are becoming so visually realistic that people are losing track of the line between imagination and reality.

Nowhere in the article does he discuss how robots/computers/ai are actually used today in war and how - realistically - they might be intended to be used in the future. Basing policy on fantasy instead of reality is a sure recipe for bad policy.

I can think of two relatively recent examples where LAWS could have or should have been employed to improve the outcome of an engagement:

The USS Stark was heavily damaged by an Iraqi Exocet missile. The ship was equipped with a LAWS-type system, the Phalanx CIWS, which was not enabled during the attack due to human error (it is generally off and only turned on when needed).

The USS Vincennes shot down an Iranian airliner because of a series of "tunnel vision" type errors that led the crew to believe it was a fighter flying an attack profile. I believe our AEGIS warships are capable of autonomous warfighting operation and a computer certainly would not have made that series of mistakes.

Computers can be programmed to do evil, but they are largely immune to poor decision-making. I think in general the expanded use of computers in decision-making would improve decision making further, not reduce it.

Again: people fear ceding control to computers in all sorts of situations, but these fears are not rational and not borne out by case studies. Problems due to poor computer decision-making are relatively rare and the closest thing is usually poor human-computer interface causing the humans to make the bad decisions (several recent plane crashes were caused in part by this).

Wait, my imagination is tingling: maybe we should fear airplane autopilots going rogue and purposely crashing airplanes into buildings too?

The article gets somewhat better as it goes along, but not enough (while it is common for clickbait, it is not good journalistic practice to start an article with something ridiculous and then improve: you set your first impression and it is hard to break):
article said:
But some robotics experts, such as Ron Arkin, think that lethal autonomous weapons systems could actually reduce the number of civilian wartime casualties. The argument is based on an implicit ceteris paribus assumption that, after the advent of autonomous weapons, the specific killing opportunities – numbers, times, locations, places, circumstances, victims – will be exactly the same as would have occurred with human soldiers.

This is rather like assuming cruise missiles will only be used in exactly those settings where spears would have been used in the past.
It is indeed the logic, and it is imperfect but real logic with real historical precedent, as I pointed out above: smart weapons such as cruise missiles and laser/gps guided bombs replaced saturation bombing and reduced military and civilian casualties in those situations as a result. That's a fact. We don't have to imagine it. It actually happened.
article said:
Moreover, it seems unlikely that military robots will always have their “humanitarian setting” at 100%. One cannot consistently claim that the well-trained soldiers of civilized nations are so bad at following the rules of war that robots can do better...
This is more of his bias that causes him to miss the point. He sees war as just evil and as a result is incapable of getting inside the minds of actual soldiers. Good nations doing evil isn't being claimed by proponents of LAWS. What is claimed - what actually happens - is that robots (computers) make better and more accurate decisions than humans and as a result, the good guys will fight cleaner/safer wars, as is already happening.
...while at the same time claiming that rogue nations, dictators and terrorist groups are so good at following the rules of war that they will never use robots in ways that violate these rules.
And nobody claims that either. But this is just the "all weapons are bad so we should ban all weapons" logic that doesn't work anywhere. It doesn't work any better for nukes than it works for knives. You cannot stop technology. You cannot stop weapons from being developed and used except by making better weapons to win the wars against the bad guys in order to convince them they shouldn't use their weapons; as @Monsterboy correctly pointed out, the technological/capabilities disparity is likely part of the reason the number and scope of wars is decreasing. In that sense, robot warfare would simply increase that disparity, further reducing the number and severity of wars.

Major world powers stopped using chemical weapons because we recognize they are bad. Syria used them recently because they correctly predicted they wouldn't be punished for using them. And chemical weapons are low-tech and relatively cheap. Worrying about rogue nations using advanced killer robots when they are still using 1910s technology makes little sense.

So again: unlike chemical weapons or land mines, which are incapable of identifying and sparing civilians, LAWS-type systems have no inherent feature that makes them dangerous to civilians. Their danger is the same as any weapon's: that a rogue will use it to harm civilians. Thus they are already covered adequately under existing law and need no special laws to cover them.
 
  • #58
Just a note in case it isn't obvious:
We are in large part talking past each other. My position is that this technology will continue to enable *us* to fight wars better. Without responding to my position, the responses I'm getting are "this will enable our adversaries to fight wars worse".

Even if the second position is true, it does not address, much less negate, the first position.
 
  • #59
Sorry for the multiple posts, but I feel like I need to point out that this argument from the article is logically void (empty):
But some robotics experts, such as Ron Arkin, think that lethal autonomous weapons systems could actually reduce the number of civilian wartime casualties. The argument is based on an implicit ceteris paribus assumption that, after the advent of autonomous weapons, the specific killing opportunities – numbers, times, locations, places, circumstances, victims – will be exactly the same as would have occurred with human soldiers.

This is rather like assuming cruise missiles will only be used in exactly those settings where spears would have been used in the past. Obviously, autonomous weapons are completely different from human soldiers and would be used in completely different ways – for example, as weapons of mass destruction.
The conclusion is just made-up and has no logical relationship with the premise. You can see it easily enough if you flip it over:

As stated (paraphrase):
1. Precision weapons could decrease collateral damage but will increase collateral damage by being used more or worse.

Flipping it:
2a. Indiscriminate weapons could increase collateral damage but will decrease collateral damage by being used less and better.

I wouldn't argue #2 and I doubt the author would either, even though it is the exact inverse of the argument he's making. Instead, I suspect he'd just argue:
2b. Indiscriminate weapons would increase collateral damage.

So in this "logic", all paths lead to more deaths. Thus the "logic" doesn't actually follow, and the purpose of the argument isn't to find a logical conclusion, but to stalemate/handcuff the argument. This is what you get when you have an "all weapons are bad" worldview: you can't look at any other possibility than that any change will make things worse. You can't have a rational discussion with someone who thinks all paths - even those in exact opposite directions - lead to the same place.

And this is not a trivial problem with his argument because you can apply the argument retrospectively, to history and where we are today: Does he really want to go back to the age of carpet-bombing? Does he really believe people will be safer? Does he really not see that wars are safer today?

You guys may think that the issue of whether the enemy gets our technology is all that matters, but the issue of how technology is *actually* used by *our* soldiers/sailors in war needs to be addressed too. I was in the US Navy, stationed on a ship that was equipped with a LAWS system. What is being suggested here would mean removing those systems from ships and increasing the risk of sailors such as myself dying from attacks that could have been defended.
 
  • #60
russ_watters said:
Like Statguy, you're responding to point 2, and I was making point 1.

OK, let's talk about point 1: safer war.

You are saying that robots attack robots and all the humans will be completely out of the battleground, right?

Well, this will only happen when all the countries possess this technology to roughly the same degree. The reality is that different countries will be at different levels when it comes to developing and using LAWS, so in order to use robots in war (and not kill people) you will have to wait until all the other countries catch up with you! Or else you will have a lot of blood on your hands, or rather on your robots' hands.

I think I covered this topic here.
Monsterboy said:
If a war breaks out between your country and a (less advanced) country without nuclear weapons, then that country will definitely put humans on the front lines after exhausting all its robot soldiers, so you end up killing people anyway.

If we get to the point where the countries involved in a war are roughly equal in terms of developing LAWS, then wars are going to be much more prolonged and much more damaging to the economy and the environment, as there are no human lives involved and countries can endlessly manufacture these machines. But in a war consisting of human soldiers, there will be huge pressure not to start a war in the first place and to stop the war as soon as possible, because human lives will be at stake along with the damage to the economy and the environment.

russ_watters said:
Agreed. So what is your point? Are you saying this makes LAWs worse somehow? Why? Do you understand why landmines are considered problematic?
Well, a world full of countries with offensive capabilities is less stable than a world where most countries possess mostly defensive capabilities.

I think I understand why landmines are problematic.
1. Once they are deployed, they are difficult to get rid of, both for the host nation and the invading nation.
2. You don't usually get to decide whether they explode or not after they are deployed.

But given the above points, landmines do not give any country the power to exert force on other countries.

russ_watters said:
Terrestrial robots are complicated, technologically advanced and expensive, all things problematic for terrorists. The most successful terrorist attack in history used box cutters. Most terrorist attacks use crude improvised explosives. If sophisticated conventional weapons are outside their capabilities, then advanced weapons are just futher outside their capabilities.

The Las Vegas hotel/concert shooting was one guy with a cache of small arms. It would have beem much more effective if he used a small, mounted machine gun. Much more effective still if he had used an automated LAWS system. So why didn't he?

People fear what "might" happen, and I'm asking you guys to put some real, logical thought into why, if it is so easy, it hasn't happened yet. The answer really is simple: imagining something to fear is easy. Actually doing it isn't necessarily easy.

The threat of terrorists using such robots is not an immediate one, nor is it in the near future, I agree. Right now even major powers have yet to use fully autonomous robots. My point is that only after autonomous military robots become a regular thing will terrorists be able to get their hands on them. If you think about it, building such a robot will be easier than building a nuclear weapon, because it won't need any of the "hard to get" materials that are required to build nukes.
 
  • #61
If I were a terrorist, I wouldn't be stupid enough to carry out terror attacks on civilians using smuggled/stolen autonomous military robots. For one, they would be easy to counter by law enforcement and counter-terrorist military units, because they basically have the same, or even better. And two, it is not worth the effort and money to smuggle such a thing, only for it to be destroyed for sure by counter-terrorists.

Sure, you can probably kill more with autonomous military robots than by spraying your assault rifle with your friends, but bomb attacks are more efficient, simpler, easier to hide, and cheaper (and can be self-made).
 
  • #62
HAYAO said:
I wouldn't be stupid enough to carry out terror attacks on civilians using smuggled/stolen autonomous military robots. For one, they would be easy to counter by law enforcement and counter-terrorist military units, because they basically have the same, or even better.
Then you are not an ambitious terrorist. Look at the Taliban: the US and Afghan militaries are better equipped in every way, but it has still proven quite difficult to eliminate the Taliban. Why is that? Because such terrorists are not just thinking of blowing up buildings or cars; that is just their way of "sending a message" or "spreading fear". Their real goal is to occupy territory and enforce their rules over other people. Combat robots can be a good tool for a guerrilla army: they can attack where they are least expected, cause a lot of damage, and run away, i.e., hit-and-run tactics, before being taken down by the "superior" counter-terrorist forces. Even if they are "beaten" by counter-terrorist forces, it wouldn't matter to them. Why would it? Robots can be sent on suicidal missions just like humans.
 
  • #63
Monsterboy said:
Then you are not an ambitious terrorist. Look at the Taliban: the US and Afghan militaries are better equipped in every way, but it has still proven quite difficult to eliminate the Taliban. Why is that? Because such terrorists are not just thinking of blowing up buildings or cars; that is just their way of "sending a message" or "spreading fear". Their real goal is to occupy territory and enforce their rules over other people. Combat robots can be a good tool for a guerrilla army: they can attack where they are least expected, cause a lot of damage, and run away, i.e., hit-and-run tactics, before being taken down by the "superior" counter-terrorist forces. Even if they are "beaten" by counter-terrorist forces, it wouldn't matter to them. Why would it? Robots can be sent on suicidal missions just like humans.
This misses my point. I am talking about terrorist attacks on civilians. Insurgent attacks are different. But if you want to argue this, then fine.

I am by no means military personnel, but it's not that hard to imagine the difficulties in fighting the Taliban and insurgents from the military perspective, unless you are ready for genocide (as in some countries).

The reason why it is difficult to wipe out the Taliban is the rules of engagement. One cannot be certain whether someone is an insurgent unless they either engage you first or you have good proof and intelligence that they are. This is especially true when firearms can be hidden rather easily in real combat, and people can camouflage themselves as normal civilians. But military robots? No, once you see military robots in a place they aren't supposed to be, it's safe to judge that they are enemies.
 
  • #64
In the near term, say the next 10 years, it is very unlikely that we will have autonomous anthropomorphic robots engaging humans or one another. AI is, and will continue to be, applied to weapon systems such as ships, drones, and vehicles, or to perimeter defense systems for surveillance and reconnaissance, probably operating autonomously until it engages the enemy, after which humans will probably pull the trigger. I can see in the near term the possibility of using truly autonomous weapons where, because of the situation, such as a forward attacking force in which only enemy forces should be present, there is a reduced need to verify the identity of a target: sending a drone into a cave containing possible combatants, or enforcing curfews with "non-lethal" force. In fact, a Texas company called Chaotic Moon sells an autonomous surveillance drone (CUPID) that can taser a trespasser who refuses to leave. The company Taser is looking into developing similar devices for law enforcement.
 
  • #65
HAYAO said:
This misses my point. I am talking about terrorist attacks on civilians. Insurgents attacks are different. But if you want to argue this, then fine.

I was talking about attacks on civilians as well.
HAYAO said:
I am by no means a military personnel, but it's not that hard to imagine the difficulties in fighting Talibans and insurgents from the military perspective, unless you are ready for genocide (like in some countries).
That's exactly why Terrorists + autonomous robots = even more trouble.

HAYAO said:
The reason why it is difficult to wipe out Taliban is because of the rule of engagement. One cannot be certain whether one is an insurgent or not unless they either engage you first or you have good proof and intelligence that they are. This is especially true when firearms can be hidden rather easily in real combat, and people can camouflage like a normal civilian. But military robots? No, once you see a military robots in a place they aren't supposed to, then it's safe to judge they are enemies.
What makes you think camouflaging military robots is going to be so difficult ?

https://www.theaustralian.com.au/news/world/scientists-build-robot-that-can-crawl-camouflage-itself-and-hide/news-story/4987d2f99fa111e5745f6aaba12513d9?sv=b493f1dd644fef0aec46f0527704f228
 
  • #66
Monsterboy said:
I was talking about attacks on civilians as well.

That's exactly why Terrorists + autonomous robots = even more trouble.What makes you think camouflaging military robots is going to be so difficult ?

https://www.theaustralian.com.au/news/world/scientists-build-robot-that-can-crawl-camouflage-itself-and-hide/news-story/4987d2f99fa111e5745f6aaba12513d9?sv=b493f1dd644fef0aec46f0527704f228
First, I think you are watching way too many movies and playing too many games. Second, you underestimate a great many things.

These types of weapons are extremely pricey, especially if you are going to use camouflaging technologies. We are talking about millions of dollars per unit. Realistically speaking, how are they going to smuggle such a weapon? How many times have terrorists smuggled fighter/attack jets and effectively used them? How many times have terrorists smuggled Humvees and effectively used them? Tanks? Can they maintain them? Can they utilize their full potential?

Smuggling weapons is only feasible for things that are small enough, easily and widely produced, and cheap enough for third-world nations to manufacture. Rifles, rocket-propelled grenade launchers, and light and heavy machine guns are basically the limit in size and price for smuggling in reasonable numbers. If you think one single good weapon can change the entire tide of a battle, you are terribly mistaken. You need tens and hundreds, sometimes even thousands, to make it work.

You said they can be used for suicide missions. Do you really think terrorists are going to send on a suicide mission a machine they had such a hard time smuggling even one unit of? That would be a very unwise decision and certainly not worth the cost.

Meanwhile you have bombs that can be hand-made. You have rifles that are available in much of the world. You have people you can use for suicide missions. None of these cost much, and they are practically much more useful than smuggled mil-spec weapons.
AIs are much better decision makers than humans. As russ_watters said in his posts, a lot of military errors during combat come from poor decision-making on the human side. As such, AIs are better combatants in terms of identifying the enemy and deciding whether or not to engage. Guerrilla warfare in a world where autonomous robots have become widely available is going to be much less effective than it is without them. Guerrilla warfare works because there is a huge gap in preparation between the side using guerrilla tactics and the side being engaged. It exploits the downsides of the human combatant: the inability to detect enemy presence in complex terrain and environments, poor decision-making, and the lack of parallel thinking and analysis, especially under stress. Autonomous robots/AIs can significantly close this gap, because the defender has more ways to detect and analyze an enemy before and after being engaged.
 
  • #67
HAYAO said:
First, I think you are watching way too much movies and games. Second, you underestimate so many things.
I wish I had enough time to do that. If you read my previous response to russ and contemplate it, you will find that I underestimate nothing. In fact, you and others underestimate and misunderstand what terrorists are, what they aim for, and their relationship with nation states.

HAYAO said:
These type of weapons are extremely pricey, especially if you are going to use camouglaging technologies. We are talking about millions of dollars each unit. Realistically speaking, how are they going to smuggle such weapon? How many times have terrorists smuggled fighter/attacker jets and effectively used it? How many times have terrorists smuggled humvees and effectively used it? Tanks? Can they maintain it? Can they utilize their full potential?

I repeat again: terrorists are not just about blowing up cars and buildings. There is a complicated relationship between nation states and terrorist groups; some nations are known to share technology and money with terrorists. For example, the rebels in Ukraine are financed and armed by Russia. That is how they managed to bring down a passenger plane, remember that?

Pakistan is believed to finance and arm terrorists who attack targets in Afghanistan (while being an ally of the US) and in the disputed, Indian-controlled region called Kashmir. If you go to the Middle East, the situation is even worse: several countries and many terrorist groups are involved in the violence, and many countries in the region are known to support and arm certain terrorist groups in order to cause trouble in other countries.

HAYAO said:
Smuggling weapons is only feasible for things that are small enough, easily and widely produced, and cheap enough for third-world nations to manufacture. Rifles, rocket-propelled grenade launchers, and light and heavy machine guns are basically the limit in size and price for smuggling in reasonable numbers. If you think one single good weapon can change the entire tide of a battle, you are terribly mistaken. You need tens and hundreds, sometimes even thousands, to make it work.

You said they can be used for suicide missions. Do you really think terrorists are going to send on a suicide mission a machine they had such a hard time smuggling even one unit of? That would be a very unwise decision and certainly not worth the cost.

Meanwhile you have bombs that can be hand-made. You have rifles that are available in much of the world. You have people you can use for suicide missions. None of these cost much, and they are practically much more useful than smuggled mil-spec weapons.

I am afraid all these points are already covered in my previous response to russ here.
Monsterboy said:
The threat of terrorists using such robots is not an immediate one, nor is it in the near future, I agree. Right now even major powers are yet to use fully autonomous robots. My point is that it is only after autonomous military robots become a regular thing that terrorists can get their hands on them. If you think about it, building a nuclear weapon is harder than building a robot, because robots won't require any of the "hard to get" materials needed for nukes.

HAYAO said:
AIs are much better decision makers than human. As russ_waters said in his posts, a lot of military errors during combat occur from poor-decision making on the human part. As such, AIs are better combatant in terms of identifying enemy and deciding whether or not to engage them.
What makes you think these robots will be concerned about their own survival? They can choose to attack even if they won't make it out alive and in one piece. Robots can be made suicidal if cornered or damaged beyond recovery. Why is this hard to understand?

Did the gunmen who attacked Paris and Mumbai plan to get out alive? No, they wanted to cause as much damage as possible. Robots are capable of doing the same.

HAYAO said:
Guerrilla warfare works because there is huge gap in preparation between one utilizing guerrilla warfare and one being engaged. They utilize the downside of human combatant:
Yes, I know what guerrilla warfare means.
That's why they are good weapons for attacking civilian targets and relatively weak military ones.
 
  • #68
Monsterboy said:
My point is that only after autonomous military robots become a regular thing that terrorists can get their hands on it.
I think you have some misconceptions regarding the realistic issues.

The price tag on the supposed 'military grade autonomous robots' will be quite high, since such stuff is expected to meet the standards of the military, at least at the level of a common soldier: all terrain, all weather, durable, low maintenance, long operation time, and, what is most important these days, able to satisfy all the armchair generals, journalists and human rights campaigners of the world at the same time.

But practically none of that matters for terrorists and/or for the neglected category of 'idiots'. So, the debate here actually should have two different questions.
  • What's it with autonomous weapons which can be accepted into the military?
  • What's it with cheap, crude, homemade replicas/experiments/attempts at half- or fully autonomized weapons and weapon platforms?
 
  • #69
Rive said:
The price tag on the supposed 'military grade autonomous robots' will be quite high, since such stuff is expected to meet the standards of the military, at least at the level of a common soldier: all terrain, all weather, durable, low maintenance, long operation time, and, what is most important these days, able to satisfy all the armchair generals, journalists and human rights campaigners of the world at the same time.

Yes, the idea of the people in this thread who support LAWS is that we can completely remove humans from the battleground and also avoid civilian casualties, i.e. it will be a safe war. This means mass manufacturing of these robots; even if a single robot can replace 2 or 3 soldiers, we are going to have a lot of robots. If the price tag doesn't allow this to happen, then the whole point of a safe war is eliminated and we are going to have humans on the battleground.

I assumed that a "safe war" is only going to happen when or if the world's military powers find it affordable to replace all their human soldiers with autonomous robots. Am I wrong? That means the cost of maintaining an army, navy and air force of autonomous robots has dropped to or below the cost of maintaining a human military force. When this happens, why would it be impossible for terrorists, who don't even maintain a regular army, to acquire these weapon systems, even if they are not the best available, with or without state support?

You yourself say that
Rive said:
...practically none of that matters for terrorists and/or for the neglected category of 'idiots'.

This means that at some point in the future, when a "safe war" can happen, terrorists, who don't have to satisfy all the requirements you stated, are going to spend much less money on their robots than major military powers spend on theirs.

Rive said:
So, the debate here actually should have two different questions.
  • What's it with autonomous weapons which can be accepted into the military?
  • What's it with cheap, crude, homemade replicas/experiments/attempts at half- or fully autonomized weapons and weapon platforms?
At first they might appear to be two different questions, but they are not entirely different, because of the points I have mentioned above. Yes, terrorists are mostly going to use weapon systems that are primitive compared to the ones used by governments (just like now); that will be enough for them to attack civilian targets and weakly defended military ones, just like they are doing now.
 
  • #70
Monsterboy said:
That means the cost of maintaining an army, navy and air force of autonomous robots has dropped to or below the cost of maintaining a human military force
No. The motive behind the migration toward robots is not about unit price, but about the acceptability of unit loss.
It's a shrug to accept the loss of a robot: however, it takes plenty of tears, loss of political support and many inconvenient questions to lose a soldier.

And there is nothing in progress that would suggest the required endless drop in unit manufacturing costs of advanced military hardware just due to mass production.

Monsterboy said:
At first they might appear to be two different questions
Till somebody manages to smuggle some armed T72s (or anything comparable) from Iraq to the USA (or, in general, to the West), they ARE two different questions.
 
  • #71
Rive said:
No. The motive behind the migration toward robots is not about unit price, but about the acceptability of unit loss.
It's a shrug to accept the loss of a robot: however, it takes plenty of tears, loss of political support and many inconvenient questions to lose a soldier.

And there is nothing in progress that would suggest the required endless drop in unit manufacturing costs of advanced military hardware just due to mass production.

If robots can do soldiers' work equally well or better, then they will be preferred over humans for the reason you stated. However, if the robots are too expensive to develop, mass produce and maintain, and/or are not good enough to replace humans on the ground, most countries will prefer to have human soldiers (maybe along with some robots), and the concept of a safe war will remain fiction.

The consequence of the above is that only the most advanced and richest countries will possess a fully autonomous military force, while other countries will continue to have a very significant human presence on the battlefield. This will lead to a huge imbalance of power, where the advanced countries don't mind replacing their robots in case of loss and other countries are totally at their mercy. This would justify banning autonomous military robots.

Rive said:
Till somebody manages to smuggle some armed T72's (or anything comparable) from Iraq to USA (or, in general: to the West) they ARE two different questions.

Terrorists don't need tanks, irrespective of whether they can smuggle them, because they rely on guerrilla tactics; they don't maintain a regular army. There is no reason to think that autonomous military robots are going to be so big and bulky.
 
  • #72
Terrorists/robbers/criminals have few resources; frequently they have no cash to sustain themselves and their goals. So they are going to use the least expensive tools to get what they want. They are not going to use anything as sophisticated as these robots (unless they are being funded by a government, but that's another story). If this weren't true, we would already have current terrorist groups setting off nuclear bombs.
 
  • #73
kent davidge said:
Terrorists/robbers/criminals have few resources; frequently they have no cash to sustain themselves and their goals.
The concern is not about local thugs and pickpockets. The concern is about organised crime units and terrorist networks that spread across the globe, get a lot of money from people in many countries, and are sometimes state-funded. This concern, as mentioned several times earlier in this thread, is not an immediate one.
 
  • #74
Sorry for the late reply and I only have a bit of time, but wanted to respond to one thing:
Monsterboy said:
Ok, let's talk about point 1. Safer war

You are saying that robots attack robots and all the humans will be completely out of the battleground, right?

Well, this will only happen when all the countries possess this technology to roughly the same degree...
This simply isn't true and isn't at all close to what I said. Having only robot-on-robot combat is a utopian ultimate endpoint, but every step along the way, including steps already taken, reduces the deadliness of wars.
 
  • #75
russ_watters said:
This simply isn't true and isn't at all close to what I said. Having only robot-on-robot combat is a utopian ultimate endpoint...

Well, I have quoted a few of your posts that made me think that robot-on-robot combat is the aim behind your support for LAWS, even if it is not possible in the immediate future.

russ_watters said:
Heck, I can foresee a future where we send robots to fight other robots and humans aren't even put at risk.

russ_watters said:
If there are no humans in the warzone, the robots cannot kill any humans. We're already doing our half: our drone pilots are often halfway around the world from the battles they are fighting in. There is no possibility of them being killed in the battles they are fighting. We are not far from drone vs drone combat and the next step would be robot vs robot combat.

But, maybe you feel I should have paid more attention to this...
russ_watters said:
You can count civilian deaths or military deaths vs size of forces employed and durations. It is safer to participate in a war on either side where smart weapons are used than it was 50 years ago to participate in a war on either side where smart weapons were not used.

russ_watters said:
but every step along the way, including steps already taken, reduces the deadliness of wars.
Smart weapons need not be fully autonomous to get the advantages that you have mentioned.

There is another thing I missed out, perhaps more important than other concerns.

The problem with fully autonomous weapons is that it is difficult to know who is really in control of them. Can you have one head of the air force and a handful of subordinates control an entire air force consisting of perhaps hundreds or thousands of drones? Similarly in the army and the navy? Can you trust a few individuals with that kind of power? Individual human soldiers get to decide where their loyalty lies. LAWS simply do what they are told to do.

For example in Turkey
https://en.wikipedia.org/wiki/2016_Turkish_coup_d'état_attempt
During the coup, over 300 people were killed[38] and more than 2,100 were injured. Many government buildings, including the Turkish Parliament and the Presidential Palace, were bombed from the air.[59] Mass arrests followed, with at least 40,000 detained,[38][60] including at least 10,000 soldiers

Let's fast forward to a future where the human element in the military is reduced to a number around the ones mentioned above, or even less. How do you think a coup attempt would end?

I think this is addressed quite well in the quote below.
StoneTemplePython said:
Another issue: suppose you have a wobbly dictatorship and civil strife. Some dictators clamp down on protesters by ordering the army to shoot them down. Sometimes this works, sometimes it doesn't -- a lot of soldiers have qualms about slaughtering their own people. Machines (Terminators?) have no such qualms and simply execute orders. There are similar issues during coups -- failed and successful ones.

As usual, machines allow you to scale things in a way that humans don't. This should be very spooky stuff
 
  • #76
Monsterboy said:
The problem with fully autonomous weapons is that it is difficult to know who is really in control of them. Can you have one head of the air force and a handful of subordinates control an entire air force consisting of perhaps hundreds or thousands of drones? Similarly in the army and the navy? Can you trust a few individuals with that kind of power? Individual human soldiers get to decide where their loyalty lies. LAWS simply do what they are told to do.

One could envision the use of autonomous weapons accelerating the tempo of a war to the extent that only a military AI system could orchestrate the strategies and tactics. I am sure, as I write this, that such systems are under development in anticipation of this.
 
  • #77
Monsterboy, first, there is a bigger issue here. I want to make an official complaint that you are misreading and misinterpreting people's words to your advantage. You keep accusing others of things they have not said, and then respond based on those false accusations. You also tend to argue past them without addressing the actual point the other side is making. I feel that this is not a very fair thing to do.

For example,
Monsterboy said:
Terrorists don't need tanks irrespective of whether they can smuggle them, because they are into guerrilla tactics, they don't maintain a regular army. There is no reason to think that autonomous military robots are going to be so big and bulky.
That is not what Rive is saying. In fact, you just proved him right. He is asking about the realistic aspect of the issue. The T72 is a great war machine, but like you said, terrorists don't need it. It's not practical for the specific roles that terrorists use weapons for. Similarly, autonomous military robots built for a military role will be of no use to terrorists, because they will likely not be built for those roles. Even if terrorists really did need one, they wouldn't be able to build even the most primitive version, because autonomous military robots are certainly going to be technologically demanding both in principle of operation and in production precision. Even the US military can't yet make practical infantry-level autonomous robots. That is the kind of technology we are talking about. So their only option is to smuggle one or hope for some party to provide one, which is also very impractical. This was also my point in my first post in this thread.

We are not in some science fiction war movie right now. Please talk about the realistic issues.
As a side note, the downing of a civilian aircraft by the Ukrainian rebels cannot easily be classified as terrorism, because they most likely had no intention of shooting down a civilian aircraft. They likely mistook the plane for a Ukrainian military aircraft (they had shot down Ukrainian military aircraft several times). And of course the Ukrainian government considers the rebels terrorists (and is the only government to officially designate them as such), but it is not easy to designate these rebels (or militia or whatever you want to call them) as terrorists.
 
  • #78
HAYAO said:
Monsterboy, first there is a bigger issue here. I want to make an official complaint that you are misreading and misinterpreting people's words to your advantage.

Lol, go ahead with the complaint, let's see how that works out. I am not afraid of false accusations.

HAYAO said:
That is not what Rive is saying. In fact, you just proved him right. He is asking about the realistic aspect of the issue. The T72 is a great war machine, but like you said, terrorists don't need it. It's not practical for the specific roles that terrorists use weapons for.

I just pointed out that Rive gave me a wrong example. Let's discuss...

The realistic issues with smuggling a T72 are:

1. A T72 doesn't even fit into a truck, let alone a car or a suitcase. It is very big and bulky, hence very difficult or impossible to move under the nose of the authorities.

2. Even assuming the above issue is solved, a T72 cannot easily be hidden inside an urban area; it takes too much space, and a small mistake will make it detectable from the air. It cannot be easily hidden in the areas where guerrilla fighters lurk.

Read this answer about why tanks aren't a weapon of choice for terrorists.
https://www.quora.com/Do-guerrilla-fighters-fear-tanks-AFVs

My question to Rive was: what makes him think that in a future in which LAWS have become a common thing, they are going to have the same disadvantages as today's tanks?

HAYAO said:
As a side note, the downing of a civilian aircraft by Ukraine rebels cannot easily be classified as terrorism because they most likely had no intention of shooting down a civilian aircraft. They likely mistook the plane for Ukrainian military aircraft (they have shot down Ukrainian military aircraft several times).
My point here was not about whether they wanted to do it; it's about them getting their hands on weapon systems they could not have developed themselves. In a future where LAWS become a common thing, rogue states can share their equipment with "non-state actors" just like this.

HAYAO said:
And of course the Ukrainian government is considering the rebels as terrorists (and the only government to officially designate them as so), but it is not very easy to designate these rebels (or militia or whatever you want to call it) as terrorists.

Whether a group is considered "terrorists" or "rebels" or "freedom fighters" etc. depends a lot on which country you are in.

For example: Israel may consider Hamas "terrorists", while Palestinians might consider them "freedom fighters". The Soviets may have considered the Mujahideen "terrorists", but to ordinary Afghans and Americans they were "freedom fighters". In India, the government considers the people who cause violence in Kashmir "terrorists", but some of the locals and Pakistanis call them "rebels" or "freedom fighters". So it all depends on who you ask.

What they are called is not at all the issue here.
 
  • #79
Monsterboy said:
Lol, go ahead with the complaint, let's see how that works out. I am not afraid of false accusations.
That is a very bad attitude and an arrogant stance to have, especially when you are quick to respond in a way I accused you of, right after I accused you.
I just pointed out that Rive gave me a wrong example. Let's discuss...

The realistic issues with smuggling a T72 are:

1. A T72 doesn't even fit into a truck, let alone a car or a suitcase. It is very big and bulky, hence very difficult or impossible to move under the nose of the authorities.

2. Even assuming the above issue is solved, a T72 cannot easily be hidden inside an urban area; it takes too much space, and a small mistake will make it detectable from the air. It cannot be easily hidden in the areas where guerrilla fighters lurk.

Read this answer about why tanks aren't a weapon of choice for terrorists.
https://www.quora.com/Do-guerrilla-fighters-fear-tanks-AFVs

My question to Rive was: what makes him think that in a future in which LAWS have become a common thing, they are going to have the same disadvantages as today's tanks? My point here was not about whether they wanted to do it; it's about them getting their hands on weapon systems they could not have developed themselves. In a future where LAWS become a common thing, rogue states can share their equipment with "non-state actors" just like this.
Please read and comprehend correctly what other people write. No one here is writing in poetry, so it shouldn't be so hard to comprehend what people are saying.

Here is what Rive is saying in his post in simple English:
1) What kind of autonomous military robots are going to be used in a military?
2) What kind of autonomous military robots can terrorists improvise?

These are rhetorical questions, if you haven't figured it out yet. He was being sarcastic when he mentioned T72s. You are welcome.
Whether a group is considered "terrorists" or "rebels" or "freedom fighters" etc. depends a lot on which country you are in.

For example: Israel may consider Hamas "terrorists", while Palestinians might consider them "freedom fighters". The Soviets may have considered the Mujahideen "terrorists", but to ordinary Afghans and Americans they were "freedom fighters". In India, the government considers the people who cause violence in Kashmir "terrorists", but some of the locals and Pakistanis call them "rebels" or "freedom fighters". So it all depends on who you ask.

What they are called is not at all the issue here.
Well, you are the one who mentioned that Russia was funding rebel militias in Ukraine, so it is not hard to tell that you consider them a terrorist group. Hence I had to provide the insight, because you seem to be making some fatal errors in your arguments.
 
  • #80
In the interest of this thread, which addresses an important topic, I refuse to participate in a muck fest by responding to certain ill-conceived posts.

I am waiting for @russ_watters to respond to post #75. I am done with other people here.
 
  • #81
HAYAO said:
He was being sarcastic when he mentioned T72s
Just partially. The T72, as no-longer-top 'hot military hardware', is a great example: it is exactly the kind of stuff we are supposed (??!) to find in the hands of terrorists (here).

There are other possible examples, especially since plenty of military hardware has ended up in the hands of groups associated with terrorism in the Middle East. But practically none of it made it back to the 'West'.
 
  • #82
Sorry, couldn't resist.

Rive said:
T72, as a no-longer-top 'hot military hardware' is a great example
It may not be top-notch for the US or NATO, but a lot of countries still use them, and even older versions like the T-55 are still in use in some countries. But their very design is not suited to non-state actors.

Rive said:
- it is exactly the kind of stuff we are supposed (??!) to find in the hands of terrorists (here).
Like I said, they are not a weapon of choice for terrorists, and like you said, they're not something that can be smuggled.

You will find anti-tank weapons in the hands of terrorists (today).
https://medium.com/war-is-boring/te...i-tank-guided-missiles-than-ever-4c8be96ea105
Terrorist Groups Now Have More Anti-Tank Guided Missiles Than Ever
Tank killers spread throughout the Middle East

The Kornet system uses a laser beam to direct a missile toward its target at listed ranges of up to 5.5 kilometers.

http://www.independent.co.uk/news/u...apon-of-choice-arm-it-fire-it-run-698901.html
the weapon weighs 7.5kg and can fold in half to easily fit under a coat or into the boot of a car. It is now mainly used in developing countries and would be targeted against lightly armoured vehicles.
Post #42 gives an idea of the low-cost, low-weight autonomous vehicles that can be weaponized.
Rive said:
But practically none of them made it back to 'West'.
This is not just about the West.
 
  • #83
Rive said:
Just partially. The T72, as no-longer-top 'hot military hardware', is a great example: it is exactly the kind of stuff we are supposed (??!) to find in the hands of terrorists (here).

There are other possible examples, especially since plenty of military hardware has ended up in the hands of groups associated with terrorism in the Middle East. But practically none of it made it back to the 'West'.
Yes, I know what you meant, because of your choice of the T72, a second-generation main battle tank. And we are somehow supposed to see them in the hands of terrorists, used effectively, in this thread...:rolleyes:
 
  • #84
Some members on this thread are finding it extremely difficult to understand why terrorists don't want tanks. I would like to help out here.
https://www.quora.com/Do-guerrilla-...r/Roland-Bartetzko?share=7069f96f%6&srid=E7fv

Guerrillas usually operate in very difficult terrain like mountains, swamps or dense forests. Tanks can’t operate there.

Further, tanks are designed to fight other tanks and not infantry. You can hear them coming from far away which means that you won’t be surprised by them and you have plenty of time to disappear, in case you have no anti tank weapons or you simply don’t want to engage in combat right now.

I hope this clears up the issue of why we don't find terrorists who want tanks.
 
  • Like
Likes berkeman
  • #85
I don't know what you find hard to understand, but no one here is arguing that terrorists really want tanks. We are talking about why terrorists don't want tanks, and how that applies to the main part of this discussion.
 
  • #86
Actually, terrorists will use whatever they can get their hands on; e.g., about six Abrams M1 tanks ended up in ISIS hands when the Iraqi army abandoned them in retreat early in the ISIS assaults. I have stated earlier that autonomous drones with collision-avoidance capability are commercially available. About 80% of casualties in the Iraq war were due to IEDs built by insurgents, who surprised us by employing everything from cell phones to garage door openers. An Israeli autonomous radar-seeking missile system, HAROP, was sold to Azerbaijan. A missile believed to be a HAROP destroyed a busload of Armenians in a disputed territory between Armenia and Azerbaijan. Did the bus have radar? Was it re-engineered for a new purpose? How easy is it to re-engineer AI systems? Maybe not too hard. Anyway, never assume your enemy is incapable of something.

Today, weapons systems have developed to the extent that if you can see a target, or know where it is, you can probably hit it. We even have smart bullets, and shooting at something kilometers away is like shooting from 100 ft. To survive, autonomous AI systems (as targets) must be hardened, look small, and be agile, or be employed in large enough numbers to overwhelm defenses, which probably means small, cheap drones. Soon everybody will be able to get some form of AI weapon, probably sooner than we think.
 
  • Like
Likes Monsterboy
  • #87
gleem said:
Actually, terrorists will use whatever they can get their hands on; e.g., about six Abrams M1 tanks ended up in ISIS hands when the Iraqi army abandoned them in retreat early in the ISIS assaults. I have stated earlier that autonomous drones with collision-avoidance capability are commercially available. About 80% of casualties in the Iraq war were due to IEDs built by insurgents, who surprised us by employing everything from cell phones to garage door openers. An Israeli autonomous radar-seeking missile system, HAROP, was sold to Azerbaijan. A missile believed to be a HAROP destroyed a busload of Armenians in a disputed territory between Armenia and Azerbaijan. Did the bus have radar? Was it re-engineered for a new purpose? How easy is it to re-engineer AI systems? Maybe not too hard. Anyway, never assume your enemy is incapable of something.

Today, weapons systems have developed to the extent that if you can see a target, or know where it is, you can probably hit it. We even have smart bullets, and shooting at something kilometers away is like shooting from 100 ft. To survive, autonomous AI systems (as targets) must be hardened, look small, and be agile, or be employed in large enough numbers to overwhelm defenses, which probably means small, cheap drones. Soon everybody will be able to get some form of AI weapon, probably sooner than we think.
Okay, but this seems to disprove your point. 80% of casualties in Iraq were from improvised explosive devices. Believe it or not, you don't need to be a brainiac to make IEDs. Even I could make them, except I won't, because it's illegal in Japan and I don't need them.

For a built-from-scratch weapon to be worth using, the terrorists need it to be 1) easy to manufacture, 2) made with easily accessible parts, 3) cheap, and 4) practical for the purpose. Any weapon that does not meet these criteria will not be a priority for terrorists. In this sense, Monsterboy is right that terrorists will likely not be able to improvise autonomous military robots; instead, one should fear those robots being obtained from a military or from secondary parties.

As a side note, you should not confuse a country with a terrorist group, as they have significantly different levels of financial, economic, and military power, and of manpower. Azerbaijan is a country whose army was trained to use HAROP.

Also, you should not be confused about the "purpose" of attacking the Iraqi Army. You seem quick to interpret the attack as ISIS trying to obtain the M1 tanks. That is poor speculation. The much better explanation is that they simply wanted to destroy or harass the military, and as a result the tanks were left abandoned for them to get their hands on. Can they use the M1 Abrams? Maybe. But can they use them effectively? Very likely not. Heck, even the trained Iraqi army couldn't properly handle the tanks. Why do you expect that untrained terrorists can? The M1 Abrams requires a tremendous amount of maintenance, and you need a very good idea of what you are doing in order to operate it effectively.

Generally speaking, the more sophisticated the technology becomes, the more training one needs to use it. Do you think that if the terrorists by any chance got their hands on an Aegis-equipped destroyer, they could use it against us? I know this is an extreme example, but this is the sort of scale in complexity we are talking about, and thus the discussion needs to stay focused on realistic scenarios.
 
  • #88
HAYAO said:
Okay, but this seems to disprove your point. 80% of casualties in Iraq were from improvised explosive devices. Believe it or not, you don't need to be a brainiac to make IEDs. Even I could make them, except I won't, because it's illegal in Japan and I don't need them.

True, but the point is that they modified existing technology to serve a different purpose, and that did require some significant technical knowledge. Do not underestimate your enemy.

HAYAO said:
For a built-from-scratch weapon to be worth using, the terrorists need it to be 1) easy to manufacture, 2) made with easily accessible parts, 3) cheap, and 4) practical for the purpose. Any weapon that does not meet these criteria will not be a priority for terrorists. In this sense, Monsterboy is right that terrorists will likely not be able to improvise autonomous military robots; instead, one should fear those robots being obtained from a military or from secondary parties.

No, not built from scratch but modified to suit their purpose. Commercial drones are cheap enough and have sufficient capability to be of use.
HAYAO said:
As a side note, you should not confuse a country with a terrorist group, as they have significantly different levels of financial, economic, and military power, and of manpower. Azerbaijan is a country whose army was trained to use HAROP.

Trained to use HAROP, but not to modify its capabilities.

HAYAO said:
As a side note, you should not confuse a country with a terrorist group, as they have significantly different levels of financial, economic, and military power, and of manpower. Azerbaijan is a country whose army was trained to use HAROP.

ISIS had (and has) the financial and military resources and the manpower. If they need quick cash, they can just kidnap a holder of significant bitcoins until he transfers them to the terrorists' wallet, and bingo. It has already occurred.

HAYAO said:
Also, you should not be confused about the "purpose" of attacking the Iraqi Army. You seem quick to interpret the attack as ISIS trying to obtain the M1 tanks. That is poor speculation. The much better explanation is that they simply wanted to destroy or harass the military, and as a result the tanks were left abandoned for them to get their hands on. Can they use the M1 Abrams? Maybe. But can they use them effectively? Very likely not. Heck, even the trained Iraqi army couldn't properly handle the tanks. Why do you expect that untrained terrorists can? The M1 Abrams requires a tremendous amount of maintenance, and you need a very good idea of what you are doing in order to operate it effectively.

The tanks were a gift to ISIS from a retreating army. They are useful until they are not. So what? Run them until they don't.

HAYAO said:
Generally speaking, the more sophisticated the technology becomes, the more training one needs to use it. Do you think that if the terrorists by any chance got their hands on an Aegis-equipped destroyer, they could use it against us? I know this is an extreme example, but this is the sort of scale in complexity we are talking about, and thus the discussion needs to stay focused on realistic scenarios.

No. They are not looking for those types of systems. They are looking for advanced IEDs, whatever they can cobble together and effectively deploy. With an autonomous car bomb, for example, you don't need a martyr anymore, which opens the door to less dedicated terrorists. Maybe not tomorrow, but soon autonomous vehicles and aircraft will be available. When you are expecting your pizza delivery one night, will it be a pizza?

The expertise is becoming sufficiently ubiquitous that relatively complex systems can be duplicated or created by ordinary citizens (engineers or makers) of the right political persuasion. Look at the proficiency of hackers. Look at the criminals who produce all manner of electronic devices to defraud people. Look up single-board computers like the Arduino or Raspberry Pi, which are capable of AI functionality. The development is done commercially; just tweak it for your purpose.
 
  • Like
Likes Monsterboy
  • #89
Very well said, @gleem -- Thread closed for Moderation. Hopefully we can figure out a way to keep the best parts of this discussion going.
 
  • Like
Likes Monsterboy and Stavros Kiri
  • #90
Okay, let's try re-opening this useful discussion, with a few ground rule reminders please:
  • Let's be careful not to introduce politics into the discussion, since that is something we have said that we cannot Moderate at the PF
  • Please keep in mind that we do not want to give any bad folks ideas that they have not thought of yet. We are a very strong intellectual bunch of folks, and some of the things that could get posted in this thread may be new ideas of how to use technology in terrorist activities (not with the intention of helping terrorists, but unintentionally posting something new).
  • And as always, please keep the discussion civil and positive.
Thanks! Thread re-opened.
 
  • Like
Likes HAYAO and Stavros Kiri
  • #91
Thank you, berkeman, and I apologize for talking about politics and sometimes going a bit overboard. Once again, I strongly stress that we should be talking about the realistic issues.

gleem said:
True, but the point is that they modified existing technology to serve a different purpose and it did require some significant technical knowledge. Do not underestimate your enemy.

gleem, first I have to apologize that I am somewhat restricted by the rules in what I can say, so I cannot respond to every point you made here. I'll try my best, though. Thus I am going to have to skip this point, but I will say that you are underestimating how easy it is to improvise existing technology into IEDs. That is why IEDs are one of the main tactics used by terrorists and insurgents.
No, not built from scratch but modified to suit their purpose. Commercial drones are cheap enough and have sufficient capability to be of use.
ISIS had (and has) the financial and military resources and the manpower. If they need quick cash, they can just kidnap a holder of significant bitcoins until he transfers them to the terrorists' wallet, and bingo. It has already occurred.
Then I would assume that drones are a main tactic terrorists are using right now. No one has to die. No one needs technical knowledge. And since they should be very, very well funded, they would have no problem buying tons of them. How come they aren't a mainstream tactic?

Trained to use HAROP, but not to modify its capabilities.
The HAROP is also designed to be remotely controlled. Why would the bus need to emit radio or radar waves?

I also still don't understand why you are talking about the military of a country when we were supposed to be talking about terrorists. Perhaps I am misunderstanding you, but were we talking about something else?
The tanks were a gift to ISIS from a retreating army. They are useful until they are not. So what? Run them until they don't.
You are underestimating the technical difficulty of running and using a tank. I am saying they can't run it effectively; it is not useful to them at all.
No. They are not looking for those types of systems. They are looking for advanced IEDs, whatever they can cobble together and effectively deploy. With an autonomous car bomb, for example, you don't need a martyr anymore, which opens the door to less dedicated terrorists. Maybe not tomorrow, but soon autonomous vehicles and aircraft will be available. When you are expecting your pizza delivery one night, will it be a pizza?
Making an improvised bomb is somewhat easy. I would only need myself to make one and deliver it to some destination. Or better yet, I could make a lot of IEDs and have my followers (I don't have any) deliver them somewhere, which is one of the main tactics used by terrorists today. However, as soon as the technical demands increase, everything suddenly becomes more impractical. I would need a self-driving car that I would have to steal or buy, which is certainly going to be much more expensive than a conventional vehicle. Most suicide car bombings are done with really old and cheap cars that can easily be bought, rented, or stolen. No one uses a Porsche for a suicide bombing, even though it might be good for running away from the police if discovered. Sure it can be done, and maybe we'll see it happen once or twice, but it is very unwise, especially from the perspective of an actual large-scale terrorist organization. So why not just stick to a simpler but effective solution?

If you are talking about possibilities, then I will say that nothing is impossible. But whether that is relevant to how one actually turns it into an effective, common, large-scale tactic is a completely different discussion. Personally, I wouldn't be concerned about autonomous robotics any more than about cellphones.
 
Last edited:
  • #92
gleem said:
...six Abrams M1 tanks ended up in ISIS hands ... An Israeli autonomous radar-seeking missile system, HAROP, was sold to Azerbaijan. A missile believed to be a HAROP destroyed a busload of Armenians in a disputed territory between Armenia and Azerbaijan.
I tend to regard both cases as warfare, not terrorism. I do know that ISIS has the well-deserved title of terrorist organization, but when some neighboring countries have similar standards, I think it's better to focus on the actual events involved.

Similar events are pretty common in that area. However, since military hardware is hard to get, it is typically used against other military hardware and targets. The closest case to the discussed 'military hardware against civilians' scenario might be some of the bombings of hospitals in Syria, but that is quite hard to categorize neatly.

gleem said:
They are not looking for those types of systems. They are looking for advanced IEDs, whatever they can cobble together and effectively deploy. With an autonomous car bomb, for example, you don't need a martyr anymore, which opens the door to less dedicated terrorists. Maybe not tomorrow, but soon autonomous vehicles and aircraft will be available. When you are expecting your pizza delivery one night, will it be a pizza?

The expertise is becoming sufficiently ubiquitous that relatively complex systems can be duplicated or created by ordinary citizens (engineers or makers) of the right political persuasion. Look at the proficiency of hackers. Look at the criminals who produce all manner of electronic devices to defraud people. Look up single-board computers like the Arduino or Raspberry Pi, which are capable of AI functionality. The development is done commercially; just tweak it for your purpose.

Yes. And I find it quite disturbing that we were expected to discuss the terrorist use of second-grade, nonexistent military hardware in a manner where a Bruce Willis might pop up at any time, when we might be just a few years away from accidents (or worse) caused by misused drones and the like.
 
Last edited:
  • #93
HAYAO said:
Personally, I wouldn't be concerned about autonomous robotics no more than cellphones.
Cellphones can potentially be monitored by the carrier, etc. ...
 
  • #94
Stavros Kiri said:
Cellphones can potentially be monitored by the carrier etc. ...
I know, that's the point.
 
  • #95
HAYAO said:
gleem, first I have to apologize that I am somewhat restricted by the rules in what I can say, so I cannot respond to every point you made here. I'll try my best, though. Thus I am going to have to skip this point, but I will say that you are underestimating how easy it is to improvise existing technology into IEDs. That is why IEDs are one of the main tactics used by terrorists and insurgents.

Can you tell me exactly what rule(s) you are going to break by responding to this particular point @gleem made?
 
  • #96
Monsterboy said:
Can you tell me exactly what rule(s) you are going to break by responding to this particular point @gleem made?
I was about to talk about why and how "special" and "technical" knowledge is not required to make IEDs, which may well go against the request from berkeman. But the point was important because it gives you a good idea of the difference in sophistication, technicality, and cost between the high-tech IEDs gleem was concerned about and what is actually being used.
berkeman said:
Please keep in mind that we do not want to give any bad folks ideas that they have not thought of yet. We are a very strong intellectual bunch of folks, and some of the things that could get posted in this thread may be new ideas of how to use technology in terrorist activities (not with the intention of helping terrorists, but unintentionally posting something new).
 
  • #97
HAYAO said:
I was about to talk about why and how "special" and "technical" knowledge is not required to make IEDs, which may well go against the request from berkeman. But the point was important because it gives you a good idea of the difference in sophistication, technicality, and cost between the high-tech IEDs gleem was concerned about and what is actually being used.

But... but... you said
HAYAO said:
... but I will say that you are underestimating how easy it is to improvise existing technology into IEDs.
 
  • #98
Monsterboy said:
But...but ...you said
I said the same thing. Please read carefully.
 
  • #99
HAYAO said:
I said the same thing. Please read carefully.

Hmm... don't you think the level of sophistication depends on necessity? Terrorists don't need IEDs as advanced as government ones (though they wouldn't mind getting them for free), right? Because they serve a different purpose.

https://worldview.stratfor.com/article/role-improvised-explosive-devices-terrorism
Not only can improvised explosive devices come in a number of different shapes, they can also be designed to serve different functions.

When operating against robust security and in a hostile environment, innovation and imagination become critical traits for a bombmaker to be successful. Since the beginning of terrorism, there has been a constant arms race between terrorist planners and security forces. Every time security is changed to adapt to a particular threat, the terrorist planner must come up with a new attack plan (often involving a new type of improvised explosive device) to defeat the enhanced security measures.
 
  • #100
Monsterboy said:
Hmm... don't you think the level of sophistication depends on necessity? Terrorists don't need IEDs as advanced as government ones (though they wouldn't mind getting them for free), right? Because they serve a different purpose.

Yes, I agree.
 
