Are Artificially Intelligent Military Systems Worth the Risk?

Military systems must tackle hard questions and stay prepared, so militaries draw on every up-to-date resource available. Today their attention has turned to AI and autonomous systems, and autonomous weapons are already in use in military technology.

Benefits

Implementing artificial intelligence in weapon systems raises many concerns, both practical and ethical. Remotely operated vehicles are considered dangerous to people. But an autonomous arsenal may reduce the human workforce needed to perform tasks. These systems can do the same work for longer durations and more reliably, carrying out tasks without injury or boredom. In extreme conditions, autonomous systems can analyze the situation and respond more dispassionately.

When decisions must be made quickly on the basis of large amounts of data, AI can provide decisive assistance. Artificial intelligence can also serve the modern military through its supply systems and training simulation exercises.

The US Army plans to use autonomous systems to improve protection. More than 30 advanced militaries already use automated weapons. The US Air Force is developing autonomous systems that gather and analyze data to produce intelligence.

Some scientists say that automated machines cannot cause harm; others are convinced that the whole discussion of ethical autonomous weapons is a red herring.

Concerns

“There are sentry robots in Korea, in the Demilitarized Zone,” says Stuart Russell, the director of the Center for Intelligent Systems at the University of California. “And those sentry robots can spot and track a human being for a distance of 2 miles — and can very accurately kill that person with a high-powered rifle.”

Currently, those AI systems have two modes. The first requires a human's approval before the system kills; in automatic mode, it kills on its own. Scientists also worry that autonomous weapons will become widespread and cheap enough for many countries to deploy them en masse.

Many of these achievements can be read as concerns as well. War is very expensive in financial and human terms. When new technology reduces the human effort required, war becomes an easier option for many countries. There are also worries about military accountability: artificial intelligence sits outside the military justice system. In Iraq, for example, there was a case in which a wounded person was assessed as a threat. The criteria an AI uses to classify something as dangerous are not always clear-cut, whereas a human can assess a target with more discernment.

“Two of the major problems,” Murray Shanahan, Professor of Cognitive Robotics at Imperial College London, explained, “are endowing computers and robots with a common sense understanding of the everyday world and endowing them with creativity. By creativity I don’t mean the sort of thing we see in the Picassos or Einsteins of the world, but rather the sort of thing that every child is capable of.”

Scientists caution that robot self-awareness is still far off: our brain is such a complex mechanism that simulating human awareness would demand an enormous effort. Some compare AI weapons to nuclear weapons, suggesting that the 2010s may be to AI what the 1940s were to the atomic bomb.

To reduce unwanted outcomes, autonomous weapons could be designed to keep humans in the judgment process so they can assess the situation. These systems could also be barred from targeting people.

©Itsquiz — Be competent!