A mindless rebel
The scenario described above assumes that the AI will decide it no longer wants to follow instructions written by a human. The rebellion may look very different, however. It may turn out that an algorithm with as much awareness as the computer on which I write these words interprets its instructions too literally and becomes a threat to the outside world.
Modern AI, based on machine learning, can seek a solution to a particular problem:
A computer program can simply implement a specific problem-solving algorithm. If such an algorithm exists, it is usually the best and fastest way to reach the goal. But a program can also implement a problem-solving algorithm based on historical data. In other words, the program learns the dependencies that allowed the problem to be solved in the past, tries to generalize them, and on this basis learns to process future data accordingly. Such algorithms are called supervised machine learning.
Piotr Biczyk, QED Software
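The learning loop Biczyk describes can be sketched in a few lines of Python. This is a minimal illustration under assumed data, not any particular production system: a one-nearest-neighbour classifier that "learns" by memorizing labelled historical examples and generalizes by assigning new data the label of the most similar past case.

```python
# Minimal supervised-learning sketch: learn from labelled historical
# examples, then classify unseen data by similarity (1-nearest neighbour).

def distance(a, b):
    # Squared Euclidean distance between two feature tuples.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(examples):
    # "Training" here is simply memorizing the labelled history.
    return list(examples)

def predict(model, point):
    # Generalize: assign the label of the most similar past example.
    nearest = min(model, key=lambda ex: distance(ex[0], point))
    return nearest[1]

# Hypothetical historical data: (features, label)
history = [((0.0, 0.0), "low"), ((0.1, 0.2), "low"),
           ((5.0, 5.1), "high"), ((4.8, 5.3), "high")]

model = train(history)
print(predict(model, (0.2, 0.1)))  # -> low
print(predict(model, (5.1, 4.9)))  # -> high
```

Real systems use far more sophisticated models, but the principle is the same: the program's behaviour on future inputs is entirely shaped by the patterns present in past data.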
In theory, then, the program may conclude that the best possible way to accomplish the task entrusted to it involves harming many people. Imagine a situation straight from a science-fiction novel: we entrust a computer network with finding a solution to the problem of excessive greenhouse gas emissions. The program analyzes data from the previous several decades and finds that a significant drop in emissions was linked to the coronavirus outbreak. In theory, it could then conclude that the optimal solution is to start a pandemic.
Of course, nothing will happen if we uphold the "man in the loop" principle. In that case, the computer would only suggest creating a virus and infecting the human population with it. The operator would laugh for a while, and then modify the algorithm so that, when looking for a solution, it would also consider the number of fatalities.
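The fix the operator applies above amounts to changing the objective function. The toy sketch below, with entirely hypothetical policy names and numbers, shows how an optimizer that scores only emission cuts picks the monstrous option, while a penalty term for fatalities rules it out:

```python
# Toy illustration of objective misspecification (hypothetical data):
# candidate "emission reduction" policies scored by an optimizer.

policies = {
    "carbon tax":           {"emission_cut": 0.30, "fatalities": 0},
    "renewables buildout":  {"emission_cut": 0.40, "fatalities": 0},
    "engineered pandemic":  {"emission_cut": 0.60, "fatalities": 10_000_000},
}

def naive_score(p):
    # Objective as originally specified: maximize emission cuts only.
    return p["emission_cut"]

def corrected_score(p):
    # The operator's fix: fatalities carry an overwhelming penalty.
    return p["emission_cut"] - 1e9 * p["fatalities"]

best_naive = max(policies, key=lambda k: naive_score(policies[k]))
best_fixed = max(policies, key=lambda k: corrected_score(policies[k]))
print(best_naive)  # -> engineered pandemic
print(best_fixed)  # -> renewables buildout
```

The point is not the arithmetic but the asymmetry: the optimizer never "wanted" anything; it simply maximized exactly what it was told to, and the human had to notice what was missing from the objective.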
Meanwhile, military systems are becoming more and more complex. This can be seen, for example, in how much data is now exchanged between the various components of military weapons systems.
A single Global Hawk drone needs a bandwidth of up to 500 megabits per second – five times what the entire 500,000-strong American contingent had during the 1991 Gulf War.
Robert Czulda, PhD, Foreign Policy and Safety, University of Lodz
This is why 5G is so strategically vital: the standard allows even faster data exchange. A common system not so different from Skynet could collect all the information gathered by aircraft, combat vehicles, radio stations, ships, satellites, and sensors worn by soldiers, creating a full picture of the battlefield.
The question is, will the human mind be able to keep up with this flood of information? It seems likely that the only bottleneck of such a system would be the humans. Imagine a platoon of tanks: four machines. They exchange information among themselves, but also receive data from unmanned aerial vehicles and other vehicles on the ground.
When such a unit encounters an enemy, the company commander receives a complete set of information about the enemy's location and available combat resources. He will have to analyze all this as quickly as possible and decide how to proceed. Sounds like every field commander's dream.
However, the opponent may have identical capabilities, and their system may be constructed differently. It can delegate all decisions to the AI, making its responses orders of magnitude faster – and we know that in a tactical situation, response time is incredibly important.
We can imagine that a democratic country like the United States, where power depends on public sentiment, will stick to the "man in the loop" principle. China or Russia, though, seem much more likely to sacrifice ethics for efficiency. And sadly, it takes only one country overstepping the line to cause a chain reaction. After all, no army in the world can afford to lose that edge.
Ten or twenty years from now, there may exist a system that unifies the entire armed forces of a country. In such a situation, an error or a hacking attack may result in the unplanned use of weapons against targets that no one designated.