The 21st century has witnessed a paradigm shift in technology across almost every industry. In warfare, technology is changing not just how we engage our enemies but also the fronts on which we will engage them. It may even help bring about peace.

Unmanned Combat Aerial Vehicles

Drones and AI are a lethal combination in future warfare. Autonomous stealth drones such as the Taranis, produced by BAE Systems, are changing the way traditional wars are fought. The Taranis can fly to a preselected area on a programmed flight path, identify a threat, target that threat, and alert a human operator that it has identified a target. The human operator reviews the information and approves the attack; the drone then fires a missile, destroys the target, and flies itself back home.

As we have all witnessed over the last two decades, drones are already in wide military use, conducting surveillance and even attacking hostile targets. Drones can spot submarines and mines, and even deliver humanitarian aid to places where aid convoys cannot go. They are smaller, stealthier, and safer for the troops employing them. The chance that a drone will be spotted from the ground is slim, and even when it is, the pilot is safely back at base, operating the vehicle from a computer station.

Of course, not everyone is thrilled with the idea of autonomous machines capable of killing and destroying. More than 17,000 researchers and many global thought leaders, including Tesla and SpaceX CEO Elon Musk, physicist Stephen Hawking, and Google's director of research Peter Norvig, have signed an open letter calling on the United Nations to ban the creation of autonomous and semi-autonomous weapons.

The main problem these scientists see with autonomous weapons is that it is often unclear where and how human oversight will be included in the process, and what rules will govern the weapon's programming. A human operator might consider the proximity of a school or hospital before targeting, or take the Geneva Conventions into account before approving a strike. A computer might not.

The NextGen Technology

Apart from the ethical challenges surrounding AI-equipped weapons, big data and technology will challenge what we perceive as "war." Future wars may not be fought on physical battlefields, but digital ones. One of governments' greatest fears is the scenario in which a regime, a terrorist group, or indeed anyone targets the networks themselves. The more systems and infrastructure we build to be "smart," with Internet of Things connectivity, the more we put those systems and infrastructure at risk.

Imagine the chaos that would be caused by disrupting or disabling wireless communications or internet connectivity. Now imagine that the attack also disrupts your electricity, your water, the traffic lights, and the emergency services, because all of those systems rely on data and networks. The scenario quickly becomes a nightmare.

We have already seen examples of governments hacking other governments, such as the allegations that a foreign government was behind a massive data security breach of the Democratic Party earlier this year. Security analysts say that our governments are woefully ill-equipped to meet sophisticated hackers on this battlefield. It is not hard to imagine that wars in the future could be won by hackers, and that today's soldiers and fighter pilots will be tomorrow's data scientists and autonomous machine learning algorithms.

Big Data Management

Let's look at the positive side. While all of this sounds vaguely apocalyptic and terrifying, there is also a bright side to this data and technology. Organizations including the U.S. Defense Department, the United Nations, Russia's defense ministry, Chinese intelligence agencies, and defense bodies in many other countries, including India, have launched big data initiatives in the past few years, the goals of which are to predict and anticipate "political crises, disease outbreaks, economic instability, resource shortages, and natural disasters."

These programs use vast quantities of unstructured data from media reports, blog posts, social media posts, and more to try to anticipate events, plan interventions, and assess what worked and what didn't. Using this data to prevent problems and promote peace may still be a long way off, but because these are learning algorithms, every bit of data they receive and every prediction they make brings them closer to their goals. Just as the CDC and Google hope to predict the spread of seasonal flu by analyzing unstructured social data, these agencies hope to predict, and prevent, the spread of unrest and violence. Trying to use our technologies to predict our way to peace, rather than fight ever more terrible wars, seems a noble and worthy goal.
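To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of pipeline such initiatives rely on: score unstructured text reports against a small keyword list, count the flagged reports per day, and raise an alert when a day's count spikes well above the historical average. The keyword list, functions, and threshold are all hypothetical simplifications; real systems use trained classifiers and far more sophisticated anomaly detection.

```python
from statistics import mean, stdev

# Hypothetical keyword list; a real system would use a trained text classifier.
UNREST_KEYWORDS = {"protest", "riot", "clash", "strike", "unrest"}

def is_unrest_related(report: str) -> bool:
    """Flag a report if it mentions any unrest-related keyword."""
    words = {w.strip(".,!?").lower() for w in report.split()}
    return bool(words & UNREST_KEYWORDS)

def daily_counts(reports_by_day):
    """Count unrest-related reports for each day's batch of reports."""
    return [sum(is_unrest_related(r) for r in day) for day in reports_by_day]

def flag_anomalies(counts, threshold=2.0):
    """Return indices of days whose count exceeds mean + threshold * stdev."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if c > mu + threshold * sigma]

# Example: nine quiet days, then a sudden spike of flagged reports on day 9.
counts = [1, 2, 1, 1, 2, 1, 1, 2, 1, 10]
print(flag_anomalies(counts))  # the spike day stands out
```

Even this toy version shows the learning-loop property described above: as more labeled days accumulate, the baseline statistics (and in a real system, the classifier) become better calibrated, so predictions improve with every new batch of data.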
