Fear? The Spanish-born philosopher Seneca[1] formulated it more than 2,000 years ago: the very fear of war is worse than war itself. Banks, de Nevers, and Wallerstein[2] put it somewhat more cautiously:
It is not known to what extent the policy choices in combating terrorism are driven by the fear of a terrorist threat rather than by terrorism itself.
Perhaps the greatest of all motivations are love and fear. I wrote about love in a previous blog post. As far as cyberthreats are concerned, it is not only the threat itself but also the complexity of the question that is terrifying. Mandel describes in detail seven attributes each of cyber attackers and cyber targets. One might think that just as much caution is required in traditional warfare. However, this conflict takes place in and around virtual space, so motives and consequences can be assessed with conventional methods only to a limited extent. And if you don’t know where the danger comes from and what it is aimed at, you should heed the advice from Mandel’s book[3]:
As a result, public and private cybersecurity decision-makers must strive to resist the temptation of giving in to calls for immediate drastic retaliation in response to any cyber intrusion and instead prudently and dispassionately assess what should be done to promote global restraint.
Is cyberterrorism perhaps less bloody or costly than the RAF or ETA, which are at least well known to Europeans? Hope is deceptive. According to details released by Downing Street on Wednesday[4], the defense budget will receive £16.5 billion in additional funding over the next four years, on top of the plans in last year’s ruling Conservative Party election program. The British Ministry of Defence currently has an annual budget of £40 billion, so defense spending will increase by around 10 percent a year over the four-year program. A substantial part of this goes into the fight against cyberterrorism. Some European states have also pushed through massive increases in their defense budgets, not entirely independently of the perceived American threat of troop withdrawal.
Fear… The proliferation of artificial intelligence in the military (and other armed services) is less about competition between the great powers than about a lucrative global project for corporate and government elites to maintain control over restless populations at home and abroad. And there he comes again: Big Brother. On the other hand, cyber and robotic systems can also have the advantage of taking human cruelty out of the calculation: removing people from targeting decisions, for example, or programming ethical constraints into robots to prevent unnecessary attacks on hospitals and schools. Unfortunately, there have been plenty of cold-blooded massacres throughout history, and there can be no doubt that there will be people willing to override the robots. P. W. Singer[5], a political scientist at the New America Foundation who specializes in 21st-century warfare, believes that only machines operating with non-lethal weapons should be automated.
Looking out a bit at world politics, it is also interesting to ask what happens if Mr. Biden insists on a policy of “democracy export” in the post-Soviet region: the confrontation between Washington and Moscow could deepen, or Mr. Putin could commit himself to desperate steps before the new president is inaugurated. And we have already seen what Russian trolls can do.
Fear! A horrifying example is the 2017 short film Slaughterbots, made by civilians worried about the future of human life: in it, terrorists massacre a school with small smart drones controlled remotely from a van parked nearby.
[1] “Peior est bello timor ipse belli” (“Worse than war is the very fear of war itself”)
[2] William C. Banks, Renée de Nevers, and Mitchel B. Wallerstein, Combating Terrorism: Strategies and Approaches, Washington, DC: CQ Press, 2008, chapters 1–2 (pp. 1–62).
[3] Robert Mandel, Optimizing Cyberdeterrence: A Comprehensive Strategy for Preventing Foreign Cyberattacks, Washington, DC: Georgetown University Press, 2017, chapter 8.
[4] Jonathan Beale, “Defence Funding Boost ‘Extends British Influence’, Says PM,” BBC News, 2020, https://www.bbc.com/news/uk-54988870.
[5] Peter Warren Singer, Wired for War, New York: Penguin Books, 2010.
Gabor,
There is a long history of people “fearing” the possibilities of future warfare. And as technology continues to evolve, there are more and more things to fear! As we discussed early in the semester (and as you allude to in your blog post), the idea of autonomous AI-controlled weapons with no human in the loop is potentially frightening and dystopian. Yet AI-controlled systems don’t need sleep, they don’t get distracted, and they can react far more quickly than human-controlled systems can. So we clearly MUST find a compromise somewhere in between. Of course, we don’t know how our potential adversaries, or terrorist groups, will choose to use technology, which is why we need to be prepared to respond to any and all threats.
–Professor Wallerstein