Ramesh Jaura, a journalist specializing in globalization, nuclear disarmament and culture of peace, conducted an interview with SGI President Ikeda on the topics of nuclear abolition and lethal autonomous weapons systems. The following is an excerpt.
Ramesh Jaura: While nuclear weapons are being modernized, lethal autonomous weapons systems (LAWS) are beginning to pose a grave threat to international peace and security. What do you think can be done?
SGI President Ikeda: LAWS, also called Artificial Intelligence (AI) weapons or robot weapons, are under development in several countries but have not yet been deployed.
An international framework must be created to ban their development or deployment before any atrocity takes place. I have been warning of the threat they present from a humanitarian and ethical perspective because these weapons, when given a command to attack, automatically go on killing with no hesitation or pangs of conscience.
The Campaign to Stop Killer Robots, a civil society coalition of which the SGI became a member in 2018, is working to ban the development and use of LAWS. Concern over the security and militaristic consequences of these weapons is growing in the international community. If any country were to deploy them for military use, the impact would be equivalent to that of the advent of nuclear weapons and radically transform the global security environment …
Most states seemed to agree on the crucial importance of “ensuring appropriate levels of human judgment in decisions to use force” despite their differing views on prohibition. It should also be noted that Japan, which has repeatedly stated it has no plan to develop LAWS, highlighted the concerns of civil society regarding these weapons.
On the other hand, states reluctant to prohibit LAWS argued that technological advances in precision targeting would reduce civilian casualties in the event such weapons were used. I cannot help but perceive in their argument the same kind of mentality as that which seeks to develop “clean” and “smart” nuclear weapons. The fundamental premise must be that drawing a distinction between “good” LAWS and “bad” LAWS would have serious consequences in light of the spirit of International Humanitarian Law.
Jaura: Specifically, what do you see as the most dangerous aspect of LAWS?
President Ikeda: As I stated in my peace proposal, LAWS would create not only a physical disconnect—the situation in which those who direct attacks and those who are targeted are not in the same place, as already seen in the case of drone strikes—but also an ethical disconnect, completely isolating the initiator of the attack from the actual combat operation. This blatantly goes against human dignity and the right to life, principles established in the international community that are rooted in the lessons of two world wars and numerous tragedies of the last century. I cannot emphasize enough that we must not overlook the ethical disconnect inherent in LAWS.
If LAWS were to be used in actual combat, would there be any room for the deep remorse over one’s actions that must be felt by many of those who have engaged in combat, for a poignant sense of powerlessness in the face of war, or for a personal resolution to dedicate oneself to peace for the sake of future generations?
In a world of AI-controlled weapons systems, there would be no chance of the complicated feelings that cross the lines of friend and foe arising, nor the weight of humanity bearing down … Would it then be possible to hold off, even for a moment, the decision to attack?
Fully autonomous robotic weapons would lower the threshold for military action. This could not only inflict catastrophic damage but also drastically limit possibilities for post-conflict reconciliation between former enemies. While they would be different in nature from nuclear weapons, any use of fully autonomous weapons would have irreversible consequences for both the country using them and the country they are used against.
Therefore, I strongly urge all parties to come together to work for the early adoption of a legally binding instrument comprehensively prohibiting the development and use of LAWS. Some argue that it is not easy to create a framework to ban weapons that are still in the development stage and yet to be deployed. But there is a precedent—blinding laser weapons were prohibited by a CCW protocol prior to deployment.
With keen awareness of the true nature of fully autonomous weapons, the SGI would like to continue working tenaciously to build international opinion calling for the prohibition of the development and use of LAWS.
To read the full interview, visit www.indepthnews.net, and select “opinion”.