Countries will meet at the United Nations on Monday to revive efforts to regulate the kinds of AI-controlled autonomous weapons increasingly used in modern warfare, as experts warn that time is running out to put limits on new lethal technologies.
Autonomous weapons systems assisted by artificial intelligence already play a growing role in conflicts from Ukraine to Gaza, and rising defense spending worldwide promises to further boost AI-assisted military technology.
Progress toward global standards governing their development and use, however, has not kept pace, and binding international rules remain practically nonexistent.
Since 2014, countries party to the Convention on Certain Conventional Weapons (CCW) have met in Geneva to discuss a possible ban on fully autonomous systems that operate without meaningful human control.
UN Secretary-General Antonio Guterres has set 2026 as a deadline for states to establish clear rules on the use of AI weapons. Human rights groups, however, warn that consensus among governments is lacking.
Alexander Kmentt, head of arms control at Austria's Foreign Ministry, said this must change quickly.
“Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most prominent experts warn about do not come to pass,” he said.
Monday’s meeting of the UN General Assembly in New York will be the body’s first dedicated to autonomous weapons. Although the talks are not legally binding, diplomatic officials want them to increase pressure on military powers that resist regulation out of concern that rules could dull the technology’s battlefield advantages.
Campaign groups hope the meeting, which will also address critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors, will push states toward agreeing on a legal instrument.
They see it as a decisive test of whether countries can overcome their divisions before the next round of CCW negotiations in September.
“This issue cries out for clarification through a legally binding treaty. The technology is moving very fast,” said Patrick Wilcken, Amnesty International’s researcher on military, security and policing affairs.
“The idea that you would not want to rule out the delegation of life-or-death decisions … to a machine seems extraordinary.”
Arms race
The New York talks follow a 2023 UN General Assembly resolution, backed by 164 states, calling on the international community to urgently address the risks posed by autonomous weapons.
While many countries support a binding global framework, the United States, Russia, China and India prefer existing national guidelines or international law, according to Amnesty.
“We have not been convinced that existing law is insufficient,” a spokesperson for the U.S. Pentagon told Reuters, adding that autonomous weapons could actually pose less risk to civilians than conventional weapons.
The governments of India, Russia and China did not respond to the requests for comments.
In the absence of regulation, autonomous systems are proliferating. Weapons experts at the Future of Life Institute have tracked the deployment of roughly 200 autonomous weapons systems across Ukraine, the Middle East and Africa.
Russian forces, for example, have deployed about 3,000 Veter kamikaze drones, capable of detecting and attacking targets autonomously, according to the institute’s data. Ukraine, meanwhile, has used semi-autonomous drones in the conflict.
The Ukrainian government declined to comment.
Israel has used artificial intelligence systems to identify targets in Gaza. Its mission in Geneva said it supported multilateral talks and used data technologies in full compliance with international law.
Human Rights Watch, however, said crucial questions of accountability under international law remain unresolved, warning in a report last month that unregulated autonomous weapons pose a range of threats to human rights and could trigger an arms race if left unchecked.
Meanwhile, activists such as Laura Nolan of Stop Killer Robots worry that there are currently few safeguards ensuring that defense companies will develop AI-driven weapons responsibly.
“We do not generally trust industries to self-regulate … There is no reason why defense or technology companies should be more worthy of trust,” she said.
With information from Reuters