DISEC 2026 Update Brief: Lethal Autonomous Weapons Systems

Introduction 

Global policy is constantly evolving, making it essential for delegates to work from the most recent information. Current conflicts around the world already involve Lethal Autonomous Weapons Systems (LAWS), and understanding how their use has grown and changed will be critical for any debate. In addition, new UN initiatives to restrict LAWS have emerged, and these plans could mean meaningful limits on such systems.

 

Use of LAWS in Current and Developing Global Conflicts

As global conflicts continue to expand, so does the use of AI-enabled weapons systems and increasingly autonomous military technologies. As of 2025, AI has reportedly become integral to Israel’s military operations. In November 2025, an Israeli drone allegedly struck Khan Younis, in the southern Gaza Strip, killing two Palestinians, one of them a child; Israeli forces claimed the two posed an immediate threat. As of early January 2026, a fragile ceasefire holds in the conflict, but the presence of LAWS remains a threat. AI provides high-speed, data-driven targeting capabilities, yet it may reduce meaningful human supervision, a concept central to ongoing UN debates on lethal autonomy, and this erosion of oversight makes regulation difficult to implement. AI-driven weapons can also act faster than human decision-makers, increasing the risk of accidental escalation. Once deployed, these systems can be replicated by non-state actors, especially as AI tools become cheaper and more accessible.

 

In the Russia-Ukraine war, both sides have used AI-enabled weapons systems in their attacks. In October 2025, a captured Russian drone was reportedly capable of autonomous target selection, and it is far from the first: many drones in the conflict already use AI in one way or another. Ukraine’s deputy defense minister, Yuriy Myronenko, stated that “Our military gets more than 50,000 video streams every month, which are analyzed by artificial intelligence.” While much of this AI use supports intelligence gathering, similar technologies can be adapted for lethal decision-making. With these video streams, militaries can collect more information from behind enemy lines and analyze which tactics work best. Small drones are also difficult to shoot down, and their signals can be harder to jam or disrupt than those of remotely piloted systems.

 

Although AI has helped save lives, it has also made AI-enabled weapons systems more lethal on the battlefield. In early November 2025, Kharkiv, Ukraine, was hit by a Russian drone strike that wounded multiple civilians and destroyed homes. Legal experts have raised concerns that such strikes may violate international humanitarian law, yet the use of AI makes assigning responsibility far harder: when AI systems influence targeting decisions, apportioning legal responsibility among commanders, operators, and developers becomes increasingly complex. As drones continue to incorporate AI, they risk undermining the rule of law that DISEC’s mandate, to uphold peace and help regulate the rules of warfare to protect human life, exists to defend. And as more states deploy AI-enabled weapons, others may feel pressured to follow, raising the risk of a rapid, unregulated global arms race.

 

Recent UN Regulatory Developments and Expert Session Meetings 

Debate over LAWS reflects divergent views on whether and how these systems should be regulated. On November 6, 2025, the United Nations General Assembly passed Resolution A/C.1/80/L.41, with 156 countries in favor, five against, and eight abstaining, a margin that clearly shows most states are ready to act. The resolution addresses growing fears about the unchecked use of LAWS, highlights potential risks to International Humanitarian Law (IHL), and emphasizes the need for a multidisciplinary approach covering legal, humanitarian, technical, and security perspectives. However, it does not establish a legally binding framework or mandate formal treaty negotiations, both of which many states consider necessary for global security. As a General Assembly resolution, it cannot create binding international law; only treaties negotiated between states or Security Council decisions carry legal obligations.

 

The resolution also praises the work of the Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS), established in 2016 under the Convention on Certain Conventional Weapons (CCW) to examine possible regulatory frameworks, legal principles, and risk-mitigation measures related to LAWS. When the experts met in September 2025, Brazil delivered a joint statement on behalf of 42 states, including Austria, Germany, Sierra Leone, Mexico, Chile, and Thailand, among others.

 

The statement demonstrated that a growing coalition of states supports the development of a legal framework, which matters because it signals both intent and urgency. In 2024, the GGE LAWS produced an initial working paper on controlling LAWS, which contributes to ongoing discussions of the possible elements of a future international agreement. Much of the paper draws on current IHL and the risks LAWS pose to it; for example, it supports prohibiting or restricting systems that cannot reliably comply with IHL principles such as distinction and proportionality.

 

Attention to IHL is crucial for protecting innocent civilians. The text also calls for testing of LAWS: tests that simulate wartime conditions would help prevent targeting errors. Reducing bias in AI systems is equally important, since bias in training data could increase the risk of civilians being misidentified as combatants. This working paper helps lay the foundation for a formally proposed agreement that would bring disarmament closer. Understanding these concerns will help DISEC delegates craft balanced proposals that protect civilians while maintaining international security.

 

Conclusion 

The debate on LAWS is changing rapidly. Their use in conflict zones such as the Russia-Ukraine war and the Israeli-Palestinian conflict has demonstrated their dangers and made new rules necessary. DISEC’s goal is to foster global peace and disarmament wherever possible, and as these tools continue to expand and become more dangerous, they threaten both. A central issue in UN discussions is whether lethal decisions must always remain under meaningful human control, a standard that many states argue AI-enabled weapons increasingly undermine. With these concerns in mind, delegates will need to find common ground on controlling LAWS. Recent resolutions and the working paper give the committee a strong place to start.

 

Bibliography

  1. “156 States Support UNGA Resolution on Autonomous Weapons,” Stop Killer Robots, accessed November 15, 2025, www.stopkillerrobots.org/news/156-states-support-unga-resolution/.
  2. Abdujalil Abdurasulov, “The New AI Arms Race Changing the War in Ukraine,” BBC News, October 10, 2025, www.bbc.com/news/articles/cly7jrez2jno.
  3. “Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects,” United Nations Office for Disarmament Affairs, November 8, 2019, unoda-documents-library.s3.amazonaws.com/Convention_on_Certain_Conventional_Weapons_-_Group_of_Governmental_Experts_(2019)/1919338E.pdf.
  4. Magee Caoláne et al., “Updates: Child among Two Killed as Israeli Air Raids, Drones Target Gaza,” Al Jazeera, November 10, 2025, www.aljazeera.com/news/liveblog/2025/11/10/live-israeli-air-raids-hit-gaza-city-southern-lebanon.
  5. “Revised Rolling Text on LAWS from GGE,” United Nations Office for Disarmament Affairs, May 12, 2025, docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-Group_of_Governmental_Experts_on_Lethal_Autonomous_Weapons_Systems_(2025)/Revised_rolling_text_as_of_12_May_2025.pdf.
  6. Sarah Fathallah, “Artificial Intelligence and the Orchestration of Palestinian Life and Death,” Tech Policy Press, August 12, 2025, www.techpolicy.press/artificial-intelligence-and-the-orchestration-of-palestinian-life-and-death/.
  7. “September 2025 GGE Joint Statement,” Stop Killer Robots, September 8, 2025, stopkillerrobots.org/news/september-2025-gge-joint-statement/.
  8. “Ukraine’s Kharkiv Hit by Deadly Russian Drone Attacks,” France 24, November 24, 2025, www.france24.com/en/video/20251124-ukraine-s-kharkiv-hit-by-deadly-russian-drone-attacks.
  9. United Nations General Assembly, Lethal Autonomous Weapons Systems, Resolution A/C.1/80/L.41 (October 14, 2025), reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com25/resolutions/L41_corrected.pdf.
