“This technology is our future threat,” warns Serhiy Beskrestnov as he examines a newly captured Russian drone. It is unlike anything he has seen. Controlled by artificial intelligence, it can find and strike targets without human intervention.
Beskrestnov, a consultant for Ukraine’s defence forces, has analysed countless drones since the conflict began. But this one stands out. It neither sends nor receives signals, making it impossible to jam or trace.
Both Russian and Ukrainian forces now test AI in combat. They use it to locate enemies, analyse intelligence, and clear mines faster than ever.
Artificial intelligence becomes a battlefield game-changer
For Ukraine’s military, AI has become essential. “Our forces receive over 50,000 video streams from the front every month,” says Deputy Defence Minister Yuriy Myronenko. “Artificial intelligence analyses the footage, identifies threats, and maps targets.”
AI helps commanders make faster decisions, use resources efficiently, and reduce casualties. Its biggest impact comes in unmanned systems. Ukrainian troops now operate drones that lock onto targets and fly autonomously for the final stretch of a mission.
During that autonomous final phase, these drones cannot be jammed, and they are hard to shoot down. Experts predict they will soon evolve into fully autonomous weapons capable of finding and destroying targets on their own.
Drones that strike independently
“All a soldier needs to do is press a button on a smartphone,” explains Yaroslav Azhnyuk, CEO of the Ukrainian tech firm The Fourth Law. “The drone will find its target, release explosives, assess the damage, and return to base. No piloting skills are required.”
Azhnyuk says these drones could strengthen Ukraine’s air defences against Russian long-range drones such as the Shaheds. “A computer-guided system can outperform humans,” he notes. “It reacts faster, sees better, and moves more precisely.”
Myronenko admits full autonomy is still in development but says Ukraine is close. “We have partly integrated it into some devices,” he confirms. Azhnyuk predicts thousands of these systems could be active by the end of 2026.
Balancing progress with risk
Ukrainian developers remain cautious about fully automated weapons. “AI might not tell a Ukrainian soldier from a Russian one,” warns Vadym, a defence engineer who requested anonymity. “Their uniforms often look the same.”
Vadym’s company, DevDroid, makes remotely controlled machine guns that use AI to detect and track people. They have disabled automatic firing to prevent friendly fire. “We could enable it,” he says, “but we need more experience and feedback from troops to ensure safety.”
Ethical and legal concerns remain. Can AI obey the laws of war? Will it recognise civilians, or soldiers trying to surrender? Myronenko believes humans must make the final call, even if AI assists. Yet he acknowledges that not every force will act responsibly.
The global AI arms race
AI is creating a new and unpredictable arms race. Traditional defences such as jamming, missiles and tanks struggle against swarms of intelligent drones.
Ukraine’s “Spider Web” operation last June, when 100 drones hit Russian air bases, reportedly used AI coordination. Many fear Russia could replicate these tactics, both at the front and deep inside Ukraine.
President Volodymyr Zelensky told the United Nations that AI is driving “the most destructive arms race in human history.” He called for urgent global rules to govern AI weapons, saying the challenge is “as urgent as preventing nuclear proliferation.”