Drone advances amid the war in Ukraine could bring combat robots to the front lines

KYIV, Ukraine (AP) — The advancement of drones in Ukraine has accelerated a long-awaited technological trend that could soon bring the world’s first fully autonomous combat robots to the battlefield, ushering in a new era of warfare.

The longer the war, the more likely it is that drones will be used to locate, select, and attack targets without human assistance, according to military analysts, combatants, and AI researchers.

This would mark a revolution in military technology as profound as the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and AI-enabled anti-drone weapons. Russia also claims to have AI weapons, though the claims are unproven. But there are no confirmed cases of a country fielding combat robots that have killed entirely on their own.

Experts say it may be only a matter of time before Russia, Ukraine, or both deploy them.

“Many countries are developing this technology,” said Zachary Kallenborn, a weapons innovation analyst at George Mason University. “It’s clearly not that difficult.”

The sense of inevitability extends to activists, who have tried for years to ban killer drones but now believe they must settle for trying to restrict the weapons’ offensive use.

Ukraine’s Minister for Digital Transformation, Mykhailo Fedorov, agrees that fully autonomous killer drones are a “logical and inevitable next step” in weapons development. He said that Ukraine is doing “a lot of research and development in this direction.”

“I think the possibility of this happening in the next six months is great,” Fedorov told the Associated Press in a recent interview.

Ukrainian Lieutenant Colonel Yaroslav Honchar, co-founder of the innovative combat nonprofit Aerorozvidka, said in a recent interview near the front that human fighters simply cannot process information and make decisions as quickly as machines.

He said that Ukrainian military leaders currently prohibit the use of fully autonomous lethal weapons, although that could change.

“We haven’t crossed that line yet — and I say ‘yet’ because I don’t know what will happen in the future,” said Honchar, whose group has spearheaded drone innovation in Ukraine, turning cheap commercial aircraft into lethal weapons.

Russia could obtain autonomous AI from Iran or elsewhere. The Iranian-supplied long-range Shahed-136s have crippled Ukrainian power plants and terrorized civilians, but they are not particularly smart. Iran has other drones in its evolving arsenal that it says feature artificial intelligence.

Without a great deal of trouble, their Western manufacturers say, Ukraine could make its semi-autonomous armed drones fully autonomous in order to better survive battlefield jamming.

These drones include the US-made Switchblade 600 and the Polish Warmate, both of which currently require a human to select targets over a live video feed. Artificial intelligence finishes the job. The drones, technically known as “loitering munitions,” can hover for minutes over a target, waiting for a clean shot.

“The technology to achieve a fully autonomous mission with the Switchblade is very much in place today,” said Wahid Nawabi, CEO of its manufacturer, AeroVironment. Enabling that will require a change in policy — removing the human from the decision-making loop — which he estimates is three years away.

Drones can already identify targets such as armored vehicles using cataloged images. But there is disagreement over whether the technology is reliable enough to ensure the machines do not make mistakes and take the lives of non-combatants.

The AP asked the defense ministries of Ukraine and Russia whether they have used autonomous weapons offensively — and whether they would agree not to use them if the other side agreed in kind. Neither of them responded.

If either side goes on the attack with full AI, it might not even be a first.

An inconclusive UN report suggested that killer robots made their debut in Libya’s internal conflict in 2020, when Turkish-made Kargu-2 drones in fully automatic mode killed an unspecified number of combatants.

A spokesperson for STM, the manufacturer, said the report was based on “unverified, speculative” information and “should not be taken seriously.” He told the AP that the Kargu-2 cannot attack a target until the operator tells it to do so.

Fully autonomous AI is already helping defend Ukraine. Utah-based Fortem Technologies has supplied the Ukrainian military with drone-hunting systems that combine small radars with unmanned aerial vehicles, both powered by artificial intelligence. The radars are designed to identify enemy drones, which the drones then disable by firing nets at them — all without human assistance.

The number of drones equipped with artificial intelligence is growing. Israel has been exporting them for decades. Its radar-killing Harpy can hover over anti-aircraft radar for up to nine hours, waiting for it to switch on.

Other examples include Beijing’s Blowfish-3 armed unmanned helicopter. Russia is developing a nuclear-tipped underwater drone called the Poseidon. The Dutch are currently testing a ground robot with a .50-caliber machine gun.

Honchar believes that Russia, whose attacks on Ukrainian civilians have shown little regard for international law, would have used lethal autonomous drones by now if the Kremlin had them.

“I don’t think they’ll have any scruples,” agreed Adam Bartosiewicz, vice president of WB Group, which makes the Warmate.

Artificial intelligence is a priority for Russia. President Vladimir Putin said in 2017 that whoever controls this technology will rule the world. In a speech on December 21, he expressed confidence in the Russian arms industry’s ability to embed artificial intelligence into war machines, stressing that “the most effective weapon systems are those that operate quickly and practically in an automatic mode.”

Russian officials are already claiming that their Lancet drone can operate with complete autonomy.

“It’s not going to be easy to know if and when Russia crosses that line,” said Gregory C. Allen, former director of strategy and policy at the Pentagon’s Joint Artificial Intelligence Center.

Switching a drone from remote piloting to full autonomy might not even be perceptible from the outside. To date, Allen said, drones capable of operating in both modes have performed better when piloted by a human.

The technology is not particularly complex, said Stuart Russell, a professor at the University of California, Berkeley, and a leading AI researcher. In the mid-2010s, colleagues he surveyed agreed that graduate students could, in a single term, produce an autonomous drone “capable of finding and killing an individual, let’s say, inside a building,” he said.

Efforts to establish international ground rules for military drones have so far been fruitless. Nine years of informal UN talks in Geneva made little progress, with major powers including the United States and Russia opposed to a ban. The last session ended in December without a date being set for a new round.

Policymakers in Washington say they will not agree to a ban because rivals developing drones cannot be trusted to use them ethically.

Toby Walsh, an Australian academic who, like Russell, campaigns against killer robots, hopes to achieve consensus on some of the restrictions, including banning systems that use facial recognition and other data to identify or attack individuals or classes of people.

“If we are not careful, they will spread much more easily than nuclear weapons,” said Walsh, author of Machines Behaving Badly. “If you can get a bot to kill one person, you can get it to kill a thousand people.”

Scientists are also concerned that AI weapons will be reused by terrorists. In one frightening scenario, the US military spends hundreds of millions writing code to operate killer drones. It is then stolen and copied, effectively giving terrorists the same weapon.

To date, Allen, the former Defense Department official, said, the Pentagon has not clearly defined an “AI-enabled autonomous weapon” nor authorized a single such weapon for use by US forces. Any proposed system must be approved by the Chairman of the Joint Chiefs of Staff and two undersecretaries.

This does not prevent weapons from being developed throughout the United States. Projects are underway at the Defense Advanced Research Projects Agency, military laboratories, academic institutions, and the private sector.

The Pentagon has emphasized using artificial intelligence to augment human warriors. The Air Force is studying ways to pair pilots with wingman drones. One backer of the idea, former Deputy Defense Secretary Robert Work, said in a report last month that “it would be crazy not to go autonomous” once AI-enabled systems can outperform humans — a threshold he said was crossed in 2015, when computer vision surpassed that of humans.

Humans have already been edged out of some defensive systems. Israel’s Iron Dome missile shield is authorized to open fire automatically, though it is said to be monitored by a person who can intervene if the system goes after the wrong target.

Numerous countries, and every branch of the US military, are developing drones that can attack in deadly synchronized swarms, according to Kallenborn, the George Mason University researcher.

Will future wars become a battle to the last drone?

This is what Putin foresaw in a 2017 televised conversation with engineering students: “When one party’s drones are destroyed by drones of another, it will have no other choice but to surrender.”

Frank Bajak reported from Boston. Associated Press journalists Tara Copp in Washington, Garance Burke in San Francisco, and Suzan Fraser in Turkey contributed to this report.