To Innovation - Breaking Politics, Economics, Crypto & IT News

Reporting from Watford, UK and LA, US since 1996


20 Feb 2023, 7:54 AM
By Shawn Highstraw
Photo: Time


Call to ban lethal autonomous weapons will be in vain
There are hardly any rules for the use of military artificial intelligence. With AI developing so quickly, there should be. If a weapon decides on its own to shoot at a target, can anyone be held responsible?

We need to act quickly to use military AI in the right way. That was the conclusion reached by Dutch Foreign Minister Wopke Hoekstra on Thursday afternoon during the closing ceremony of REAIM, the summit on Responsible AI in the Military Domain. For two days, the World Forum in The Hague focused on major issues surrounding artificial intelligence.

The conference revolved around the 'responsible use of AI in the military domain'. Think of killer robots: weapons that operate completely independently, without human intervention, with the help of AI. These types of weapons are not yet used in conflicts, but they are under development.

Call for a ban on autonomous weapons

Amnesty International is concerned about this. "Warfare is never clean or orderly," said Agnès Callamard, Secretary-General of the human rights organization. "Wars are nasty and exploit bias. You can instruct AI to take out targets with a certain background or certain external characteristics. AI makes weapons more precise and more deadly. And don't think that only people with good intentions use these types of weapons."

So something has to be done quickly, says Callamard. She calls for binding international regulations, and she is not alone in this. "Governments need to set clearer rules about what can and cannot be done," says Marietje Schaake, director of international policy at Stanford's Cyber Policy Center. "Think of a ban on self-thinking weapons."

British computer scientist Stuart Russell fervently advocates such a ban. "If we do not prohibit or at least limit these types of developments, they will become weapons of mass destruction," he says. "One person can press a button and kill millions of people. There are already international treaties prohibiting the use of biological weapons. We have to do the same with these."
