Commentary: What killer robots mean for the future of war
We have seen how so-called neutral AI systems have produced sexist algorithms and inept content moderation. In war, these kinds of misunderstandings could kill civilians or wreck negotiations, say these writers.
Jonathan Erskine
Miranda Mowbray
16 Jan 2023 06:07AM
(Updated: 16 Jan 2023 06:07AM)
BRISTOL: You might have heard of killer robots, slaughterbots or terminators - officially called lethal autonomous weapons - from films and books. The idea of super-intelligent weapons running rampant is still science fiction. But as artificial intelligence (AI) weapons become increasingly sophisticated, public concern is growing over the lack of accountability and the risk of technical failure.
Already we have seen how so-called neutral AI systems have produced sexist algorithms and inept content moderation systems, largely because their creators did not understand the technology. But in war, these kinds of misunderstandings could kill civilians or wreck negotiations.
For example, a target recognition algorithm could be trained to identify tanks from satellite imagery. But what if every image used to train the system featured soldiers in formation around the tank? The algorithm might learn to associate the soldiers, rather than the tank itself, with a valid target - and could then mistake a civilian vehicle passing through a military blockade for one.
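To make this failure mode concrete, here is a minimal, hypothetical sketch (not from the authors of this piece) of how such a shortcut can arise. It trains a toy classifier on invented data in which soldiers always appear alongside tanks, so the model ends up keying on the soldiers; the feature names and numbers are purely illustrative, and it assumes NumPy and scikit-learn are available.

```python
# Illustrative sketch: a spurious correlation in training data makes a
# "tank detector" key on the wrong feature (soldiers nearby, not the tank).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Two crude features per image: [tank_signature, soldiers_nearby].
# In the biased training set, every tank image also shows soldiers in
# formation, and every non-tank image shows neither.
tanks = np.column_stack([rng.normal(1.0, 0.3, n), np.ones(n)])
empty = np.column_stack([rng.normal(0.0, 0.3, n), np.zeros(n)])
X = np.vstack([tanks, empty])
y = np.concatenate([np.ones(n), np.zeros(n)])

model = LogisticRegression().fit(X, y)
print("feature weights [tank_signature, soldiers_nearby]:", model.coef_)

# A civilian vehicle passing a blockade: no tank signature, soldiers nearby.
civilian_at_blockade = np.array([[0.0, 1.0]])
# A tank with no soldiers around it.
lone_tank = np.array([[1.0, 0.0]])

print("P(target | civilian at blockade):",
      model.predict_proba(civilian_at_blockade)[0, 1])  # high
print("P(target | lone tank):",
      model.predict_proba(lone_tank)[0, 1])             # low
```

In this toy setup the noiseless "soldiers nearby" feature separates the training data perfectly, so the model leans on it: a civilian vehicle surrounded by soldiers is flagged as a target while a lone tank is not, mirroring the misidentification risk described above.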
WHY DO WE NEED AUTONOMOUS WEAPONS?