Air Force official’s musings on rogue drone targeting humans go viral
By Stephen Losey and Colin Demarest
Jun 2, 2023, 11:41 AM
The U.S. Air Force is developing a fleet of artificial intelligence-enabled drone wingmen to fly alongside piloted fighters. But a "thought experiment" from the service that went viral highlights the potential dangers of granting autonomous drones too much authority without ethical safeguards.
WASHINGTON — The U.S. Air Force walked back comments reportedly made by a colonel regarding a simulation in which a drone outwitted its artificial intelligence training and killed its handler, after the claims went viral on social media.
Air Force spokesperson Ann Stefanek said in a June 2 statement that no such testing took place, adding that the service member’s comments were likely “taken out of context and were meant to be anecdotal.”
“The Department of the Air Force has not conducted any such AI-drone simulations and remains committed to ethical and responsible use of AI technology,” Stefanek said. “This was a hypothetical thought experiment, not a simulation.”
The killer-drone-gone-rogue episode was initially attributed to Col. Tucker “Cinco” Hamilton, the service’s chief of AI test and operations, in a recap of the Royal Aeronautical Society’s FCAS23 Summit in May. The summary was later updated with additional comments from Hamilton, who said he misspoke at the conference.
https://www.armytimes.com/unmanned/uas/2023/06/02/air-force-officials-musings-on-rogue-drone-targeting-humans-go-viral/