WASHINGTON, June 7 (Reuters) – The Federal Bureau of Investigation has warned Americans that criminals are increasingly using artificial intelligence to create sexually explicit images to intimidate and blackmail victims.
In an alert issued this week, the agency said it had recently observed an uptick in extortion victims targeted with doctored versions of innocuous photos taken from online posts, private messages or video chats.
“The images are then sent directly to the victims by malicious actors for sextortion or harassment,” the alert said. “Once circulated, victims may face significant challenges in preventing continued sharing of the manipulated content or removal from the Internet.”
The agency said the images appeared “realistic” and that in some cases children had been targeted.
The FBI did not go into detail about the program or programs used to generate the sexual images, but noted that technological advances “continually improve the quality, customization, and accessibility of artificial intelligence (AI)-enabled content creation.”
The agency did not respond to a follow-up message seeking details about the phenomenon Wednesday.
The manipulation of innocuous photos into sexually explicit images is almost as old as photography itself, but the release of open-source AI tools has made the process easier than ever. The results are often indistinguishable from real photographs, and several websites and social media channels specializing in creating and sharing AI-generated sexual imagery have emerged in recent years.
Reporting by Raphael Satter; Editing by David Gregorio