The people on 4chan who created the images of Ms. Swift thought of it as a sort of game, the researchers said.
Images of Taylor Swift generated by artificial intelligence that spread widely across social media in late January most likely originated as part of a recurring challenge on one of the internet's most notorious message boards, researchers said.
Graphika, a research firm that studies disinformation, traced the images back to one community on 4chan, a message board known for sharing hate speech, conspiracy theories and, increasingly, racist and offensive content created using A.I.
The people on 4chan who created the images of the singer did so in a sort of game, the researchers said — a test to see whether they could create lewd (and sometimes violent) images of famous female figures.
The synthetic Swift images spilled out onto other platforms and were viewed millions of times. Fans rallied to Ms. Swift’s defense, and lawmakers demanded stronger protections against A.I.-created images.
Graphika found a thread of messages on 4chan that encouraged people to try to evade safeguards set up by image generator tools, including OpenAI’s DALL-E, Microsoft Designer and Bing Image Creator. Users were instructed to share “tips and tricks to find new ways to bypass filters” and were told, “Good luck, be creative.”
Sharing unsavory content via games allows people to feel connected to a wider community, and they are motivated by the cachet they receive for participating, experts said. Ahead of the midterm elections in 2022, groups on platforms like Telegram, WhatsApp and Truth Social engaged in a hunt for election fraud, winning points or honorary titles for producing supposed evidence of voter malfeasance. (True proof of ballot fraud is exceptionally rare.)