
AI-Generated Taylor Swift Porn Went Viral on Twitter. Here's How It Got There

404 Media traced back the viral and abusive images of Taylor Swift to a specific Telegram community. That group sometimes uses Microsoft's AI tools.
Photo by Rosa Rafael / Unsplash

Sexually explicit AI-generated images of Taylor Swift went viral on Twitter after jumping from 4chan and a specific Telegram group dedicated to abusive images of women, 404 Media has found. At least one tool the group uses is a free Microsoft text-to-image AI generator.

Examples viewed by 404 Media had tens of thousands of bookmarks and likes and thousands of reposts. The Verge reported that one of the most viral examples received 45 million views and 24,000 reposts, and stayed up for 17 hours before Twitter removed it, deleted the account of the original poster, and suspended other accounts that posted similar images.

Swifties, being the highly motivated stan army that they are, mobilized to drown out the trending phrase with "protect Taylor Swift" tweets, burying many of the viral abusive images in the search results for "AI Taylor Swift" on Twitter. But other examples remain on the platform as of this writing.

404 Media found that the images originated on 4chan and in a Telegram group dedicated to making non-consensual, AI-generated sexual images of women.
