White House, Microsoft, SAG-AFTRA Respond to Crude AI Images of Taylor Swift

"This is very alarming," the White House says

Taylor Swift, photo via Getty

    Sexually explicit AI-generated images of Taylor Swift have caused alarm across the board, with major centers of power — including the White House, Microsoft, SAG-AFTRA, and more — now weighing in on what the controversy means for the future of AI, and what steps are being taken to prevent further incidents like this one.

    The scandal began when fake, crude images depicting Swift started circulating on social media this week, with one image gaining over 47 million views on Twitter alone before the account that posted them was suspended due to mass-reporting by Swifties. According to 404 Media, the viral images were traced back to a Telegram group chat where members shared AI content, sometimes made using Microsoft’s generative-AI tool, Designer.

    Now, speaking to NBC News, Microsoft CEO Satya Nadella has said the company finds the images “alarming and terrible,” and feels the pressure to “move fast” to combat nonconsensual sexually explicit deepfake images.

    “Yes, we have to act,” Nadella said. “I think we all benefit when the online world is a safe world. And so I don’t think anyone would want an online world that is completely not safe for both content creators and content consumers.” Continuing, Nadella asserted that it is Microsoft’s “responsibility” to place “guardrails” around their technology, “so that there’s more safe content that’s being produced… there’s a lot to be done and a lot being done there.”

    For their part, Joe Biden’s administration expressed concern over the images. “This is very alarming,” White House Press Secretary Karine Jean-Pierre said at a news briefing, per Reuters. She then suggested that Congress could take legislative action on the issue, and called on social media companies to prevent the spread of such content.

    “While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and nonconsensual, intimate imagery of real people,” Jean-Pierre said.

    Weighing in, SAG-AFTRA released an official statement calling for “the development and dissemination of fake images — especially those of a lewd nature — without someone’s consent” to be made officially illegal. “As a society, we have it in our power to control these technologies, but we must act now before it is too late,” the statement said, expressing support for the Preventing Deepfakes of Intimate Images Act, a new piece of legislation sponsored by US Rep. Joe Morelle.

    SAG-AFTRA’s statement also drew a comparison between the Swift images and a recent story regarding an AI-generated “comedy special” from a bot imitating George Carlin, which is the subject of a new lawsuit brought by Carlin’s estate.

    Expressing support for Carlin’s family — stating that “families should not have to see their loved ones exploited for profit” — the SAG-AFTRA statement concluded by saying that “everything generated by AI originates from a human creative source and human creativity must be protected.”

    In less controversial AI news, Guns N’ Roses recently shared an AI-generated music video for their song “The General.” Additionally, Jimmy Stewart and Edith Piaf are both being resurrected (consensually) for new projects.

    As for the real-life Taylor Swift, the new controversy probably won’t help with the “Pentagon psyop” accusations from Fox News, but that won’t stop her momentum. Next month, she’s set to launch the latest leg of “The Eras Tour,” which will run through December. Check out the full list of upcoming dates, and grab your tickets for her North American stops here (for her international dates, secure your seats here).
