
Dear Taylor Swift, we’re sorry about those explicit deepfakes

You have a platform and the power to convince lawmakers across the board that rules to combat these sorts of deepfakes are a necessity.

[Image: Taylor Swift performs in front of a giant screen amplifying her face. Kevin Winter/Getty]

This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.

Hi, Taylor.

I can only imagine how you must be feeling after sexually explicit deepfake videos of you went viral on X. Disgusted. Distressed, perhaps. Humiliated, even. 

I’m really sorry this is happening to you. Nobody deserves to have their image exploited like that. But if you aren’t already, I’m asking you to be furious. 

Furious that this is happening to you and so many other women and marginalized people around the world. Furious that our current laws are woefully inept at protecting us from violations like this. Furious that men (because let’s face it, it’s mostly men doing this) can violate us in such an intimate way and walk away unscathed and unidentified. Furious that the companies that enable this material to be created and shared widely face no consequences either, and can profit off such a horrendous use of their technology. 

Deepfake porn has been around for years, but its latest incarnation is the worst yet. Generative AI has made it ridiculously easy and cheap to create realistic deepfakes, and nearly all deepfakes are made for porn. A single image plucked from social media is enough to generate something passable. Anyone who has ever posted a photo online, or had one published of them, is a sitting duck. 

First, the bad news. At the moment, we have no good ways to fight this. I just published a story looking at three ways we can combat nonconsensual deepfake porn, which include watermarks and data-poisoning tools. But the reality is that there is no neat technical fix for this problem. The fixes we do have are still experimental and haven’t been adopted widely by the tech sector, which limits their power. 

The tech sector has thus far been unwilling or unmotivated to make changes that would prevent such material from being created with their tools or shared on their platforms. That is why we need regulation. 

People with power, like yourself, can fight with money and lawyers. But low-income women, women of color, women fleeing abusive partners, women journalists, and even children are all seeing their likeness stolen and pornified, with no way to seek justice or support. Any one of your fans could be hurt by this development. 

The good news is that because this happened to you, politicians in the US are listening. You have a rare opportunity, and momentum, to push through real, actionable change. 

I know you fight for what is right and aren’t afraid to speak up when you see injustice. There will be intense lobbying against any rules that would affect tech companies. But you have a platform and the power to convince lawmakers across the board that rules to combat these sorts of deepfakes are a necessity. Tech companies and politicians need to know that the days of dithering are over. The people creating these deepfakes need to be held accountable. 

You once caused an actual earthquake. Winning the fight against nonconsensual deepfakes would have an even more earth-shaking impact.

