'Deepfake' Technology Could Put Anyone In Explicit Videos, So These Websites Are Taking Action

“Creating fake sex scenes of celebrities takes away their consent. It’s wrong.”

As we saw in the 2014 nude photo leaks, some Internet users care very little for the privacy of others. Their latest offense is pornography made with computerized face-swap technology, which can convincingly superimpose the faces of public figures onto the bodies of pornography actors. These videos are called "deepfakes," and web platforms such as Gfycat and Twitter are endeavoring to stop them.
According to The Guardian, deepfakes have become more prevalent since the January release of a desktop app that streamlines the underlying machine-learning technology. On Reddit's former "deepfakes" subreddit, which had 90,000 subscribers, content creators used this tech for videos both innocuous — such as widely shared Nicolas Cage face-swaps — and explicit. In particular, users have been putting the faces of female celebrities into pornography clips. Motherboard reports that one Redditor has made hardcore pornography videos featuring the faces of Scarlett Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza, and Gal Gadot. Two weeks ago, The Guardian reported that Daisy Ridley, Sophie Turner, and Emma Watson have also appeared in deepfakes.

"One important thing that always needs to happen is consent," porn performer Grace Evangeline told Motherboard. "Consent in private life as well as consent on film. Creating fake sex scenes of celebrities takes away their consent. It's wrong."

And though public figures have been the primary targets so far, it's conceivable that anyone — famous or otherwise — could fall victim to this kind of digital manipulation. "This new type of fake porn shows that we're on the verge of living in a world where it's trivially easy to fabricate believable videos of people doing and saying things they never did," Motherboard's Samantha Cole wrote. "Even having sex."

Gfycat, a GIF-sharing platform, has banned deepfake pornography, citing its terms of service, which empower its moderators to remove "objectionable" content.

Now Twitter has followed suit. The social network does allow adult content, as long as it is flagged as "sensitive," but it will not allow deepfake pornography. "We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject's consent," the company told Motherboard this month. (When a reporter for that website directed Twitter to an offending account called @mydeepfakes, the company suspended the account within hours.)

Even Pornhub — a streaming service for sexually explicit videos — is taking action against deepfakes. "[Pornhub takes] a hard stance against revenge porn, which we believe is a form of sexual assault… Regarding deepfakes, [u]sers have started to flag content like this and we are taking it down as soon as we encounter the flags."

Per The Guardian, moderation on Reddit is volunteer-based, though its centrally enforced rules do prohibit "involuntary pornography," or "the posting of photographs, videos, or digital images of any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission." On Feb. 7, after Pornhub, Twitter, and Gfycat had already taken action, Reddit removed the "deepfakes" subreddit.
