
A guide to 'deepfakes,' the internet's latest moral crisis

The new trend in porn will freak you out.
By Damon Beres and Marcus Gilmer

Where there's innovation, there's masturbation — at least in one dark corner of the internet, where nearly 80,000 people have gathered to share fabricated videos of celebrity women having sex and Nicolas Cage uncovering the Ark of the Covenant.

These are "deepfakes," a new kind of video featuring realistic face-swaps. In short, a computer program finds common ground between two faces and stitches one over the other. If the source footage is good enough, the transformation is nearly seamless.

The technology is relatively easy to use, which has created an enthusiast community on Reddit, where users compare notes and swap their latest work: "Emma Watson sex tape demo ;-)," "Lela Star x Kim Kardashian," and "Giving Putin the Trump face" among them.


Motherboard did foundational reporting on deepfakes in December and continues to cover the trend, with despairingly predictable news last week that people are using the technology to create porn starring friends and classmates. But legal and computer science experts told Mashable that the technology's grimier applications shouldn't overshadow its potential for good, even if it's difficult to see the upside when non-consenting stars are being jammed into hardcore sex scenes with hundreds of thousands of views on Pornhub and Reddit.

Reddit didn't respond to requests for comment over the course of a week, but Pornhub said it will remove deepfakes from its platform.

"Users have started to flag content like this, and we are taking it down as soon as we encounter the flags," Corey Price, PornHub's vice president, said. "We encourage anyone who encounters this issue to visit ourcontent removal page so they can officially make a request."

Still, to be very clear: All of this should freak you out.

Original image replaced with Mashable logo. Credit: Mashable

Above, we see Gal Gadot's face superimposed onto a porn actress, moments before she pulls her shirt off and gets felt up. Consent didn't factor into the equation for the Redditor who made this clip, and a casual observer wouldn't know the video is fake if they received the file from a friend via text message or email, because the transformation is so well done.

The issue is pretty simple: A person who has not consented to a sexual situation should not be put into that situation, whether in physical or virtual life. But the genie is out of the bottle, and it's staying there. "Gal Gadot" remains one of the top terms associated with deepfake searches on Google, according to the company's own Trends data.

This underscores the urgency of the problem, even if it's an emerging one. Content published to the internet can be hard to erase, particularly when there's a group of people invested in duplicating and spreading it. People could stop creating new deepfakes tomorrow, but Gal Gadot's clips could live on indefinitely.

Want help? It's murky

There's not much legal recourse for those who fall victim to this new technology, according to Jonathan Masur, a professor who specializes in patent and technology law at the University of Chicago Law School. That's true even for private citizens.

"There's the copyright claim, if you took the [footage] yourself. There's the defamation claim if someone tries to say that it's actually you. And if you're a celebrity, there's a right to publicity claim if someone is trying to make money off of it," Masur explained. "But each of those is just a narrow slice of what's going on here that won't cover the vast majority of situations."

Many of these videos acknowledge they're fake, which undermines a defamation argument.

"[You] could try to make a case it represents a form of defamation if you're attacking the reputation of someone, but that's also pretty hard to do because, by definition, you're not alleging you're posting a pornographic picture of that individual," he said.


And, no, recent efforts to ban revenge pornography, led by Mary Ann Franks and Danielle Citron, wouldn't apply in these cases, because those laws pertain to the release of private images or video of an individual.

"There's no pornographic picture of the actual individual being released," Masur said. "It's just the individual's face on someone else's body."


There aren't any laws against this practice yet, nor have any been introduced. Tackling deepfakes via new legislation would be tricky, as doing so would bump up against the First Amendment.

"From a civil liberties perspective, I am... concerned that the response to this innovation will be censorial and end up punishing and discouraging protected speech," David Greene, the civil liberties director at theElectronic Frontier Foundation, a nonprofit focused on digital free speech, said.

"It would be a bad idea, and likely unconstitutional, for example, to criminalize the technology," he added.

The unexpected upside

Greene's concerns may not be unfounded. Though deepfakes are now synonymous with porn, the basic concept behind the technology is facial recognition, which theoretically has a lot of upside to be explored.

You may already be familiar with basic, live facial recognition from apps like Snapchat. The technology is programmed to map faces according to "landmark" points. These are features like the corners of your eyes and mouth, your nostrils, and the contour of your jawline.
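One widely used version of this mapping is a 68-point annotation scheme (popularized by the dlib library's shape predictor), which assigns fixed index ranges to each facial feature. The sketch below is a minimal, pure-Python illustration of how those landmark points are grouped into named regions; the coordinates are hypothetical stand-ins for real detector output, not the product of an actual face detector.

```python
# Grouping facial landmarks into named regions, following the common
# 68-point convention (jawline, eyebrows, nose, eyes, mouth).

LANDMARK_REGIONS = {
    "jawline": range(0, 17),
    "right_eyebrow": range(17, 22),
    "left_eyebrow": range(22, 27),
    "nose": range(27, 36),
    "right_eye": range(36, 42),
    "left_eye": range(42, 48),
    "mouth": range(48, 68),
}

def split_landmarks(points):
    """Group a flat list of 68 (x, y) landmarks into named facial regions."""
    if len(points) != 68:
        raise ValueError("expected 68 landmark points")
    return {name: [points[i] for i in idx]
            for name, idx in LANDMARK_REGIONS.items()}

# Hypothetical detector output: 68 dummy points for illustration.
fake_points = [(i, i) for i in range(68)]
regions = split_landmarks(fake_points)
print(len(regions["mouth"]))  # 20 points (indices 48-67)
```

Once a face is reduced to regions like these, an app can warp, replace, or decorate each region independently, which is the basis for both Snapchat-style effects and face swaps.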

Snapchat is pretty good at understanding your face and applying transformative effects, which augment your features:

Original image replaced with Mashable logo. Credit: Mashable

But its face-swapping feature leaves something to be desired:

Original image replaced with Mashable logo. Credit: Mashable

Part of that has to do with Snapchat working in real-time — it's trading speed for accuracy.

Deepfakes work differently. The "FakeApp" program uses artificial intelligence to complete three major steps: alignment, training, and merging. Instead of placing one face over another in real time, FakeApp uses hundreds of still-frame images pulled from video footage. It digs through all of those images, identifies faces, and analyzes how they're lit, what expressions they're making, and so on. Once the program understands the faces it's working with, it can use all of its "knowledge" to stitch one over the other.
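The three-stage flow can be sketched as a simple pipeline. The pure-Python skeleton below is deliberately simplified: the class and method names are invented for this illustration and are not FakeApp's actual interface, and the "training" step stands in for the deep-autoencoder learning the real program performs on thousands of frames.

```python
# A toy sketch of the align -> train -> merge pipeline described above.

class FaceSwapPipeline:
    def __init__(self):
        self.examples = []  # paired (source, target) aligned face crops

    def align(self, frames):
        """Step 1: find the face in each frame and normalize its pose.

        Stand-in: treat each frame as already containing one cropped face.
        """
        return [{"face": f, "aligned": True} for f in frames]

    def train(self, src_faces, dst_faces):
        """Step 2: learn a mapping from source faces to target faces.

        Stand-in for autoencoder training: just pair the frames up.
        Returns the number of training pairs.
        """
        self.examples = list(zip(src_faces, dst_faces))
        return len(self.examples)

    def merge(self, frame_index):
        """Step 3: stitch the converted face back into the target frame."""
        src, dst = self.examples[frame_index]
        return {"frame": dst["face"], "swapped_in": src["face"]}

pipeline = FaceSwapPipeline()
src = pipeline.align(["src_frame_0", "src_frame_1"])
dst = pipeline.align(["dst_frame_0", "dst_frame_1"])
pipeline.train(src, dst)
print(pipeline.merge(0))
```

The key difference from Snapchat's live swap is that all of the expensive analysis happens offline, across the whole clip, before a single output frame is produced.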

Though it's been put to a gross purpose, deepfakes' seamlessness could be an encouraging sign, depending on your perspective. With enough development, real-time face swaps could achieve similar quality to deepfakes, which may have therapeutic uses, according to Dr. Louis-Philippe Morency, director of the MultiComp Lab at Carnegie Mellon University.

"This technology has very important applications apart from entertainment," he said.

One moonshot example: Dr. Morency said soldiers suffering from post-traumatic stress disorder could eventually video-conference with doctors using similar technology. An individual could face-swap with a generic model without sacrificing the ability to convey his or her emotions. In theory, this would encourage people to get treatment who might otherwise be deterred by a perceived stigma, and the quality of their treatment wouldn't suffer due to a doctor being unable to read their facial cues.

Another one of Dr. Morency's possibilities — and its own can of worms — would be to use models in video interviews to remove gender or racial bias when hiring. But for any of this to happen, researchers need more data, and open-source, accessible programs like FakeApp can help create that data.

"The way to move forward with AI research is to share the code, and share the data. This is an enabler for AI research," Dr. Morency said.

It somehow gets worse

As with many emerging technologies, the scariest part may be unseen. When Facebook first rolled out on college campuses, few would have anticipated its transformation into a multimedia Goliath that potentially destabilized American democracy as we knew it — but here we are.

Like the "fake news" that has exhausted so many of us on Facebook, deepfakes represent yet another capacity for the internet to breach our shared reality. If every video clip could potentially be fake, why believe anything is real?

And so, expect the response from your unborn grandchild: "Raiders of the Lost Ark? You mean the one with Nicolas Cage?"


Damon Beres

Damon Beres is an Executive Editor at Mashable, overseeing tech and science coverage. Previously, he was Senior Tech Editor at The Huffington Post. His work has appeared in Reader's Digest, Esquire.com, the New York Daily News and other fine outlets.


