The Era Of AI-Generated 'Fake Porn' Has Arrived

They call them "deepfakes."

It's the term for pornography made using artificial-intelligence-assisted technology to superimpose a person's face onto another performer's body - essentially allowing the producer to create fake porn featuring celebrities, politicians or even average, everyday people.

In a report published Wednesday, Motherboard recounted how it discovered a Reddit user responsible for producing convincing porn videos featuring celebrities such as Gal Gadot, Maisie Williams and Taylor Swift.

Pretty soon, the technology used to create "deepfakes" will be widely available enough to be used by extortionists and criminals with only a cursory understanding of how the software works. Another redditor discovered by Motherboard even created an app specifically designed to let users without a computer-science background create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied by instructions that walk novices through the process.

Two months ago, the first redditor mentioned above created a subreddit dedicated to the practice.

In that short time, the subreddit has already amassed more than 15,000 subscribers. Within the community, the word “deepfake” itself has become a noun for the kind of neural-network-generated fake videos their namesake pioneered, according to Motherboard.

A second "deepfake" auteur, a user who goes by deepfakeapp, created FakeApp, a user-friendly application that allows anyone to recreate these videos with their own datasets. The app is based on deepfakes' algorithm, though deepfakeapp built it without the help of the original deepfakes. While neither user divulged his identity to Motherboard, deepfakeapp said in a direct message that his goal in creating FakeApp was to make deepfakes' technology available to people without a technical background or programming experience.

“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” he said. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
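The technique Motherboard describes - compressing two people's faces through one shared model, then rendering one person's footage as the other - is usually attributed to an autoencoder with a shared encoder and one decoder per identity: train each decoder to reconstruct its own person's faces, then "swap" by decoding person A's footage with person B's decoder. Below is a toy linear sketch of that idea only; the array sizes, random stand-in "face" vectors, and names are placeholders, not anything from FakeApp or the original deepfakes code:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT = 64, 8  # toy "face" vector size and latent size (assumed)

# Shared encoder, one decoder per identity (toy linear version).
E = rng.normal(scale=0.1, size=(LATENT, DIM))
decoders = {"a": rng.normal(scale=0.1, size=(DIM, LATENT)),
            "b": rng.normal(scale=0.1, size=(DIM, LATENT))}

def train_step(x, who, lr=0.01):
    """One gradient step on the reconstruction error ||D_who E x - x||^2."""
    global E  # augmented assignment below rebinds E, so declare it global
    z = E @ x                      # encode with the SHARED encoder
    x_hat = decoders[who] @ z      # decode with this identity's decoder
    err = x_hat - x
    grad_E = np.outer(decoders[who].T @ err, x)  # dL/dE = D^T err x^T
    decoders[who] -= lr * np.outer(err, z)       # dL/dD = err z^T
    E -= lr * grad_E
    return float(err @ err)

def swap(x, target):
    """Encode any face with the shared encoder, decode as `target`."""
    return decoders[target] @ (E @ x)

# Train each decoder on its own identity's (random, stand-in) data.
faces_a = rng.normal(size=(200, DIM))
faces_b = rng.normal(size=(200, DIM))
for xa, xb in zip(faces_a, faces_b):
    train_step(xa, "a")
    train_step(xb, "b")

# The "deepfake": a face from A's footage rendered through B's decoder.
fake = swap(faces_a[0], "b")
```

Real face-swap tools replace the linear maps with convolutional networks and add face detection and alignment per video frame; the sketch only illustrates why one shared encoder and two decoders make the swap possible.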

Peter Eckersley, chief computer scientist at the Electronic Frontier Foundation, fears the technology will soon reach the point where fakes are virtually indistinguishable from authentic videos.

Fakes posted in the subreddit have already been pitched as real on other websites; a deepfake of Emma Watson taking a shower was uploaded by CelebJihad, a celebrity porn site that regularly posts hacked celebrity nudes, as a “never-before-seen video" purportedly from the user's "private collection."

 

[Image: an example of a "deepfake"]

Other redditors have taken models trained on video from celebrities' Instagram accounts and used them to convincingly fake Snapchat messages.

"Deepfakes" are hardly a new phenomenon. Last July, we reported on a project conducted by Stanford's Matthias Niessner that managed to create several faked videos of former US President Barack Obama.

Soon, this technology could create problems for everybody, from governments to corporations to the news media - which will now find it even more difficult to distinguish veritable "Fake News" from reality.

Comments

shitshitshit virgule Fri, 01/26/2018 - 04:08 Permalink

The new trend in AI comes from improved minimization algorithms going hand in hand with better sorting and semantic analysis. 

Not a bad thing overall but not really intelligent. Just very very efficient in doing mundane tasks very fast. 

Exactly what is needed to track down and trace people on the internet.

Ask farcebook. 

In reply to by virgule

Theosebes Goodfellow cookies anyone Fri, 01/26/2018 - 07:12 Permalink

~" Soon, this technology could create problems for everybody, from governments, to corporations to the news media - which will now find it even more difficult to distinguish veritable "Fake News" from reality."~

Wait, don't we have that problem already without this technology? Someone shove a mic in the face of a WaPo or NYT reporter. These mothertruckers can't tell the truth now; imagine them with at-will AI-aided face-swapping.

In the not-too-distant future, they will be able to generate a complete scene w/o the target's presence. What good would a veil or mask do you when your likeness can be hijacked, invariably to do you harm?

 

In reply to by cookies anyone

Kefeer Theosebes Goodfellow Fri, 01/26/2018 - 07:25 Permalink

The whole idea of "fake news", which has almost always been biased opinion, is designed to make people take sides and/or question everything to the point where they do not know what to think or believe.  Divide and conquer; which is why the world-view from which you filter your thoughts is so important, because your thoughts will translate to action and words.  Sadly, for most it is secular humanism, where truth/morality changes with the direction of the wind or culture.

Always learning, but never able to come to the knowledge of the Truth!  Makes for an emptiness within oneself that one cannot fill nor understand.  God's word is the standard by which all morality will be judged because it is truth, which does not change.  Marriage, for example, is between one man and one woman - everything else is opinion based on cultural bias, not truth or that which is "natural".

In reply to by Theosebes Goodfellow

DCFusor philipat Fri, 01/26/2018 - 09:22 Permalink

Yup.  I've posted much the same on a bunch of tech sites too dumb to see the other side of the coin.  In a fair world (which we don't have, I know), this means all video - and, with some other tech, audio - evidence has to be considered fake till proven otherwise, and we know there have been massive problems in "chain of evidence" by the authorities - drug labs taking the drugs to test them, cameras used to "re-enact" crime scenes after planting evidence, and so on.

But it does more obviously put things back to a "he said, she said" situation, where it's all what someone believes and the physical "evidence" is meaningless.  Now, of course, courts tend to believe people wearing blue uniforms more than a citizen, but 'twas always thus.  If they want you, they'll find a way.  But private blackmail?  You should be able to just laugh it off if that's the best someone can do.

In reply to by philipat

toady philipat Fri, 01/26/2018 - 10:54 Permalink

Exactly. Plausible deniability. The applications for this go well beyond porn, but, once again, porn is leading the way.

Imagine, terrorists have taken control of a nuke silo/mobile launcher somewhere (us/india/paki/china/russia) but no one is aware yet.... put that country's leader's face on a body and announce the attack...

Hell, you don't even need to take control of a nuke, just wait for a rocket test, any rocket test, then broadcast as that country's leader...

The possibilities are endless...

In reply to by philipat

aurum4040 ACP Fri, 01/26/2018 - 00:22 Permalink

It may for a while. But this is yet another reason why the POE platform will be huge. Every digital asset will be verifiable via the POE platform. In the event a fake video such as this is submitted for POE verification, it will be able to be deconstructed and found out. Blockchain in general will weed this stuff out, and videos such as this will not be taken seriously. Eventually.

In reply to by ACP

GUS100CORRINA Bes Fri, 01/26/2018 - 00:10 Permalink

The Era Of AI-Generated 'Fake Porn' Has Arrived

Augmented reality!!!! Pretty soon the images we see on TV will be COMPUTER GENERATED hallucinations including political messages.

For example, HRC will be dead in a year or so because she was diagnosed with a condition known as subcortical vascular dementia in 2013. Were they planning to feed us information from the POTUS with a COMPUTER GENERATED hallucination if HRC was POTUS?

Think about it!!! Really scary stuff and we are NOT out of the woods yet. Also, according to Q-ANON, "OBOZO" is all lawyered up with law firm Wilkinson and Walch located in Washington DC.

See link below from the STILL REPORT.

https://www.youtube.com/watch?v=tLgcWCEOMtM

 

In reply to by Bes

jmack Bes Fri, 01/26/2018 - 03:10 Permalink

You are missing a quick secondary use.  #1 Spoofing will go video: someone will spoof your loved one and get private info, perhaps even bank codes, from you.

 

#2  What if someone fakes a trump video of him announcing a nuclear attack on north korea... it will roil the markets at least for 10 or 15 minutes.

 

#3   Using some big financial figure to micro target people into scams... Such as a deepfake ray dalio recommending a shitcoin ICO.  

In reply to by Bes

Implied Violins Thu, 01/25/2018 - 23:56 Permalink

To heck with PornHub, I'm just gonna start keeping the lube near when I'm on ZH.  I get all my news, porn, and humor here now.  All they need are some articles on beer and I'd never have to visit another site.