It Was Only a Matter of Time Before Internet Trolls Made More Sophisticated Fake Porn Videos

Fake porn involves manipulating a video or photo by putting someone else’s face on a porn star’s body. In recent months, a growing group of Reddit users has been using machine-learning algorithms to swap celebrities’ faces into porn scenes. And now the practice seems to have entered its grossest, and most personal, phase yet.

While the latest development in fake porn videos is particularly disturbing, it’s unsurprising in hindsight: the predictable culmination of just about every problem that currently exists online. It’s an amalgamation of toxic online communities, gross invasions of privacy, the abuse of online manipulation tools, and revenge porn.

Fake porn isn’t new: People have been exploiting exes and celebrities online for years using photo-editing software like Photoshop. But more powerful (and free) machine-learning tools now let users create far more realistic fake footage, so long as they have a few hours of spare time and enough images of their victim.

It’s not as simple as downloading an app, uploading a victim’s profile photo ripped from Facebook or Instagram, and pressing go. A user needs a large collection of photos of the targeted person. To gather enough images, they can turn to open-source photo-scraping tools, which grab photos of someone that are publicly available online, and then use free web-based tools to find a porn star who resembles their victim.

What we are seeing isn’t a completely new practice, but rather the predictable consequence of shitty people gaining access to more sophisticated technology. It’s emblematic of a practice that, thanks to AI, can thrive in the way it was always destined to.
