Deep Fakes May Be Deep Trouble
A few days ago, I was watching one of my more favored Steemians' videos on Dlive, and she brought up this thing called Deep Fakes. That was the second time I had heard about it, and then it came up again today on one of my favorite YouTube channels. So, using the logic of "the first time is coincidence, the second time is happenstance, the third time is enemy action," clearly I have to write about this.
So what are Deepfakes? In a nutshell, it's AI that uses deep learning to identify facial characteristics, map out a person's face, and paste it onto someone else in a video. This isn't a sloppy cut-and-paste job; it looks very real. The latest news has been about actresses getting their faces put into porn videos. So I guess the next question is: how long will it be before Trump, Putin, Xi Jinping, Kim Jong Un, or some other character pops up in a video saying something that is either economically disruptive or sparks some sort of hostility or mass hysteria? What is the plausibility of this actually happening? And has it already happened?
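To make that "map out a person's face" idea a little more concrete, here is a minimal Python sketch of the very first step these tools perform: locating the face region in each frame. It uses OpenCV's bundled Haar cascade, and the file name frame.jpg is just a stand-in for a frame pulled from a video. The actual swap is done by deep neural networks trained on thousands of frames of both faces, so treat this as an illustration of the pipeline's front end, not a working deepfake generator.

```python
# Sketch of the first step in any face-swap pipeline: find the face in a frame.
# Assumes opencv-python is installed; "frame.jpg" is a hypothetical input.
import cv2

# Haar cascade file ships with OpenCV
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("frame.jpg")                     # one frame from a video
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Returns a list of (x, y, width, height) boxes, one per detected face
faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    face_region = frame[y:y + h, x:x + w]
    # A deepfake tool would align this crop, run it through its trained
    # network to generate the target person's face, then paste it back.
    print(f"Found a face at ({x}, {y}), size {w}x{h}")
```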
Apparently the technology is so good that it can reproduce facial expressions. Likewise, the capability for voice duplication and manipulation makes this a real issue. Imagine a video surfaces on the "not so credible" mainstream news media with one world leader making accusations against another as a precursor to inevitable war. How quickly would the world catch on that it was a hoax? Would it be fast enough to save the financial markets from plummeting, or to stop panic from causing the grocery stores to sell out of everything on the shelves?
Photo manipulation isn't something new, but video manipulation is, and because it's video, it feels more real to the viewer. Fortunately, detection tools already exist for photos. I guess I should probably get to work developing a detection tool for video manipulation; a crude starting point is sketched below. For the moment, we should be paying attention to body type, bad lip syncing, things in the video that are just flat-out crazy, and, of course, always question the source.
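Since I brought up detection, here is a very rough sketch of the kind of automated check such a tool could start with, again in Python with OpenCV. The clip name and the 0.5 threshold are invented for illustration. It flags frames where the detected face region is much blurrier than the rest of the frame, which was a telltale artifact of early face-swapped video where a generated face was pasted over the original. This is nowhere near a real forensic detector, just a toy heuristic.

```python
# Toy deepfake heuristic: compare sharpness inside vs. outside the face box.
# "suspicious_clip.mp4" and the 0.5 threshold are made-up example values.
import cv2

def sharpness(image_gray):
    # Variance of the Laplacian: a common, crude measure of image sharpness
    return cv2.Laplacian(image_gray, cv2.CV_64F).var()

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("suspicious_clip.mp4")
suspicious_frames = 0
total_frames = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    total_frames += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.1, 5)
    for (x, y, w, h) in faces:
        face_sharp = sharpness(gray[y:y + h, x:x + w])
        frame_sharp = sharpness(gray)
        # If the face is far blurrier than its surroundings, flag the frame
        if frame_sharp > 0 and face_sharp < 0.5 * frame_sharp:
            suspicious_frames += 1
            break

cap.release()
print(f"{suspicious_frames} of {total_frames} frames look suspicious")
```

Real forensic work looks at much subtler cues than this (blink rates, lighting consistency, compression artifacts), but the point is the same as with the human checks above: look for the places where the pasted face doesn't quite match the rest of the video.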