Deepfakes use AI to generate completely new video or audio, with the end goal of portraying something that didn't actually occur in reality. The term "deepfake" comes from the underlying technology - deep learning algorithms - which teach themselves to solve problems with large sets of data and can be used to create fake content of real people.

"A deepfake would be footage that is generated by a computer that has been trained through countless existing images," said Cristina López, a senior analyst at Graphika, a firm that researches the flow of information across digital networks.

Deepfakes aren't just any fake or misleading images. The AI-generated pope in a puffer jacket, or the fake scenes of Donald Trump being arrested that circulated shortly before his indictment, are AI-generated, but they're not deepfakes. What separates a deepfake is the element of human input. When it comes to deepfakes, the user only gets to decide at the very end of the generation process whether what was created is what they want; outside of tailoring training data and saying "yes" or "no" to what the computer generates after the fact, they don't have any say in how the computer chooses to make it.

Note: Computer-assisted technologies like Photoshop and CGI are commonly used to create media, but the difference is that humans are involved in every step of the process. "You have a lot of AI assistance with CGI, but at the end of the day there is a human with a human viewpoint controlling what the output is going to be," López said.

There are several methods for creating deepfakes, but the most common relies on the use of deep neural networks that employ a face-swapping technique. You first need a target video to use as the basis of the deepfake, and then a collection of video clips of the person you want to insert in the target. The videos can be completely unrelated: the target might be a clip from a Hollywood movie, for example, and the videos of the person you want to insert in the film might be random clips downloaded from YouTube.

The program guesses what a person looks like from multiple angles and conditions, then maps that person onto the other person in the target video by finding common features. Another type of machine learning is added to the mix, known as Generative Adversarial Networks (GANs), which detects and improves any flaws in the deepfake over multiple rounds, making it harder for deepfake detectors to decode them.

Though the process is complex, the software is rather accessible. Several apps make generating deepfakes easy even for beginners - such as the Chinese app Zao, DeepFace Lab, FakeApp, and Face Swap - and a large amount of deepfake software can be found on GitHub, an open source development community.

Deepfake technology has historically been used for illicit purposes, including to generate non-consensual pornography. In 2017, a reddit user named "deepfakes" created a forum for porn that featured face-swapped actors. Since that time, porn (particularly revenge porn) has repeatedly made the news, severely damaging the reputation of celebrities and prominent figures. According to a Deeptrace report, pornography made up 96% of deepfake videos found online in 2019.

Deepfake video has also been used in politics. In 2018, for example, a Belgian political party released a video of Donald Trump giving a speech calling on Belgium to withdraw from the Paris climate agreement. Trump never gave that speech, however - it was a deepfake. That was not the first use of a deepfake to create misleading videos, and tech-savvy political experts are bracing for a future wave of fake news that features convincingly realistic deepfakes.

But journalists, human rights groups, and media technologists have also found positive uses for the technology. For instance, the 2020 HBO documentary "Welcome to Chechnya" used deepfake technology to hide the identities of Russian LGBTQ refugees whose lives were at risk while also telling their stories. WITNESS, an organization focused on the use of media to defend human rights, has expressed optimism around the technology when used in this way, while also recognizing digital threats.
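The face-swapping step described above is often built as an autoencoder with one shared encoder and a separate decoder per identity: the encoder learns identity-independent features (pose, expression, lighting), and each decoder learns to redraw one specific face, so routing person A's frame through person B's decoder produces the swap. The article doesn't spell out this architecture, so the following is a heavily simplified toy sketch of that routing idea, with hand-built stand-ins (vectors and plain functions) in place of trained networks:

```python
import numpy as np

# Toy sketch only - not real deepfake software. A "face" here is just a
# vector: identity features concatenated with pose/expression features.
ID_DIM = 3  # hypothetical sizes chosen for illustration
IDENTITY_A = np.array([1.0, 0.0, 0.5])
IDENTITY_B = np.array([0.0, 1.0, 0.2])

def encoder(face):
    """Shared encoder: strips identity, keeps only pose/expression."""
    return face[ID_DIM:]

def make_decoder(identity):
    """Per-identity decoder: redraws any pose code as this identity."""
    def decoder(pose_code):
        return np.concatenate([identity, pose_code])
    return decoder

decoder_a = make_decoder(IDENTITY_A)
decoder_b = make_decoder(IDENTITY_B)

# A frame of person A with some pose/expression:
pose = np.array([0.3, -0.7])
face_a = np.concatenate([IDENTITY_A, pose])

# Normal round trip: encode A's frame, decode with A's own decoder.
reconstruction = decoder_a(encoder(face_a))
assert np.allclose(reconstruction, face_a)

# The swap: encode A's frame, decode with B's decoder.
# Result: B's identity wearing A's pose/expression.
swapped = decoder_b(encoder(face_a))
print(swapped)
```

In a real system, both networks are trained jointly on thousands of frames of each person, and (as the article notes) a GAN-style discriminator can then be run against the output for several rounds to catch and smooth remaining flaws.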