Fri. May 16th, 2025
    deep fake

    If you were paying attention to tech news last week, reports of a disturbing deep fake video rang across the interwebs. In the video, “actor Steve Buscemi’s face is seamlessly molded onto Hunger Games star Jennifer Lawrence’s head”. You can watch the video here.


    A deep fake is defined as:

     an artificial intelligence-based human image synthesis technique. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique called a “generative adversarial network”

    A Generative Adversarial Network (GAN for short) is, in simple terms, an algorithm composed of two parts: i) a generator (or forger) that creates fake data with the explicit aim of fooling ii) a discriminator, which tries to guess which data is fake and which is real. For a technical discussion, take a look at my blog where I explained GANs by generating handwritten digits. These two parts compete against each other: the generator presents data to the discriminator, and whenever the discriminator correctly guesses that the data presented to it is fake, the generator updates the way it creates the data and tries again. The process goes on until the generator is able to fool the discriminator or some stopping criterion is reached. At that point, it becomes difficult to tell which data is real and which is fake.
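    To make that competition concrete, here is a minimal sketch of the adversarial loop in plain NumPy. This is a toy of my own construction, not code from any deep fake tool: the "real data" is just numbers drawn from a normal distribution centred at 4, the generator is a one-line function G(z) = a·z + c, and the discriminator is a logistic classifier D(x) = sigmoid(w·x + b), with the gradients worked out by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # "Real" data: samples from a normal distribution centred at 4.
    return rng.normal(4.0, 0.5, n)

# Generator G(z) = a*z + c, fed noise z ~ N(0, 1).
a, c = 1.0, 0.0
# Discriminator D(x) = sigmoid(w*x + b).
w, b = 0.1, 0.0

lr, n = 0.05, 32
for step in range(2000):
    x = real_batch(n)
    z = rng.normal(0.0, 1.0, n)
    g = a * z + c                      # fake batch

    # Discriminator ascent step: maximise log D(x) + log(1 - D(g)),
    # i.e. push D(real) toward 1 and D(fake) toward 0.
    Dx, Dg = sigmoid(w * x + b), sigmoid(w * g + b)
    w += lr * (np.mean((1 - Dx) * x) - np.mean(Dg * g))
    b += lr * (np.mean(1 - Dx) - np.mean(Dg))

    # Generator ascent step: maximise log D(g) (non-saturating loss),
    # i.e. update a, c so the fakes look more "real" to D.
    z = rng.normal(0.0, 1.0, n)
    g = a * z + c
    Dg = sigmoid(w * g + b)
    dg = (1 - Dg) * w                  # d log D(g) / dg
    a += lr * np.mean(dg * z)
    c += lr * np.mean(dg)

print("generator offset c:", c)        # drifts toward the real mean of 4
```

    As training runs, the generator's offset c drifts from 0 toward the real mean of 4, at which point the discriminator can no longer reliably separate real samples from fakes. Deep fake systems play exactly this game, only with deep networks and images instead of two scalar parameters and numbers.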

    Let’s take a look at this video of Barack Obama:

    Apart from what he is saying, which is uncharacteristic of his classy demeanour (the audio follows comedian Jordan Peele’s video, just deep faked so that Obama appears to deliver it), you can tell something is a bit off with the expressions. But like anything technological, give it a few years and we might start World War III because the declaration was made through a deep fake. Okay, probably not, but I hope you get an idea of how these types of videos are definitely going to fuel fake news in the coming years.

    Another, scarier example was published by the Washington Post on the 31st of December last year, in which Scarlett Johansson talked about the dangers of deep fakes applied to explicit adult videos. Some obsessed fans are generating unsanctioned videos of her in compromising situations. She says it does not bother her as much as people expect it to, but she is extremely worried about how devastating deep fake videos can be to other groups in society, as we have already seen them used to extort vulnerable women. “Everybody is a potential target” is part of the headline in another post.

    Can it get even scarier? Unfortunately, yes, because it is becoming increasingly easy for someone to download these algorithms, train them for a few hours or days, and create something that can potentially end your life as you know it. We are used to trusting video evidence because doctoring it in a convincing manner required a certain level of skill. These A.I. algorithms, once coded correctly, will train and generate these videos all by themselves. One will be able to feed them a script, and the algorithm will use your voice and face just like in the Obama video above. Today they are targeting celebrities; tomorrow it is you and me in the crosshairs of unscrupulous individuals. “Everybody is a potential target.”

    Two questions need to be answered right now:

    1. How will we easily detect deep fakes?
    2. After we do so, what legal instruments can we use to police their generation and distribution?

    Some senators in the US are already suggesting legal ways to deal with the latter question. Platforms like Facebook would be held liable for allowing users to post deep fake videos, especially when those videos have a negative impact on their victims. Even with his billions of dollars, I’m glad I am not Mark Zuckerberg right now!

    Deep fakes. Be afraid. Be very afraid.
