AI deepfakes are terrifyingly easy to make

Back in 1996, at age 10, I played a computer game at a friend’s house called Spycraft: The Great Game. You play the role of a CIA operative investigating an assassination plot, and during interrogations, you can doctor a photo to mislead a suspect. The process blew my 10-year-old mind — so much so that I’ve remembered how powerful that minigame felt, all these years. Blurry and pixelated as it was, Spycraft’s photo editor looked a lot like what Adobe Photoshop would become. In 1996, it felt like high-tech spying and trickery. In 2023, it’s utterly mundane. Altering a photograph — any photograph — isn’t difficult or expensive anymore, and we’ve learned not to trust every image we see.

Deepfake technology has already proven that we can’t trust video or audio recordings, either, and the wide availability of generative artificial intelligence has made deepfakes far easier to create. We all need to get used to this new reality — and fast.

Genna Bain, the wife of the now-deceased YouTuber John “TotalBiscuit” Bain, posted on Twitter last week about a new concern she faces thanks to the advancements of AI tech: “Today was fun. Being faced with making a choice of scrubbing all of my late husband’s lifetime of content from the internet. Apparently people think it’s okay to use his library to train voice AIs to promote their social commentary and political views.” In response, she received sympathy and pleas from her husband’s fans to preserve his online legacy.

But here’s the problem. There’s no practical way that Genna Bain, or anyone else in her position, could adequately prevent anyone from creating a deepfake video or audio clip of John Bain. A few minutes’ worth of audio is all that’s needed to teach an AI how to sound like a person; video deepfakes require more, such as footage of the subject from multiple angles and with multiple expressions. So, if you wanted to prevent yourself from ever appearing in a deepfake, you’d need to delete every single visual and auditory record of your existence, which for anyone who uses a smartphone is so close to impossible that it may as well be impossible. That’s even more true for a public figure like Bain, who guested on shows and podcasts that his wife doesn’t necessarily have the ability to remove, and whose face and voice have also already been saved forever on the hard drives of his fans around the world.

In the 1990s and 2000s, Photoshop made it possible for people to paste celebrities’ faces onto other people’s naked bodies, and in 2018, the public learned how AI tech could be used to make video pornography that appeared to depict celebrities. Since then, the tech has become far easier to access: Googling “free deepfake app” will deliver tons of options for editing software. In 2023, this technology is still being used to make porn of celebrities, but it’s also being used to make goofy memes about them. That tracks, since the internet runs on both memes and porn.

If you’re famous enough, you will be dehumanized and objectified in this way, your fans will be shocked and confused when you try to stop it, and you won’t be able to stop it anyway. But this isn’t an article in which I try to convince people to feel sorry for celebrities — that’s a losing battle I do sometimes try to fight. Instead, this is an article in which I attempt to convince people not to trust what they see and hear.

It took us all a while to adjust to Photoshop’s existence, and even now, a cleverly faked image can still mislead an intelligent, reasonable person into believing something that’s not true. It’s human nature to want to believe in something that looks real; after all, seeing is believing, right? All that said, I lived through the rise of Photoshop, and that’s why I believe we will adjust to this, too.

I don’t know what the future looks like, or what types of regulations we’ll need in order to address this situation. What I do know is that this kind of trickery isn’t just possible now; it’s easy. We have to accept that it’s here and move forward into a reality where our skepticism expands to cover even more of what we see and hear.

But, hey — at least the memes are going to be great.