From eerily specific ads to lifelike robots, can technology and artificial intelligence get any more frightening? Apparently so. There’s a new threat in town: the rise of deepfakes.

Deepfakes use a technique for human image synthesis based on artificial intelligence, one that ‘combines and superimposes existing images and videos onto source images.’ In other words, deepfakes are deceptive videos depicting something that never actually happened. For example, several videos have circulated online in which film actors’ faces have been replaced with Nicolas Cage’s.

However, deepfakes can be insidious in nature too. Earlier this week, Amsterdam-based company Deeptrace Labs published research on the phenomenon, revealing some terrifying truths. According to Deeptrace, the number of deepfake videos online has almost doubled in the last nine months: the researchers found 14,698 deepfake videos online, up from the 7,964 found in December 2018.

What little discussion currently surrounds this controversial topic is centred on deepfakes’ potential impact on politics, particularly election campaigning. However, the research shows that politics is not where the concern should lie, at least not for now.

The Real Victims of Deepfakes

According to the research, 96% of deepfakes online are pornographic, with adult film performers’ faces replaced by those of celebrities, all of whom are women. The bizarre phenomenon has reached Britain, the United States and even South Korea. Deeptrace acknowledges that this technology is already being used against women in cases of revenge porn, and could be used in cyberbullying as well.

The rapid growth of these videos is disturbing, to say the least, and reveals technology’s increasingly sinister side. Yet the debate still focuses on how the technology could be used against politicians, with little thought for how it is already affecting scores of women worldwide.

The Origins of Deepfake

The term ‘deepfakes’ first appeared in 2017 on Reddit, coined by a user of the same name. The user shared pornographic videos in which performers’ faces had been replaced with those of unsuspecting celebrities, and a community began to form on the website around them. Last year, Reddit banned the user and the surrounding community, but that unfortunately did not put a stop to things. While many websites have taken the step of banning the content, others, such as 8chan and 4chan, still choose to allow the sharing of deepfakes.

With deepfakes quickly evolving and becoming easier to make, it is hard to say how they can be prevented. But as the problem continues to grow, legislation needs to be put in place to protect the women who have already become victims.

Aisha Mohamed