Scary new deepfake app can turn you into a porn star without your consent

When deepfake technology was first introduced, the awe at what it could do was quickly replaced by the horrifying realisation of what else it could make possible.

Deepfake technology can be applied in fun and innocuous ways, as it has been by a Kevin Hart look-alike from Zambia and by the ReFace app, but it can also take a very dark turn.

What is a deepfake? The term refers to technology that can create false videos of a person appearing to do or say things they never did. It allows the user to superimpose one person’s face onto another’s in a way that looks very convincing.

According to a report published by MIT Technology Review, this technology has been applied on a new platform that promises to let anyone turn anyone else into a porn star. The platform is set up so that users can swap any person’s face into an adult video. All it requires is a picture and the push of a button.

The advent of this website, which various reports have chosen not to name so as not to drive more users to it, has raised a number of questions about ethics in digital innovation.

When it is not being used to attribute false statements to politicians and other polarising public figures, deepfake technology is largely used to create non-consensual porn, primarily featuring women.

MIT Technology Review reports that the Reddit user who originally popularised the technology swapped female celebrities’ faces into porn videos.

It quickly became a favourite among people, mostly men, who wished to view certain women in a sexual manner. It has also become a way for people to attack others by creating content that can later be used as revenge porn.

The code needed to build such a website still exists in open-source repositories online, which allows such websites not only to resurface but to proliferate.

“To this day, the research company Sensity AI estimates, between 90% and 95% of all online deepfake videos are non-consensual porn, and around 90% of those feature women.”

Understandably, there are concerns about the psychological impact of this tech, as well as the implications it can have for the lives of those who fall victim to having their face superimposed onto an explicit video they never made.

It is worth noting that most people do not know about or understand deepfake technology. As a result, if they were ever to come across a deepfake starring someone they know, it would be easy for them to believe that what they are watching is real.

Pornographic images and videos are notoriously difficult to remove from the internet, even with successful legal intervention. So if this were to happen to someone, the video would most likely remain online for a long time, and the longer it stays up, the greater the risk of it being circulated widely.

It could come up in background searches of any kind and impact the life of the victim in far-reaching and unimaginable ways. 

There is currently very little recourse for victims and that makes this type of innovation very concerning. 

By Kaunda Selisho