Deepfake ‘DeepNude’ app shut down after outcry

The creators of a deepfake application that lets users virtually “strip” women using artificial intelligence shut it down after an uproar on social media over its potential for abuse.

The creators of “DeepNude” said the software was launched several months ago for “entertainment” and that they had “greatly underestimated” demand for the app.

“We never thought it would go viral and (that) we would not be able to control the traffic,” the creators of DeepNude, who listed their location as Estonia, said on Twitter.

“Despite the security measures taken (watermarks), if 500,000 people use it, the likelihood that people will misuse it is too high. We don’t want to make money that way.”

Articles in The Washington Post, Vice and other media showed how the app could be used to take a picture of a clothed woman and turn it into a nude image, sparking outrage and renewing debate over nonconsensual pornography.

“This is a horribly destructive invention and we hope to see you soon face consequences for your actions,” tweeted the Cyber Civil Rights Initiative, a group that campaigns against nonconsensual pornography and “revenge porn.”

Mary Anne Franks, a law professor and president of the CCRI, wrote on Twitter: “It is good that it has been shut down, but this reasoning makes no sense. The INTENDED USE of the app was to accommodate the predatory and grotesque sexual fantasies of pathetic men.”

DeepNude offered both a free and a paid version of the application and was the latest example of the “deepfake” technology trend, which can be used to deceive or manipulate.

Although the app has been shut down, critics have expressed concern that copies of the software remain available and could be abused.

“The #Deepnude app is now available and will be used, despite the creator taking it off the market. If only there was a way to disable all versions out there,” CCRI tweeted.
