Creepy DeepFake Nudie App Goes Viral Disrobing Women With AI, Then Shuts Down

The developer of a horrifically disgusting 'DeepNude' application that offered to "undress" women using machine learning technology has shut down the operation following a backlash on social media. In a message posted to Twitter, the developer acknowledged that "the probability that people will misuse [the application] is too high."

How or why the developer may have ever thought otherwise is a mystery. In a lame attempt to justify the app's existence, the developer further stated that the software was created "for user's entertainment" and is "not that great, it only works with particular photos." Here is the full statement on why the app is no longer available...

DeepNude was built around pix2pix, an open-source project developed two years ago by researchers at the University of California, Berkeley. It leverages machine learning through the use of a GAN, or generative adversarial network, in which two neural networks compete with one another to produce a realistic-looking image: one network generates candidate images, while the other tries to tell the generated images apart from real ones.
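The adversarial idea can be illustrated with a deliberately tiny sketch. This is an assumed toy setup for illustration only, not the pix2pix architecture or anything from the app: the "generator" is a single learned offset applied to random noise, the "discriminator" is a one-variable logistic score, and each is updated against the other so the generated samples drift toward the real data distribution.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

REAL_MEAN = 3.0          # "real" data: samples drawn from N(3, 1)
g_offset = 0.0           # generator parameter: a learned shift applied to noise
d_w, d_b = 1.0, 0.0      # discriminator parameters: logistic score on w*x + b
lr, batch = 0.05, 32

for _ in range(500):
    real = [random.gauss(REAL_MEAN, 1.0) for _ in range(batch)]
    fake = [random.gauss(0.0, 1.0) + g_offset for _ in range(batch)]

    # Discriminator step: ascend log D(real) + log(1 - D(fake)),
    # i.e. learn to score real samples high and generated ones low.
    gw = gb = 0.0
    for x in real:
        p = sigmoid(d_w * x + d_b)
        gw += (1 - p) * x
        gb += (1 - p)
    for x in fake:
        p = sigmoid(d_w * x + d_b)
        gw -= p * x
        gb -= p
    d_w += lr * gw / (2 * batch)
    d_b += lr * gb / (2 * batch)

    # Generator step: ascend log D(fake) -- shift the generated samples
    # in whatever direction makes the discriminator call them "real".
    gg = 0.0
    for x in fake:
        gg += (1 - sigmoid(d_w * x + d_b)) * d_w
    g_offset += lr * gg / batch

# The generator's outputs should have drifted toward the real distribution.
print(f"learned offset: {g_offset:.2f}")
```

The competition is the whole mechanism: neither network is told what real data looks like directly; the generator improves only because the discriminator keeps punishing unrealistic output. Scaled up to convolutional networks operating on images, the same dynamic is what lets a GAN produce photorealistic results.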

"The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it," the developer told Motherboard. "All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future."

DeepNude only works with pictures of women. Though the developer considers it a "slow" process, it is much quicker than manually manipulating a photo in a similar manner in Photoshop, and far faster than generating a deepfake video.

The ramifications of something like this are both creepy and frightening. This is the sort of thing that can be used for revenge porn; in this case, a fabricated nude photo of a woman could be posted without her consent. Though the naked body depicted is not actually hers, that hardly matters.

"Yes, it isn’t your actual vagina, but... others think that they are seeing you naked," Danielle Citron, a law professor at the University of Maryland Carey School of Law, told Motherboard. "As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore."

Deepfakes are emerging as a real problem in an era of fake news. Even outside the realm of naked photos and videos, we have seen some disturbing uses of the technology. It is not just imagery, either—a recent demonstration played audio clips that sounded like the voice of Joe Rogan but were actually computer-generated.

The DeepNude app represents one of the worst implementations of machine learning. Unfortunately, even though it has been yanked offline, the copies that are out in the wild still work. Moreover, the developer is probably correct that someone else would have created (and still will create) a similar app.
