GitHub is banning code from DeepNude, the app that used AI to create fake nude photos of women. Motherboard, which first reported on DeepNude last month, confirmed that the Microsoft-owned software development platform won’t allow DeepNude projects. GitHub told Motherboard that the code violated its rules against “sexually obscene content,” and it has removed multiple repositories, including one that was officially run by DeepNude’s creator.
DeepNude was originally a paid app that created nonconsensual nude images of women using technology similar to AI “deepfakes.” The development team shut it down after Motherboard’s report, saying that “the probability that people will misuse it is too high.” But as we noted last week, copies of the app were still accessible online, including on GitHub.
Late that week, the DeepNude team followed suit by uploading the core algorithm (but not the actual app interface) to the platform. “The reverse engineering of the app is already on GitHub. It no longer makes sense to hide the source code,” wrote the team on a now-deleted page. “DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”
GitHub’s guidelines state that “non-pornographic sexual content may be a part of your project, or may be present for educational or artistic purposes.” But the platform bans “pornographic” or “obscene” content.
DeepNude didn’t invent the concept of fake nude photos; they’ve been possible through Photoshop, among other methods, for decades. And its results were inconsistent, working best with photos where the subject was already wearing something like a bikini. But Motherboard called them “passably realistic” under those circumstances, and unlike Photoshop, they could be produced by anyone with no technical or artistic skill.
Politicians and analysts have raised alarms about deepfakes’ potential political impact. But the technology began as a way to create fake, nonconsensual porn of women, and like those deepfakes, DeepNude images primarily threaten women who could be harassed with fake nudes. At least one state, Virginia, has classified using deepfakes for harassment alongside other forms of nonconsensual “revenge porn.”
None of this can stop copies of DeepNude from appearing online, but GitHub’s decision could make the app harder to find and its algorithm harder to tinker with.