“When we look closely at deepfake image abuse, most of the tools and weapons used come from the open source space,” Ajder said. But they often start with well-intentioned developers, he said. “Someone builds something they think is interesting or cool, and someone with bad intentions recognizes its destructive potential and exploits it.”
Some, like the repository that was disabled in August, have purpose-built communities around them for explicit use. The model positioned itself as a tool for deepfake pornography, Ajder said, becoming a “funnel” for abuse that overwhelmingly targets women.
Videos uploaded to porn-streaming sites by one account credit AI models downloaded from GitHub and show the faces of popular deepfake targets, including celebrities Emma Watson, Taylor Swift, and Anya Taylor-Joy, as well as less famous but very real women, superimposed into sexual situations.
The creators freely describe the tools they use, including two that GitHub has scrubbed but whose code survives in other repositories.
People seeking deepfakes gather in many places online, including in hidden Discord forums and in plain sight on Reddit, complicating attempts at prevention. One Redditor offered their services using the archived repository’s software on September 29. “Can someone do this for my cousin?” another asked.
Torrents of the main repository banned by GitHub in August are also available in other corners of the web, showing how difficult it is to police open-source deepfake software across the board. Other deepfake pornography tools, such as the DeepNude app, have been similarly taken down before new versions appeared.
“There are so many models, so many different forks of models, so many different versions, it can be difficult to keep track of them all,” said Elizabeth Seger, director of digital policy at the cross-party UK think tank Demos. “Once a model is made open source and publicly available for download, there’s no way to do a public rollback of that,” she added.
One deepfake pornography creator with 13 manipulated explicit videos of female celebrities credited a prominent GitHub repository marketed as an “NSFW” version of another project that encourages responsible use and explicitly asks users not to use it for nudity. “Learn all available Face Swap AI from GitHUB, not using online services,” their profile on the tube site boldly states.
GitHub had already disabled this NSFW version by the time WIRED identified the deepfake videos. But other repositories marketed as “locked” versions of the model were still available on the platform as of January 10, including one with 2,500 “stars.”
“It’s very true that once [the model is out there] it’s irreversible. But we can still make it harder for people to access it,” Seger said.
If left unchecked, she added, the potential harm of deepfake porn is not only psychological. Its effects include the intimidation and manipulation of women, minorities, and politicians, as seen in political deepfakes targeting women politicians around the world.
But it’s not too late to rein in the problem, and platforms like GitHub have options, Seger said, including intervening at the point of upload. “If a model is put on GitHub and GitHub says no, and all the hosting platforms say no, for a normal person it becomes harder to get that model.”
Preventing deepfake pornography made with open source models also depends on policymakers, tech companies, developers, and, of course, the creators of abusive content themselves.
At least 30 US states also have some legislation that specifically addresses deepfake pornography, including bans, according to the nonprofit Public Citizen’s law tracker, although definitions and policies vary, and some laws cover only minors. Deepfake creators in the UK will also soon feel the force of the law after the government announced on January 7 that it would criminalize the creation of sexually explicit deepfakes, as well as the sharing of them.