GitHub Removed Open Source Versions of 'Deepfakes' Porn App DeepNude (vice.com) 178
An anonymous reader quotes a report from Motherboard: GitHub recently removed code from its website that used neural networks to algorithmically strip clothing from images of women. The multiple code repositories were spun off from an app called DeepNude, a highly invasive piece of software that was specifically designed to create realistic nude images of women without their consent. The news shows how after DeepNude's creator pulled the plug on his own invention late last month following a media and public backlash, some platforms are now stopping the spread of similar tools. "We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy," a GitHub spokesperson told Motherboard in a statement. "We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines."
The "Sexually Obscene" section of GitHub's Community Guidelines states: "Don't post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited. We recognize that sexuality is a part of life and non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualization of minors."
The "Sexually Obscene" section of GitHub's Community Guidelines states: "Don't post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited. We recognize that sexuality is a part of life and non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualization of minors."
This is why (Score:5, Insightful)
Re: (Score:1, Troll)
Are you living in a fascist country where you have to give all your social media aliases to an employer? Oh, wait, you probably do. My condolences.
Re: (Score:2, Insightful)
Jumped the shark on this one, gweihir. There are many global employers that expect to know prospective employees' social media accounts. You are free to give them or to walk away. That does not make every country on the planet with a technology sector a fascist state.
Re: (Score:1, Insightful)
Correct, they're not fascist. Fascism is authoritarian nationalism.
But they are authoritarian. The existence of employers is authoritarian in itself. Capitalism is an authoritarian system, and would collapse without authoritarianism.
The working classes are slaves to the capitalist class as a whole. You are free to choose which master you slave for, but you can't choose not to slave.
And since they all pay the same shit wages, and have roughly the same authoritarian policies, it doesn't really matter which slave-
Re: (Score:1)
BS commie. You can absolutely choose not to slave. You just don't work.
You're mixing it up with your favorite system, in which your entire soul, body, and mind are owned by the state and you are forced to work or are sent to the gulag. Oh yeah, and then you are forced to work.
So you have it backwards, boris.
Re: (Score:3)
Where I live, you are allowed to lie on questions like that, and if they find out, any retaliation would be a criminal act. They are also not allowed to ask, so they had better make sure to only ask people they are sure they are going to hire. But then asking makes no sense. So nobody does it. They can restrict what you are allowed to post under your real name, but that is it.
Re: This is why (Score:1)
Sure it is... as long as you don't need to eat or live indoors.
Re: (Score:2)
I did not. You are living in fascism without even noticing. In countries with actual privacy laws, an employer is prohibited from even asking. They can have a social media policy here, but as soon as you post with a pseudonym, the maximum they can ask is that you do not post anything criminal. And they certainly may not ask what your pseudonyms are.
Re: (Score:2)
Note to those that moderated me down: The worst enemies of freedom are happy slaves.
Re: (Score:1)
Yup, logged out and shared the removed source code on IPFS:
deepnude_official-master.zip:
https://ipfs.globalupload.io/QmWxaRTnXNHCEjHjXKoPbwT6dQKS3MWefoGgo3BPzBButN
Sounds like it could be the killer app for... (Score:5, Funny)
Re: (Score:2)
Yeah, because being titillated at work sounds great...
Re: (Score:2)
What next for some CoC? (Score:4, Insightful)
Re: (Score:2, Informative)
The wrong math?
Isn't that what this story is about? The project was doing the bad maths.
All this modern "A.I." is thus: Take a series of inputs, do some maths, calculating a series of outputs.
Show me the loop from the outputs and inner nodes back to the inputs, and then I may start calling it "thinking"; if emergent enough, I might even stop calling it "maths". And no, feeding the last frame of video back to some of the "inputs" doesn't qualify as a "thinking loop", because it can be expressed without the "feedback" us
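To make the parent's point concrete, here is a minimal Python sketch (all weights, sizes, and names are invented for illustration, not taken from any real project): a net that feeds its last output back into its inputs computes nothing that a purely feedforward function of the whole input sequence can't, because the "loop" unrolls away.

    import numpy as np

    # Illustrative only: tiny random weights standing in for a trained net.
    rng = np.random.default_rng(0)
    W_in = rng.normal(size=(4, 4))  # input weights
    W_fb = rng.normal(size=(4, 4))  # "feedback" weights

    def step(x, prev_out):
        # One "recurrent" step: current input plus last output -> new output.
        return np.tanh(W_in @ x + W_fb @ prev_out)

    def unrolled(xs):
        # The same computation with the feedback loop unrolled away:
        # a pure feedforward function of the whole input sequence.
        out = np.zeros(4)
        for x in xs:
            out = np.tanh(W_in @ x + W_fb @ out)
        return out

    xs = [rng.normal(size=4) for _ in range(3)]
    out = np.zeros(4)
    for x in xs:
        out = step(x, out)
    assert np.allclose(out, unrolled(xs))  # identical results, no "loop" required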
Re: (Score:2)
Are any of those things remotely comparable to involuntary pornography?
You can argue that crypto is used for illegal activity, but it has many substantial legitimate uses. DeepNude has only one function: removing clothing from people without their permission.
GitHub... Microsoft, right? (Score:1)
Obviously they have to censor shit now.
May I assume there are other suppositories for this software?
Re:GitHub... Microsoft, right? (Score:5, Funny)
There's one right up your ass
Open source is not software freedom (Score:5, Insightful)
It looks like this is how Microsoft shows it loves "open source". And the open source development methodology does not (apparently) stand for software freedom (the freedom to run, inspect, share, and modify published computer software) or freedom of speech. GitHub decided that this speech is "sexually obscene" according to GitHub's representative even though the program itself contains no "sexually obscene" information.
Re: (Score:1)
an app called DeepNude, a highly invasive piece of software that was specifically designed to create realistic nude images of women without their consent.
Indeed, how exactly does one design an "app" (program) that contains any idea of consent? Does it refuse to work if you obtained the subject's consent, or if you consent yourself by using your own pictures? "I'm sorry Dave, I only work without consent."
This is more about how potential users MIGHT use the program, rather than how the program works.
Re: (Score:1)
Re: (Score:2, Insightful)
It looks like this is how Microsoft shows it loves "open source". And the open source development methodology does not (apparently) stand for software freedom (the freedom to run, inspect, share, and modify published computer software) or freedom of speech. GitHub decided that this speech is "sexually obscene" according to GitHub's representative even though the program itself contains no "sexually obscene" information.
So most OSINT projects on GitHub fall into that category. They can be used to invade someone's privacy without "their consent". Let's ban them too.
Re: Open source is not software freedom (Score:2)
In fairness, GitHub went to hell long before Microsoft bought them. Their former ideals of freedom and meritocracy died the day they accepted the first dollar of funding from the VC cabal.
Re: (Score:1)
Wow. I am in awe.
You have an application here whose express and sole purpose for existence is to sexually harass women (and theoretically men too...). There is literally no good reason for such software to even exist, OTHER than to sexually harass.
Leave it to the Slashdot Incel Lobby to be outraged that they are no longer allowed access to software that makes it trivial to create sexually explicit imagery without the person's consent.
Stop trying to use open source outrage as a cloak to cover up your mental il
unclear to me... (Score:3, Interesting)
the only thing wrong with it is that it provides self-proclaimed do-gooders a reason to remind us why they're better at deciding right and wrong
Re: (Score:3, Insightful)
Re: unclear to me... (Score:3, Interesting)
GitLab is owned by the Surveillance Valley VC cabal. Its hosted service is encumbered with a Nazi CoC, just like GitHub.
Re: (Score:2)
You could argue that taking a photo up someone's skirt, or installing a camera in the toilet bowl, or hacking their iCloud account for "personal use" is not harming them... But they would probably disagree.
Re: (Score:2)
if for personal use only, what's wrong with it?
Actually that applies to all laws since you haven't considered any externalities as to why laws against them exist in the first place.
if a tree falls, and you're too stoned to notice (Score:2)
Rumour has it that long-term use of some of those damages one's ability to think clearly - including the naive supposition that "personal use only" is a surgical smart bomb that never emits any shrapnel into the surrounding community.
A particularly pernicious form of shrapnel are these loud, bewildered denials that this shrapnel even exists.
Re: (Score:2)
Prostitution very often - though not always - involves some kind of abuse of the prostitute to place her/him in that position.
You have definitely never met a real prostitute. Of that I am certain. It's just an easy way to make a lot of money. If guys could do it, we would.
Re: (Score:3, Insightful)
Snails and Turtles (Score:2)
Have a shell!!
But since when do we consider NUDES to be OBSCENE?
Go tell your wife she looks obscene when she's nude; you will get your dick cut off!!!
If it made all girls look ugly with scars and blisters - yeah, that's obscene.
"Highly invasive"? (Score:5, Insightful)
What did this program supposedly invade? My understanding was that this was just a fancy way of photoshopping genitalia onto clothed photos. Are they saying it steals your keystrokes and browsing history, or do they think that this program somehow divines exactly what a person looks like nude, and uses that to create the images?
Re: (Score:1)
Re:"Highly invasive"? (Score:5, Interesting)
It is basically that people think "this can show everybody nude" (it cannot). Then the fear sets in and the two brain cells the average person has available on a good day shut down. The whole thing was a nice demonstration of what can be done with the technology. The reason it is done for women only is because there is far more training material around, which just happens to be a fact. Hence even the "SEXISM!"-angle does not work. Sure, the whole thing is a bit silly, but that is basically it.
Re: (Score:2)
It's often used for bullying. Photoshopping people nude or into porn and then distributing the images is popular with the kids these days. And some adults.
Github probably looked at it, couldn't see any potential use that wasn't some kind of abuse (involuntary pornography is basically the only reason for it to exist) and decided they didn't want to be associated.
Re: (Score:2)
It is also unavoidable, unless you know a method to get rid of all the pricks in the human race? And mind, about half of those are female.
Now, I do agree that the author of this thing may have gotten the balance between creating attention and creating outrage at his creation wrong, but the resulting shit-storm is entirely missing the point.
Re: (Score:2)
The reason it is done for women only is because there is far more training material around, which just happens to be a fact.
That was the excuse the author gave. It is also not true. There's plenty of training material around for men. But that might get some of Teh Gay (tm) on him.
Re: (Score:2)
It is true. Stop lying.
Re: (Score:2)
Right... because out of all the INNUMERABLE possible applications this type of software could be designed for, the obvious thing to do is make something that can sexually harass people.
The original author showed unbelievably poor judgement in making something like this available to the public. Only a complete and utter fool could possibly have thought that a tool like this wouldn't immediately be abused by the lowest degenerates of society, and that's giving the benefit of the doubt.
You can rationalize, mak
Re: (Score:2)
And there it is, the full-on panic mode, with all rationality erased. Thanks for the demonstration.
You are aware that people have been creating fake nude and even porn pictures for ages, right?
Re: (Score:2)
It is also very bad at it.
The whole thing is just a bait, and it worked EXACTLY as intended.
Re: Nobody cares faggot, you're just dumb. (Score:1)
Social Just-Us Nazis sure do love state-sponsored rape and torture.
Re: (Score:1)
Comic Book Code all over again... (Score:5, Insightful)
There is NOTHING wrong with this project - indeed, it is an early attempt at a method that will be universal 10 years from now. What we have today is the same type of pathetic 'moral' scare that killed freedom in comic books for more than three decades. Today comic books have the very forms of content (Walking Dead etc.) that the Comics Code was explicitly set up to extinguish.
Britain is seriously trying to ban ALL sharp knives under the same type of moral panic.
In the near future it will be simple to carry out a task like faking the removal of clothes on a photograph, and NOT because the tech is about this one particular example, but because the general idea (using massive databases of existing images with AI algorithms) will be trivial and universally available.
The only way to hold back the tide will be a level of DRACONIAN law enforcement the modern West has never yet witnessed- with societal shockwaves infinitely worse than the 'humiliation' of 'victims' of fake images.
Today we accept actions in FAKE narratives (movies) that we'd never want to see in the real world- and we allow this because fantasy is not the same as reality. So a kid's movie can contain unthinkable amounts of 'fantasy' violence, and most people don't bat an eyelid.
When the printing press was invented, the exact same type of moral panic that seeks to ban 'fake nudes' was weaponised by the state to ban access to the printing press in much of the world. I mean, what would happen if anyone was allowed to write and distribute their words without prior state/church censorship. Today, we see the BBC et al ban comments on their website using the exact same argument.
Moral panics are always evil - and purposely so. They are always really a tool of the state, predicated on the idea that the sheeple cannot be trusted. But in this case things are at a worse state, for the logical conclusion of the moral panic is to remove general computing from the general population. In other words, the sheeple must have programmable machines taken from them, and only be allowed to own computers that run pre-approved software from walled gardens.
The big tech companies (with the approval of all the regulars on Slashdot) want this. No 'wild' code allowed. All 'wild' code pre-described as "too dangerous to permit".
There are two futures...
1) deep fakes etc become a universal and non-important ability of machines most own.
2) general programmable computers are BANNED for ordinary ownership, and all generally available code is subject to extreme vetting by forces of the state.
There is no third option. Either computer freedom (where whatever the power of computers makes possible just happens) or no computer freedom, where a dystopian restriction on computers and coding is applied.
Re:Comic Book Code all over again... (Score:5, Insightful)
People have been saying general purpose computers would be banned since RMS published his "right to read" in the 90s; then the RIAA/MPAA campaign to end piracy raged for the last couple decades, yet here we are.
Kind of in the middle - copyright still exists and has plenty of teeth despite coexisting with general purpose computers that can violate it at will, and nobody really expects that to change. Sure, we've got DRM and HDCP to restrict us, and cracks for DRM and HDCP and workarounds to get around them. Intel shows up with IME; someone else shows up with open source hardware initiatives.
The pragmatic among us would see plenty of victories and losses for both sides.
And we persist in this balancing act, never reaching and never likely to reach either extreme.
aka... "the 3rd option"
Re: (Score:2)
And we persist in this balancing act, never reaching and never likely to reach either extreme.
aka... "the 3rd option"
Which is exactly the point that RMS was making in his story: If we let down our guard, we end up as slaves to other peoples' interests.
It's not a victory if the result is that you have to keep on fighting. It's a victory when the war is over and you can relax. As long as you're there with your gun in your foxhole, you have resisted defeat, but you haven't won.
Re: (Score:2)
I'd say you aren't looking hard enough.
When RMS wrote "Right to Read" in the 90s there wasn't really an open hardware movement at all, and he has said many times over the years that his options for running his free software on freedom-respecting hardware were extremely limited due to closed firmware etc., and subject to compromises.
Nowadays we have all kinds of awareness about it, and several groups are working to create open hardware platforms, open firmware, and even open hardware designs down to the silicon
Re:Comic Book Code all over again... (Score:4, Funny)
Comic books? They sold X-ray glasses in the backs of those.
Re: (Score:1)
I don't want to defend the UK's censorship, but I have to point out that
"Britain is seriously trying to ban ALL sharp knives under the same type of moral panic."
and
"Today, we see the BBC et al ban comments on their website"
are not true. The BBC has comments on stories every day and knives are not being banned.
Re: (Score:2)
Thee is NOTHING wrong with this project
I'll be distributing pornography featuring you and a subordinate at your workplace. Good luck with HR not firing you for sexual harassment, since they now have proof you were fucking him.
Also, if you happen to have a spouse, I'll be giving her a copy. Good luck with the divorce.
In other words, there is PLENTY wrong with involuntary pornography.
Invasive? (Score:3)
"...a highly invasive piece of software..."
How can this software be called 'invasive?' In and of itself the software isn't invasive; it doesn't act of its own accord. It requires a user. As such, its use may be invasive, but one can't blame the software for how it is used. Such claptrap.
Re:Invasive? (Score:4, Insightful)
Re: (Score:1)
They want to be laid badly, but hate men and are not quite lesbians. Feminists are female incels.
Re: (Score:1)
In the same way that a telephoto lens is invasive. It's a tool designed to facilitate invasion of privacy.
At least there are some uses for a telephoto lens that are not about invading people's privacy. DeepNude has only one purpose.
Re: (Score:2)
Can't tell if original Slashdot post or rap lyrics.
Re: (Score:2)
Re: (Score:2)
For what crime exactly??
pornographic code? (Score:2, Insightful)
in what way was the code pornographic? or, did the repo also include training data?
are we going to raze the libreoffice repos because it could be used to create porn?
Re: (Score:2)
Put it on a tshirt.
Didn't we do this the last time people tried to ban math because "bad people" could abuse it?
I personally see this as a boon for privacy and plausible deniability. It allows the "poison the well" approach, as we've seen several browsers/extensions trying out lately.
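As a rough sketch of that "poison the well" idea (the decoy topics and every name below are invented for the example, not taken from any real extension): bury the one real item in a stream of plausible fakes so no single item can be pinned on you.

    import random

    # Hypothetical decoy topics; a real tool would need far more realistic
    # queries and timing to provide genuine deniability.
    DECOY_TOPICS = ["weather radar", "pasta recipes", "bike repair",
                    "tax forms", "guitar chords", "houseplant care"]

    def poisoned_stream(real_query, n_decoys=5):
        # Mix one real query into n_decoys fakes, in random order, so an
        # observer of the stream can't single out the genuine one.
        queries = [real_query] + random.sample(DECOY_TOPICS, n_decoys)
        random.shuffle(queries)
        return queries

    for q in poisoned_stream("some real query"):
        print(q)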
Firefox and Chromium? (Score:3, Insightful)
What about Firefox and Chromium? I've been told that those softwares are often used to view porn degrading to women and read FOX News. Better ban those too, GitHub!
I am confused? (Score:1)
I mean, GitHub has the right to kick any project off its server. Its server is its property, after all. Nor do I care about the project's existence, but using reasoning as vague as this? I mean, does the code stored on the GitHub server contain any nudity, or any pornographic material? This sounds a lot like Apple in the early days trying to prohibit certain apps while giving a reason that any web browser, including its own, wouldn't pass either. I'm not sure anyone ever tried this, but what if this app
Re:I am confused? (Score:4, Insightful)
Re: (Score:2)
Of course they removed it. (Score:3)
Re: (Score:2)
The sexist angle is just a whole bucket of frosting on top of the already large pervert cake.
We're talking about a piece of software whose express and sole purpose is to create sexually explicit imagery of people without their consent. Even if it weren't specifically targeting women, it would still be a very distasteful piece of software whose sole purpose of existence is to empower and encourage weirdos.
Re: (Score:2)
Make it work on both male and female, and there wouldn't be such a stink.
Have you met the human race?
Lucky (Score:1)
Re: (Score:1)
What is github's responsibility? (Score:2)
Is GitHub making itself responsible for possible uses of applications whose code is hosted on its site? While many of the applications of this code could be viewed as "bad", where is the dividing line? Any sufficiently advanced picture or video editing application can be used in extremely nefarious ways.
Of course it gets more complicated if the original author wants it removed - but that opens yet another can of worms for open source rights.
There needs to be a dividing line somewhere - there is a l
Streisand (Score:2)
They did hear about the Streisand Effect, didn't they? I wasn't interested in this type of software, now I am. And I'm absolutely certain copies are out there, somewhere. Had I by accident found one yesterday, I would've shrugged and moved on. Today, I would probably download a copy, try it on some pictures of my wife, and see how realistic it really is. :-)
Hey GitHub. The point? (Score:2)
Yeah. You missed it.
Sure, it's maybe going to be "obscene" content, whatever that means, but who cares. The *problem* is that it's potentially going to be content, whether sexually graphic or not, that takes the faces of people who were never involved in filming pornography and makes viewers believe they *were* in pornographic videos.
Be all puritanical or whatever, but obscenity isn't the problem that this or any other "deepfake" tool is actually causing. I'd be satisfied if any such tool
Streisand Effect (Score:2)
They tried to ban the packs of stolen nude pics from The Fappening and its encores but only managed to spread them more.
I suspect the same will happen to this application. The more bans the more publicity and thus more demand. Not the smartest idea.
dike porn (Score:2)
Dike porn: when you know you can't possibly win the war against the gathering ocean of human depravity, but you don't wish to be seen as the first person to remove your thumb to let the formerly stalwart but now entirely make-believe dike of decorum crumble.
Re: (Score:2)
Possibly the model data which was trained on lots of private parts, and thus contains them in some form.