GitHub Removed Open Source Versions of 'Deepfakes' Porn App DeepNude (vice.com)

An anonymous reader quotes a report from Motherboard: GitHub recently removed code from its website that used neural networks to algorithmically strip clothing from images of women. The multiple code repositories were spun off from an app called DeepNude, a highly invasive piece of software that was specifically designed to create realistic nude images of women without their consent. The news shows how after DeepNude's creator pulled the plug on his own invention late last month following a media and public backlash, some platforms are now stopping the spread of similar tools. "We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy," a GitHub spokesperson told Motherboard in a statement. "We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines."

The "Sexually Obscene" section of GitHub's Community Guidelines states: "Don't post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited. We recognize that sexuality is a part of life and non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualization of minors."
  • This is why (Score:5, Insightful)

    by Anonymous Coward on Tuesday July 09, 2019 @06:45PM (#58898720)
    This is why we have AC accounts, because posting in this thread could literally be a career killer.
    • Re: (Score:1, Troll)

      by gweihir ( 88907 )

      Are you living in a fascist country where you have to give all your social media aliases to an employer? Oh, wait, you probably do. My condolences.

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Jumped the shark on this one, gweihir. There are many global employers that expect to know prospective employees' social media accounts. You are free to give them or to walk away. That does not make every country on the planet with a technology sector a fascist state.

        • Re: (Score:1, Insightful)

          by Anonymous Coward

          Correct, they're not fascist. Fascism is authoritarian nationalism.
          But they are authoritarian. The existence of employers is authoritarian in itself. Capitalism is an authoritarian system, and would collapse without authoritarianism.

          The working classes are slaves to the capitalist class as a whole. You are free to choose which master you slave for, but you can't choose not to slave.
          And since they all pay the same shit wages, and have roughly the same authoritarian policies, it doesn't really matter which slave-master...

          • by Anonymous Coward

            BS, commie. You can absolutely choose not to slave. You just don't work.

            You're mixing it up with your favorite system, in which your entire soul, body, and mind are owned by the state and you are forced to work or are sent to the gulag. Oh yeah, and then you are forced to work.

            So you have it backwards, Boris.

          • by gweihir ( 88907 )

            Where I live, you are allowed to lie on questions like that, and if they find out, any retaliation would be a criminal act. They are also not allowed to ask, so they had better make sure to only ask people they are certain they are going to hire. But then asking makes no sense, so nobody does it. They can restrict what you are allowed to post under your clear name, but that is it.

        • by gweihir ( 88907 )

          I did not. You are living in fascism without even noticing. In countries with actual privacy laws, an employer is prohibited from even asking. They can have a social media policy here, but as soon as you post with a pseudonym, the maximum they can ask is that you do not post anything criminal. And they certainly may not ask what your pseudonyms are.

      • by gweihir ( 88907 )

        Note to those that moderated me down: The worst enemies of freedom are happy slaves.

    • by Anonymous Coward

      Yup, logged out and shared the removed source code on IPFS:
      deepnude_official-master.zip:
      https://ipfs.globalupload.io/QmWxaRTnXNHCEjHjXKoPbwT6dQKS3MWefoGgo3BPzBButN

  • .... augmented reality with something like Google Glass.
  • by AHuxley ( 892839 ) on Tuesday July 09, 2019 @06:53PM (#58898750) Journal
    DRM? Crypto? Ad blocking? The wrong math?
    • Re: (Score:2, Informative)

      by Rockoon ( 1252108 )

      The wrong math?

      Isn't that what this story is about? The project was doing the bad maths.

      All this modern "A.I." is thus: take a series of inputs, do some maths, calculate a series of outputs.

      Show me the loop from the outputs and inner nodes back to the inputs and then I may start calling it "thinking", and if emergent enough I might even stop calling it "maths". And no, feeding the last frame of video back to some of the "inputs" doesn't qualify as a "thinking loop", because it can be expressed without the "feedback" using...
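
      For illustration, here's a minimal sketch of that claim in Python/NumPy. The random weights are hypothetical placeholders (nothing to do with the DeepNude code); the point is only that a feedforward pass is matrix maths from inputs to outputs, with no path from the outputs back to the inputs:

          import numpy as np

          # Feedforward "A.I." in exactly the sense described: inputs -> maths -> outputs.
          # The weights are random placeholders; a real network would learn them.
          rng = np.random.default_rng(0)
          W1 = rng.normal(size=(4, 8))   # input layer  -> hidden layer
          W2 = rng.normal(size=(8, 2))   # hidden layer -> output layer

          def forward(x):
              h = np.tanh(x @ W1)        # hidden activations: just matrix maths
              return h @ W2              # outputs; nothing here feeds back into x

          x = rng.normal(size=(1, 4))    # a series of inputs
          print(forward(x))              # a series of outputs; no "thinking loop"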

    • by AmiMoJo ( 196126 )

      Are any of those things remotely comparable to involuntary pornography?

      You can argue that crypto is used for illegal activity, but it has many substantial legitimate uses. DeepNude has only one function: removing clothing from people without their permission.

  • by Anonymous Coward

    Obviously they have to censor shit now.

    May I assume there are other suppositories for this software?

    • by Anonymous Coward on Tuesday July 09, 2019 @07:03PM (#58898800)

      There's one right up your ass

    • by jbn-o ( 555068 ) <mail@digitalcitizen.info> on Tuesday July 09, 2019 @08:04PM (#58899026) Homepage

      It looks like this is how Microsoft shows it loves "open source". And the open source development methodology does not (apparently) stand for software freedom (the freedom to run, inspect, share, and modify published computer software) or freedom of speech. GitHub decided that this speech is "sexually obscene" according to GitHub's representative even though the program itself contains no "sexually obscene" information.

      • by Anonymous Coward

        an app called DeepNude, a highly invasive piece of software that was specifically designed to create realistic nude images of women without their consent.

        Indeed, how exactly does one design an "app" (program) that contains any idea of consent? Does it refuse to work if you obtained someone's consent, or if you consent yourself by using your own pictures? "I'm sorry Dave, I only work without consent."
        This is more about how potential users MIGHT use the program, rather than how the program works.

        • I don't need your permission to take your picture in public (internet == public, btw). I also don't need your permission to edit that picture. Don't like what I did? Too fucking bad. The price of freedom is people doing things you don't like.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        It looks like this is how Microsoft shows it loves "open source". And the open source development methodology does not (apparently) stand for software freedom (the freedom to run, inspect, share, and modify published computer software) or freedom of speech. GitHub decided that this speech is "sexually obscene" according to GitHub's representative even though the program itself contains no "sexually obscene" information.

        So most OSINT projects on GitHub fall into that category. They can be used to invade someone's privacy without "their consent". Let's ban them too.

      • In fairness, GitHub went to hell long before Microsoft bought them. Their former ideals of freedom and meritocracy died the day they accepted the first dollar of funding from the VC cabal.

      • Wow. I am in awe.

        You have an application here whose express and sole purpose for existence is to sexually harass women (and theoretically men too...). There is literally no good reason for such software to even exist, OTHER than to sexually harass.

        Leave it to the Slashdot Incel Lobby to be outraged that they are no longer allowed access to software that makes it trivial to create sexually explicit imagery without the person's consent.

        Stop trying to use open source outrage as a cloak to cover up your mental illness...

  • unclear to me... (Score:3, Interesting)

    by jm007 ( 746228 ) on Tuesday July 09, 2019 @07:06PM (#58898820)
    like crack, weed, hookers and blow... if for personal use only, what's wrong with it?

    the only thing wrong with it is that it provides self-proclaimed do-gooders a reason to remind us why they're better at deciding right and wrong
    • Re: (Score:3, Insightful)

      by agaku ( 2312930 )
      Of course the software does not invade privacy. It cannot peek below the clothes of anyone. Microsoft, the new owner of GitHub, likes censorship, and I do not like this attitude. Better to look for alternatives like GitLab, or to use a private server for closed-source development.
    • by AmiMoJo ( 196126 )

      You could argue that taking a photo up someone's skirt, or installing a camera in the toilet bowl, or hacking their iCloud account for "personal use" is not harming them... But they would probably disagree.

    • if for personal use only, what's wrong with it?

      Actually, that argument applies to all laws, since you haven't considered any of the externalities that are the reason laws against those things exist in the first place.

    • Like crack, weed, hookers and blow ... if for personal use only, what's wrong with it?

      Rumour has it that long-term use of some of those damages one's ability to think clearly, including the naive supposition that "personal use only" is a surgical smart bomb that never emits any shrapnel into the surrounding community.

      A particularly pernicious form of shrapnel is these loud, bewildered denials that this shrapnel even exists.

  • "Highly invasive"? (Score:5, Insightful)

    by Anonymous Coward on Tuesday July 09, 2019 @07:11PM (#58898832)

    What did this program supposedly invade? My understanding was that this was just a fancy way of photoshopping genitalia onto clothed photos. Are they saying it steals your keystrokes and browsing history, or do they think that this program somehow divines exactly what a person looks like nude and uses that to create the images?

    • by gweihir ( 88907 ) on Tuesday July 09, 2019 @07:36PM (#58898922)

      It is basically that people think "this can show everybody nude" (it cannot). Then the fear sets in and the two brain cells the average person has available on a good day shut down. The whole thing was a nice demonstration of what can be done with the technology. The reason it is done for women only is because there is far more training material around, which just happens to be a fact. Hence even the "SEXISM!"-angle does not work. Sure, the whole thing is a bit silly, but that is basically it.

      • by AmiMoJo ( 196126 )

        It's often used for bullying. Photoshopping people nude or into porn and then distributing the images is popular with the kids these days. And some adults.

        Github probably looked at it, couldn't see any potential use that wasn't some kind of abuse (involuntary pornography is basically the only reason for it to exist) and decided they didn't want to be associated.

        • by gweihir ( 88907 )

          It is also unavoidable, unless you know a method to get rid of all the pricks in the human race? And mind, about half of those are female.

          Now, I do agree that the author of this thing may have gotten the balance between creating attention and creating outrage at his creation wrong, but the resulting shit-storm is entirely missing the point.

      • The reason it is done for women only is because there is far more training material around, which just happens to be a fact.

        That was the excuse the author gave. It is also not true. There's plenty of training material around for men. But that might get some of Teh Gay (tm) on him.

      • Right... because out of all the INNUMERABLE possible applications this type of software could be designed for, the obvious thing to do is make something that can sexually harass people.

        The original author showed unbelievably poor judgement in making something like this available to the public. Only a complete and utter fool could possibly have thought that a tool like this wouldn't immediately be abused by the lowest degenerates of society, and that's giving the benefit of the doubt.

        You can rationalize, make...

        • by gweihir ( 88907 )

          And there it is, the full-on panic mode, with all rationality erased. Thanks for the demonstration.

          You are aware that people have been creating fake nude and even porn pictures for ages, right?

    • by Z80a ( 971949 )

      It is also very bad at it.
      The whole thing is just a bait, and it worked EXACTLY as intended.

  • by Anonymous Coward on Tuesday July 09, 2019 @07:12PM (#58898836)

    There is NOTHING wrong with this project; indeed, it is an early attempt at a method that will be universal 10 years from now. What we have today is the same type of pathetic 'moral' scare that killed freedom in comic books for more than three decades. Today comic books have the very forms of content (The Walking Dead, etc.) that the Comics Code was explicitly set up to extinguish.

    Britain is seriously trying to ban ALL sharp knives under the same type of moral panic.

    In the near future it will be simple to carry out a task like faking the removal of clothes on a photograph, and NOT because the tech is about this one particular example, but because the general idea (using massive databases of existing images with AI algorithms) will be trivial and universally available.

    The only way to hold back the tide will be a level of DRACONIAN law enforcement the modern West has never yet witnessed- with societal shockwaves infinitely worse than the 'humiliation' of 'victims' of fake images.

    Today we accept actions in FAKE narratives (movies) that we'd never want to see in the real world- and we allow this because fantasy is not the same as reality. So a kid's movie can contain unthinkable amounts of 'fantasy' violence, and most people don't bat an eyelid.

    When the printing press was invented, the exact same type of moral panic that seeks to ban 'fake nudes' was weaponised by the state to ban access to the printing press in much of the world. I mean, what would happen if anyone was allowed to write and distribute their words without prior state/church censorship. Today, we see the BBC et al ban comments on their website using the exact same argument.

    Moral panics are always evil, and purposely so. They are always really a tool of the state, predicated on the idea that the sheeple cannot be trusted. But in this case things are even worse, for the logical conclusion of the moral panic is to remove general computing from the general population. In other words, the sheeple must have programmable machines taken from them, and only be allowed to own computers that run pre-approved software from walled gardens.

    The big tech companies (with the approval of all the regulars on Slashdot) want this. No 'wild' code allowed. All 'wild' code pre-described as "too dangerous to permit".

    There are two futures...
    1) deep fakes etc become a universal and non-important ability of machines most own.
    2) general programmable computers are BANNED for ordinary ownership, and all generally available code is subject to extreme vetting by forces of the state.

    There is no third option. Either computer freedom (where whatever the power of computers makes possible just happens) or no computer freedom, where a dystopian restriction on computers and coding is applied.

    • by vux984 ( 928602 ) on Tuesday July 09, 2019 @08:04PM (#58899028)

      People have been saying general purpose computers would be banned since RMS published his "Right to Read" in the 90s; then the RIAA/MPAA campaign to end piracy raged for the last couple of decades, yet here we are.

      Kind of in the middle -- copyright still exists and has plenty of teeth, even though general purpose computers that can violate it at will coexist with it, and nobody really expects that to change. Sure, we've got DRM and HDCP to restrict us, and cracks for DRM and HDCP and workarounds to get around them. Intel shows up with IME; someone else shows up with open source hardware initiatives.

      The pragmatic among us would see plenty of victories and losses for both sides.

      And we persist in this balancing act, never reaching and never likely to reach either extreme.
      aka... "the 3rd option"

      • by Tom ( 822 )

        And we persist in this balancing act, never reaching and never likely to reach either extreme.
        aka... "the 3rd option"

        Which is exactly the point that RMS was making in his story: if we let down our guard, we end up as slaves to other people's interests.

        It's not a victory if the result is that you have to keep on fighting. It's a victory when the war is over and you can relax. As long as you're there with your gun in your foxhole, you have resisted defeat, but you haven't won.

    • by fustakrakich ( 1673220 ) on Tuesday July 09, 2019 @08:25PM (#58899136) Journal

      Comic books? They sold X-ray glasses in the backs of those.

    • by AmiMoJo ( 196126 )

      I don't want to defend the UK's censorship, but I have to point out that

      "Britain is seriously trying to ban ALL sharp knives under the same type of moral panic."

      and

      "Today, we see the BBC et al ban comments on their website"

      are not true. The BBC has comments on stories every day and knives are not being banned.

    • There is NOTHING wrong with this project

      I'll be distributing pornography featuring you and a subordinate at your workplace. Good luck with HR not firing you for sexual harassment, since they now have proof you were fucking him.

      Also, if you happen to have a spouse, I'll be giving her a copy. Good luck with the divorce.

      In other words, there is PLENTY wrong with involuntary pornography.

  • by LazLong ( 757 ) on Tuesday July 09, 2019 @07:22PM (#58898860)

    "...a highly invasive piece of software..."

    How can this software be called 'invasive'? In and of itself the software isn't invasive; it doesn't act of its own accord. It requires a user. As such, its use may be invasive, but one can't blame the software for how it is used. Such claptrap.

    • Re:Invasive? (Score:4, Insightful)

      by DNS-and-BIND ( 461968 ) on Tuesday July 09, 2019 @07:42PM (#58898950) Homepage
      It's a dog whistle used by feminists to mean "rapey". You'll see the word in all sorts of literature when they stop short, but you know what they want to call it.
    • by AmiMoJo ( 196126 )

      In the same way that a telephoto lens is invasive. It's a tool designed to facilitate invasion of privacy.

      At least there are some uses for a telephoto lens that are not about invading people's privacy. DeepNude has only one purpose.

  • pornographic code? (Score:2, Insightful)

    by Anonymous Coward

    In what way was the code pornographic? Or did the repo also include training data?

    Are we going to raze the LibreOffice repos because they could be used to create porn?

    • Put it on a t-shirt.

      Didn't we do this the last time people tried to ban math because "bad people" could abuse it?

      I personally see this as a boon for privacy and plausible deniability. It allows the "poison the well" approach, as we've seen several browsers/extensions trying out lately.

  • by Anonymous Coward on Tuesday July 09, 2019 @08:05PM (#58899034)

    What about Firefox and Chromium? I've been told that those programs are often used to view porn degrading to women and to read FOX News. Better ban those too, GitHub!

  • I mean GitHub has the right to kick any project off its server. Its server is its property, after all. Nor do I care about the project's existence, but using reasoning as vague as this? I mean, does the code stored on the GitHub server contain any nudity, or any pornographic material? This sounds a lot like Apple in the early days trying to prohibit certain apps, but giving a reason that no web browser, including its own, would pass either. I'm not sure anyone ever tried this, but what if this app...

  • by bjwest ( 14070 ) on Tuesday July 09, 2019 @09:27PM (#58899446)
    It was sexist, after all. Make it work on both male and female, and there wouldn't be such a stink.
    • The sexist angle is just a whole bucket of frosting on top of the already large pervert cake.

      We're talking about a piece of software whose express and sole purpose is to create sexually explicit imagery of people without their consent. Even if it wasn't specifically targeting women, it is still a very distasteful piece of software whose sole purpose of existence is to empower and encourage weirdos.

    • Make it work on both male and female, and there wouldn't be such a stink.

      Have you met the human race?

  • I got my cracked premium copy last week thanks to a 4chan thread. It's utterly useless but I like collecting contraband software.
  • Is GitHub making itself responsible for possible uses of applications whose code is hosted on its site? While many of the applications of this code could be viewed as "bad", where is the dividing line? Any sufficiently advanced picture or video editing application can be used in extremely nefarious ways.

    Of course it gets more complicated if the original author wants it removed - but that opens yet another can of worms for open source rights.

    There needs to be a dividing line somewhere - there is a l...

  • by Tom ( 822 )

    They did hear about the Streisand Effect, didn't they? I wasn't interested in this type of software, now I am. And I'm absolutely certain copies are out there, somewhere. Had I by accident found one yesterday, I would've shrugged and moved on. Today, I would probably download a copy, try it on some pictures of my wife, and see how realistic it really is. :-)

  • Yeah. You missed it.

    Sure, it's maybe going to be "obscene" content, whatever that means, but who cares. The *problem* is that it's potentially going to be content, whether sexually graphic or not, that puts the faces of people not involved in filming pornographic videos in a position where people will believe that they *were* in pornographic videos.

    Be all puritanical or whatever, but obscenity isn't the problem that this or any other "deepfake" tool is actually causing. I'd be satisfied if any such tool...

  • They tried to ban the packs of stolen nude pics from The Fappening and its encores but only managed to spread them more.

    I suspect the same will happen to this application. The more bans the more publicity and thus more demand. Not the smartest idea.

  • Dike porn: when you know you can't possibly win the war against the gathering ocean of human depravity, but you don't wish to be seen as the first person to remove your thumb to let the formerly stalwart but now entirely make-believe dike of decorum crumble.
