
UK To Criminalize Deepfake Porn Sharing Without Consent (techcrunch.com) 116

Brace for yet another expansion to the UK's Online Safety Bill: The Ministry of Justice has announced changes to the law aimed at protecting victims of revenge porn, pornographic deepfakes and other abuses related to the taking and sharing of intimate imagery without consent -- a crackdown on a type of abuse that disproportionately affects women and girls. From a report: The government says the latest amendment to the Bill will broaden the scope of current intimate image offences -- "so that more perpetrators will face prosecution and potentially time in jail."

Other abusive behaviors that will become explicitly illegal include "downblousing" (where photographs are taken down a woman's top without consent) and the installation of equipment, such as hidden cameras, to take or record images of someone without their consent. The government describes the planned changes as a comprehensive package of measures to modernize laws in this area.

  • by Anonymous Coward

    The only thing worse than being governed by the anglophobes of Brussels is being governed by the English themselves.

    • protecting victims of revenge porn, pornographic deepfakes

      Victims? I think phrasing this in such terms muddles the conversation. How can they be a victim? Maybe if you pass it off as authentic and then use that to shame or disrupt someone's life, but there are already slander and libel laws to punish such an act (libel, presumably, since it's published rather than spoken). Put a deepfake watermark in the corner and be done with it! What's next? Illegal to imagine your favorite celebrity crush? Drawing them in sexual ways? No. None of this.

      • by suutar ( 1860506 )

        "revenge porn" can be authentic. It's the distribution without consent that's a problem.
        "deepfake porn" is a separate item.

    • Comment removed based on user account deletion
  • Free Speech? (Score:2, Interesting)

    by backslashdot ( 95548 )

    Would such a criminal law be possible in the US, or would some justices on the Supreme Court consider blatant lying a form of free speech, as they did in 2012 (United States v. Alvarez)?

    • Re: (Score:3, Interesting)

      by cayenne8 ( 626475 )

      ...or would some justices in the Supreme Court consider blatant lying a form of free speech

      I've not read the particular citation you quoted, but just taking it at face value...why would lying NOT be free speech?

      Telling a lie is not against the law in most instances.

      The cops can lie to you all they want.

      About the only time you can't lie is when committing fraud/libel/slander, or when the Feds are questioning you, in which case the best thing to do is clam up and lawyer up.

      But there's no compelling law to force you to tell the truth.

      • What am I missing?

        That your average person is an idiot.

      • I've not read the particular citation you quoted, but just considering it face value...why would lying NOT be free speech?

        Fraud. Slander. Duh.

        • I've not read the particular citation you quoted, but just considering it face value...why would lying NOT be free speech?

          Fraud. Slander. Duh.

          You must have missed that part where I listed Fraud and slander (and even libel) as exceptions.

      • I believe that in certain cases lying is definitely free speech, but then it isn't black & white. Lying with the intent of causing material harm is actionable speech. For example, if a woman falsely tells her husband she was raped by a specific person while knowing there was a high chance the husband would shoot that person... wouldn't that be murder? Or more clear cut, that some poisonous chemical was safe to drink.

        • by Pieroxy ( 222434 )

          Or more clear cut, that some poisonous chemical was safe to drink.

          Like, for example, saying that drinking bleach could cure you of COVID?

      • Why wouldn't deep fakes be considered a form of defamatory speech? Now, of course, at least in most Common Law jurisdictions, that's a civil tort, but I see no reason why one couldn't create a criminal statute. Perhaps it would be harder in the US, but it certainly would be no issue for the UK, where constitutional rights are pretty much whatever Parliament says they are. In either case, it certainly remains a form of defamatory speech, perhaps could even be viewed as criminal harassment.

      • But there's no compelling law to force you to tell the truth

        Indirectly, there is. Get subpoenaed to testify before Congress (the Feds, for the purpose of making false statements) and see how that works out for you.

        • Indirectly, there is. Get subpoenaed to testify before Congress (the Feds, for the purpose of making false statements) and see how that works out for you.

          Maybe you've missed earlier when I mentioned it was a problem if you lied to the Feds.

      • Telling a lie is not against the law in most instances.

        Depends on the lie. Spreading falsehoods about some political party taking your guns away = ok; slandering or lying in a way that causes harm directly linked to your lie, on the other hand = not ok, as Alex Jones just found out recently.

    • Re: (Score:2, Interesting)

      So a deepfake is supposed to be indistinguishable from reality. So the real first question would be: how do you spot it?
      Then the answer to your question might first be: would you consider it slander or defamation of character if someone were to insert your likeness into graphic gay sex porn, or some extreme kink porn that is highly deviant even if legal, and then distribute it on the internet? I'll cut to the chase: in 23 states there are still laws against criminal libel. Yes, you can go to jail. Person

      • by HiThere ( 15173 )

        Try "indistinguishable from reality under casual observation". Certainly current deep-fakes can be detected, but they can also fool people who don't do thing like check for watermarks, look for irregular pixel shading, and sometimes even just count eye-blinks.

        • "Most current deep-fakes can be detected"

          FTFY

          The technology that has improved deep-fakes - GAN - can, with sufficient processing power, create an image that is undetectable. Omitting watermarks is obvious. Pixel shading is a matter of adequate rendering. Number and rate of eye-blinks can be randomized to become indistinguishable from the real thing. Asymmetry is already included. Changing light sources - and their effect on eyes - is already at the point where some PCs can render something undetectable
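
          For what it's worth, the blink-counting heuristic mentioned above is easy to sketch. Below is a minimal illustration in Python, assuming the MediaPipe face-mesh library; the eye landmark indices and the 0.2 eye-aspect-ratio threshold are conventional values from the facial-landmark literature, not anything specific to deepfakes:

            import cv2
            import mediapipe as mp
            import numpy as np

            # Left-eye landmark indices in MediaPipe's 468-point face mesh
            # (a conventional choice for the eye-aspect-ratio heuristic).
            LEFT_EYE = [362, 385, 387, 263, 373, 380]

            def eye_aspect_ratio(p):
                # Ratio of the two vertical eye openings to the horizontal
                # width; it drops sharply when the eye closes. Coordinates
                # are normalized, which is close enough for a threshold test.
                v1 = np.linalg.norm(p[1] - p[5])
                v2 = np.linalg.norm(p[2] - p[4])
                h = np.linalg.norm(p[0] - p[3])
                return (v1 + v2) / (2.0 * h)

            def blinks_per_minute(video_path, ear_threshold=0.2):
                mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False)
                cap = cv2.VideoCapture(video_path)
                fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
                blinks, eye_closed, frames = 0, False, 0
                while True:
                    ok, frame = cap.read()
                    if not ok:
                        break
                    frames += 1
                    result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                    if not result.multi_face_landmarks:
                        continue
                    lm = result.multi_face_landmarks[0].landmark
                    pts = np.array([(lm[i].x, lm[i].y) for i in LEFT_EYE])
                    if eye_aspect_ratio(pts) < ear_threshold:
                        eye_closed = True
                    elif eye_closed:
                        blinks += 1  # eye reopened: count one blink
                        eye_closed = False
                cap.release()
                minutes = frames / fps / 60.0
                return blinks / minutes if minutes else 0.0

          A typical adult blinks roughly 15-20 times a minute, so a rate far below that is a red flag -- though, as noted above, a generator can simply randomize blinking until this particular test tells you nothing.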

      • by cstacy ( 534252 )

        It is criminal to post pictures of others naked on the internet.

        Is that in the UK? (I guess; or what country, or since when?) Not sure what law you are referring to.

    • Criminal? Maybe not. Unauthorized use of a personal likeness is a personality-rights thing. Or related to copyright, anyway:

      https://en.wikipedia.org/wiki/... [wikipedia.org]

    • by ranton ( 36917 )

      or would some justices in the Supreme Court consider blatant lying a form of free speech like they did in 2012 (US vs. Alvarez)?

      The majority opinion did not claim blatant lying could not be outlawed. It ruled that the Stolen Valor Act specifically was too broad. They also felt the government couldn't show the general perception of military awards was diluted by false claims, like those Alvarez made.

      There are a number of free speech restrictions related to lying, so I'm not sure what your point is other than complaining about a single ruling you didn't like.

    • I think it may fall under the same logic as libel and slander laws, where public figures will have a harder time claiming that they are a victim than us normies.

      For example, a deepfake of Scarlett Johansson could be illegal, while a similar deepfake of Scarlett Johansson as Black Widow would be legal, as one would be a misleading depiction of a real person while the other is based on a fictional character. The line would be complex, though, and would probably take a long legal battle to draw.
      However if a d

  • taking and sharing of intimate imagery without consent

    So what about generating such porn, is that illegal?

    The problem is, eventually we will get programs powerful enough to allow anyone to generate deepfake porn on their home PC, using existing porn material on one hand and enough video/pictures of whichever person you wish to cast as the porn star on the other. Then anyone can home-generate porn of any celebrity, or of anyone you can get hold of enough video of. By then it will be too late to stuff the genie back into the bottle.

    • It's probably not illegal, but perhaps they'll extend the relevant law to make it so. In many countries there is already a law against making or possessing child pornography, whether real, deepfaked or even simulated. I'm not sure about the reasons for that law; simulated kiddie porn seems like a victimless crime, but perhaps it has an adverse effect on those who enjoy that sort of thing (like a gateway drug), or it was simply done to close a few legal loopholes and make it easier to prosecute purveyors o
      • Create an animation showing, say, an adult diddling a child, and that's CSAM in the UK, despite no C being SA'd.

    • by Anonymous Coward

      taking and sharing of intimate imagery without consent

      So what about generating such porn, is that illegal?

      The problem is, eventually we will get programs powerful enough to allow anyone to generate deepfake porn on their home PC, using existing porn material on one hand and enough video/pictures of whichever person you wish to cast as the porn star on the other. Then anyone can home-generate porn of any celebrity, or of anyone you can get hold of enough video of. By then it will be too late to stuff the genie back into the bottle.

      The problem is that we don't want to stuff the genie into the bottle, but the bottle into the genie.

    • So what about generating such porn, is that illegal?

      I came here with the exact same question.

      Will they eventually make it illegal to distribute an app or website where entering any known name or image of a public figure is allowed?

      It's like saying you are going to ban the distribution of hot Earl Grey tea in the world of Star Trek, where anyone can replicate it in their quarters on demand.

      • We do this all the time. We have laws against shouting "FIRE!" unnecessarily in crowded theatres: anyone CAN yell it, though. We have laws against murder, but anyone CAN murder someone. We have trade laws that prevent the import and sale of lots of different goods, to poke holes in your Star Trek analogy.
        You can already use Photoshop to make nude images of Emma Watson doing lewd things, but we aren't banning Photoshop; that would be silly.
        There have already been rulings about this sort of thing in the U.S.
        https [towardsdatascience.com]

  • I keep hearing that a lot: "disproportionately affects women and girls", whether it's about climate change, poverty, road rage, shrinkflation or, apparently, deepfake porn. While in this case I don't doubt that it is true, I wonder: does that really make it worse? Does that statement add anything to the article?
    • No, it's just a variation on an old journalism trope; "NYC Nuked! Women and minorities hardest hit."
    • I wonder: does that really make it worse?

      What makes it worse for women is that it's typically perpetrated by men against women. It doesn't make the problem a worse problem. But since women are generally already suffering from a lot of other things men are doing to them, it does mean it might be worth spending more effort on it.

    • I keep hearing that a lot: "disproportionately affects women and girls", whether it's about climate change, poverty, road rage, shrinkflation or, apparently, deepfake porn. While in this case I don't doubt that it is true, I wonder: does that really make it worse? Does that statement add anything to the article?

      Decision makers tend to be men of some privilege, so when a problem affects another group (women, minorities, etc.) they tend to downplay its significance.

      Think of statements like that as a reminder the victims still exist even though the reader isn't individually impacted.

  • ...The installation of equipment, such as hidden cameras, to take or record images of someone without their consent.

    And all CCTV cameras in public places are now illegal.

    • From TFS:

      the taking and sharing of intimate imagery without consent

      Intimate is the key word here. Unless you are walking down the street buck nekkid, no laws have been violated. In fact, if you are showing it to the general public in such a manner, you'd have a difficult time claiming any sort of intimacy.

  • by sxpert ( 139117 ) on Friday November 25, 2022 @11:58AM (#63078996)

    "and the installation of equipment, such as hidden cameras, to take or record images of someone without their consent."

    that could be construed as: all semi-visible security cameras are now illegal

    • That was my first thought too. You can bet the bill will be so badly worded that some enterprising lawyer will use this as a defence to get some lowlife off a burglary charge.

    • I'm sure there will be a bunch of clauses similar to "except where such imagery is captured as part of lawful surveillance for the purposes of crime reduction".

      I'm also sure that it will be worded in such a way to ensure that it will be illegal for you to film the police while they are freely filming you.

    • Doesn't the British government have a shit-ton and a half of cameras recording all around London 24/7? Who will they put in jail for those?
    • by djb ( 19374 )

      CCTV is already covered by data protection laws; as long as you make it visible, put up a warning sign and register as an information holder, you're fine.

  • Does anyone envision a scenario where anyone would consent to deepfakes of them being publicly released? So what the hell does "consent" mean?

    • Porn stars might consent to that. Someone generates new content with their likeness, and they consent to the sale of that content in exchange for a cut.
    • James Earl Jones just licensed his voice for future work. That would be consenting to deepfakes. If you, a porn actor, were faced with some specific reason to leave the industry, but were offered a percentage residual for the option to deepfake your likeness into movies, that would probably be an offer worth considering. This also gives protections to industry workers who would otherwise appear in works that they never agreed to.

      Beyond shaming and revenge porn, there are absolutely use cases for this law.

        James Earl Jones just licensed his voice for future work. That would be consenting to deepfakes. If you, a porn actor, were faced with some specific reason to leave the industry, but were offered a percentage residual for the option to deepfake your likeness into movies, that would probably be an offer worth considering. This also gives protections to industry workers who would otherwise appear in works that they never agreed to.

        Beyond shaming and revenge porn, there are absolutely use cases for this law.

        But JEJ (or his estate) did not license it to that Eastern European company which did it first.

    • Does anyone envision a scenario where anyone would consent to deepfakes of them being publicly released

      It depends. In this deepfake, am I fat?

    • I mean, in Hollywood the concept of a body double is a somewhat common thing. Sometimes people thought they'd seen a nude scene with an actress when it had actually been a body double.

      It could be that the actor or actress has someone else play a scene in a movie nude and gives permission to deepfake their face onto it so that they don't have to do the scene themselves.

    • by HiThere ( 15173 )

      Sure. Some actors have signed contracts with studios allowing the studios to use artificially generated images of them. Those are clearly deep-fakes.

  • I'd like to see a porn site that finds a reliable, programmatic way to replace all faces in submitted video with AI-generated deepfakes of people who don't actually exist, to protect privacy and prevent victimization. It would solve the problem even when a submitted video is itself a deepfake. A sketch of what such a pipeline could look like follows.
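
    A rough sketch, assuming OpenCV's bundled Haar cascade for face detection; the GAN face generator is the genuinely hard part, so a pixelation placeholder stands in for it here (a real version might splice in StyleGAN output blended to match lighting):

      import cv2

      def anonymize_frame(frame, cascade):
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
              face = frame[y:y+h, x:x+w]
              # Placeholder: pixelate the face. A real pipeline would instead
              # splice in a GAN-generated face of a person who does not exist.
              small = cv2.resize(face, (8, 8))
              frame[y:y+h, x:x+w] = cv2.resize(
                  small, (w, h), interpolation=cv2.INTER_NEAREST)
          return frame

      def anonymize_video(src_path, dst_path):
          cascade = cv2.CascadeClassifier(
              cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
          cap = cv2.VideoCapture(src_path)
          fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
          size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
                  int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
          out = cv2.VideoWriter(dst_path,
                                cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              out.write(anonymize_frame(frame, cascade))
          cap.release()
          out.release()

    The catch is "reliable": Haar cascades miss profile and occluded faces, and a single missed frame de-anonymizes the subject, so a production version would need a far stronger detector plus temporal tracking.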

    • Either that, or use the faces of stars from the black and white era. As an example, imagine a modern porn made in glorious living color, but appearing to show Rudolph Valentino and Gloria Swanson.
  • There's a lot of hate directed at this new legislation in the comments above, but it makes rational sense. How would you feel if your face was used in porn without your consent or knowledge?

    • The UK needs this because they don't have a "right of publicity". They do have something called "passing off", where you can't pretend to be someone you aren't, but that doesn't cover all the same activity.

    • There's a lot of hate directed at this new legislation in the comments above, but it makes rational sense. How would you feel if your face was used in porn without your consent or knowledge?

      I'd spam it to every TV station on the planet.

    • Obviously making simulated porn of someone and distributing it should be covered by general harassment laws, and it probably is. So it looks like a moral panic to me - at best. At worst, it could be the first step to forbidding things that could be used for this sort of thing, a.k.a. the war on general purpose computing.

      Or maybe someone wants a tool to get those people posting photoshopped pictures of their alleged dead pig activity [wikipedia.org].

      • by namgge ( 777284 )
        AIUI the key thing about the proposal is that it removes the need to prove 'intent' to cause harm/distress in a victim, which has been an obstacle to date.
    • OHMIGOD! Someone is thinking sexual thoughts about me without my consent.

      They may even make a lifelike pic of me in a compromising position without my consent (already covered for photoshopped images, in which case the argument isn't about the medium but the methods used).

      Never mind the multitudes of porn parodies from Star Wars to Edward Penis Hands; will those become illegal too? Exactly how realistic does it have to be before it becomes an affront to my character?

      They may curb aspects of this if money is changing hands (then it b

  • What level of exposed cleavage implies that they want you to look at their tits?

    Or are we judicially rationalizing the difference between "looking" and "LOOKING" now? Is a 1.5 second look offensive? 3 seconds? What if we only use one eye?

    I'm asking because we seem to be carving into actionable, citeable legislation what used to be simply considered rude and impolite, therefore specific definitions are now relevant.

    Because the most absurd thing would be to suggest that they never want you to notice their ti

    • First, let’s get this out of the way: there’s a special place in hell for people who post revenge porn and distribute non-consensual deepfake porn. Definitely should be criminalized.

      That being said, this post has a point. Cleavage images are another matter. Pictures of cleavage happen all the time, often deliberately but often not. While the MRA tone of the post is trolly, it's true that many women often deliberately display their secondary sexual characteristics, for their own benefit and enj
      • What are you two on about? This is an article about deep fakes.

        If a person posts a picture of themselves, that's their choice; if you post a faked picture of them, that's not their choice. The extent of cleavage shown isn't material; it's about consent.

        • The article also discusses the inclusion of "downblousing" pictures. That's why we're discussing them.

          While the idea of making cleavage photos illegal to post without consent certainly sounds great, in practice it means that posting just about any photo of a crowd would be illegal, and any picture taken at a beach, a swimming pool, dance hall or coffee shop would be a sex crime.

          Better to separate that from the revenge porn and deepfake issue, so we can make those things VE
  • One of the arguments in favor of criminalizing "deepfakes" is that we're saving the women and the children. Unfortunately this is just a precursor to banning "other ones and zeroes" which make up, among other things, encryption, authentication, and digital restrictions management (DRM). The UK would like to make ones and zeroes criminal but only in a particular order.

    It's a sure thing that if The Matrix were produced in 2022, the UK would claim that the green characters floating down the screen could also r

  • They have more surveillance than China. I sure as fuck didn't consent to being recorded.
  • by mandark1967 ( 630856 ) on Friday November 25, 2022 @04:43PM (#63079550) Homepage Journal

    Guy1 - (Rings up mate) What's up mate?
    Guy2 - I'm bloody tired mate. I just spent 18 hours making the perfect deepfake porn of Kate Middleton being made airtight by 3 large Great Danes. Sending it your way in a sec. You'll love it!
    Guy1 - Matey, haven't you heard?! UK's criminalized sharing deepfake porn without consent.
    Guy2 - WTF?!
    Guy1 - Yeah. They broadened the scope of current image offenses so that more violators will actually face jail time!
    Guy2 - Bloody hell!! Thanks for letting me know mate! I bloody well don't want to get nicked for sharing it without consent!
    Guy2 - So...do you consent to receiving a copy or not?!
    Guy1 - Send it mate!

  • First of all, I'd encourage folks to read the bill itself [parliament.uk].

    Putting aside some (valid, interesting) questions on free speech here to discuss something else -- how can the State prove that a given image or video is a deepfake? The output of an AI model may resemble a given person, but it seems hard to prove that it qualifies as "manufactured" that way unless the creator discloses the training data & prompt used to seed it -- assuming they even have it anymore by the time the CPS comes asking about

  • This book ended with laws requiring everybody who looked too much like a highly public figure or star to have surgery to change their appearance so as not to violate the image copyright.
  • Stopping various forms of non-consensual porn is fine, but for me that is way down on the list of risks associated with deepfakes. I think there are far more serious issues when deepfakes are used to make the public believe a politician or other public figure made statements that they did not in fact make.

    This could throw elections. Cause wide-scale economic disruption. Create panic, violence, etc. Imagine, right before an election, a video that appears to be a hot mic of a candidate saying something strong
  • A much better solution would be to require that all generated content be explicitly labeled as such. Then who cares if someone uses your likeness to make porn, or draw a cartoon, or whatever; anyone watching would know it's fake (or at the very least could easily find out). A sketch of what such a label could look like follows.
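
    For the curious, a minimal sketch of what a machine-readable label could look like: the generator signs a small "synthetic" manifest bound to the file's bytes, and anyone can check it. This is roughly the idea behind the real C2PA / Content Credentials effort, but everything here (the shared-secret signing especially) is illustrative, not that spec:

      import hashlib, hmac, json

      # Illustrative only: a real scheme would use public-key signatures,
      # not a shared secret baked into the verifier.
      SIGNING_KEY = b"demo-signing-key"

      def make_label(media_bytes, generator="hypothetical-model-v1"):
          manifest = {
              "synthetic": True,
              "generator": generator,
              "sha256": hashlib.sha256(media_bytes).hexdigest(),
          }
          payload = json.dumps(manifest, sort_keys=True).encode()
          tag = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
          return {"manifest": manifest, "tag": tag}

      def verify_label(media_bytes, label):
          payload = json.dumps(label["manifest"], sort_keys=True).encode()
          expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
          return (hmac.compare_digest(label["tag"], expected)
                  and label["manifest"]["sha256"]
                  == hashlib.sha256(media_bytes).hexdigest())

    The obvious weakness: a label only proves what a cooperating generator attached, and stripping it or re-encoding the file defeats it, which is why labeling complements rather than replaces laws like this one.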
  • A deep pr0n video of Trump and Putin [censored] -- will the UK put me in jail?
