UK To Criminalize Deepfake Porn Sharing Without Consent (techcrunch.com) 116
Brace for yet another expansion to the UK's Online Safety Bill: The Ministry of Justice has announced changes to the law which are aimed at protecting victims of revenge porn, pornographic deepfakes and other abuses related to the taking and sharing of intimate imagery without consent -- in a crackdown on a type of abuse that disproportionately affects women and girls. From a report: The government says the latest amendment to the Bill will broaden the scope of current intimate image offences -- "so that more perpetrators will face prosecution and potentially time in jail."
Other abusive behaviors that will become explicitly illegal include "downblousing" (where photographs are taken down a woman's top without consent); and the installation of equipment, such as hidden cameras, to take or record images of someone without their consent. The government describes the planned changes as a comprehensive package of measures to modernize laws in this area.
Brexit (Score:1)
The only thing worse than being governed by the anglophobes of Brussels is being governed by the English themselves.
Re: (Score:1)
protecting victims of revenge porn, pornographic deepfakes
Victims? I think phrasing this in such terms muddles the conversation. How can they be a victim? Maybe if you sling it as authentic and then use that to shame or disrupt someone's life, but there are already slander (or libel, since it is more written (in code) than it is said?) laws to punish such an act. Put a deepfake watermark in the corner and be done with it! What's next? Illegal to imagine your favorite celebrity crush? Drawing them in sexual ways? No. None of this.
Re: (Score:2)
"revenge porn" can be authentic. It's the distribution without consent that's a problem.
"deepfake porn" is a separate item.
Re: (Score:2)
Free Speech? (Score:2, Interesting)
Would such criminal law be possible in the US, or would some justices in the Supreme Court consider blatant lying a form of free speech like they did in 2012 (US vs. Alvarez)?
Re: (Score:3, Interesting)
I've not read the particular citation you quoted, but just considering it at face value...why would lying NOT be free speech?
Telling a lie is not against the law in most instances.
The cops can lie to you all they want.
About the only time you can't lie is if committing fraud/libel/slander, or when the Feds are questioning you, in which case the best thing to do is clam up and lawyer up.
But there's no compelling law to force you to tell the truth.
Re: (Score:2)
What am I missing?
That your average person is an idiot.
Re: (Score:1)
What am I missing?
That your average person is an idiot.
copyright law and trademark law as they apply to images.
Re: (Score:2)
I've not read the particular citation you quoted, but just considering it face value...why would lying NOT be free speech?
Fraud. Slander. Duh.
Re: (Score:1)
You must have missed that part where I listed Fraud and slander (and even libel) as exceptions.
Re: (Score:2)
I believe that in certain cases lying is definitely free speech, but then it isn't black & white. Lying with the intent of causing material harm is actionable speech. For example, if a woman falsely tells her husband she was raped by a specific person while knowing there was a high chance the husband would shoot that person .. wouldn't that be murder? Or more clear cut, that some poisonous chemical was safe to drink.
Re: (Score:1)
Or more clear cut, that some poisonous chemical was safe to drink.
Like, for example, saying that drinking bleach could cure you of COVID?
Re: (Score:2)
Why wouldn't deep fakes be considered a form of defamatory speech? Now, of course, at least in most Common Law jurisdictions, that's a civil tort, but I see no reason why one couldn't create a criminal statute. Perhaps it would be harder in the US, but it certainly would be no issue for the UK, where constitutional rights are pretty much whatever Parliament says they are. In either case, it certainly remains a form of defamatory speech, perhaps could even be viewed as criminal harassment.
Re: (Score:2, Interesting)
Is protecting a woman's dignity something you don't like? Is this some sort of Incel response? I don't understand the moral graveyard that a comment like this comes from.
Re: (Score:2)
I don't understand the moral graveyard that a comment like this comes from.
He had a bad relationship with his mom, and extrapolated it out to "all women are bad."
Re: (Score:2)
Even if it is clearly labelled as fake, it's still involuntary sexualization of the victim.
Re: Free Speech? (Score:2)
But there's no compelling law to force you to tell the truth
Indirectly, there is. Get subpoenaed to testify before Congress (the Feds, for the purpose of making false statements) and see how that works out for you.
Re: (Score:1)
Maybe you've missed earlier when I mentioned it was a problem if you lied to the Feds.
Re: (Score:1)
Telling a lie is not against the law in most instances.
Depends on the lie. Spreading falsehoods about some political party taking your guns away = ok, slandering or lying in a way that causes a harm directly linked to your lie on the other hand = not ok, as Alex Jones just found out recently.
Re:Free Speech? (Score:5, Insightful)
So go ahead and do all the Nancy Pelosi X Donald Trump slash fiction you want, complete with hand drawn illustrations. Because nobody can mistake that for reality. But if you use computers to do it to the point where someone could reasonably think it's real, we've got a problem.
Re: (Score:2)
I expect this will happen in an election in the not too distant future. It probably won't be pornographic. Maybe even just a deepfake of their voice saying something that damages their campaign.
Re: (Score:1)
Re: Free Speech? (Score:2, Insightful)
Because who decides what is a lie and what is a truth.
The earth is round, a few centuries ago that was a lie that could get you burned at the stake, it is still a lie technically speaking.
Only people with delusions of grandeur and power hungry despots would claim that only "truth" is protected speech.
Re: (Score:3)
You realize that is what a trial does right? A bunch of people sit around deciding whether the prosecution or the defense is telling the truth. Life or death is often decided that way.
Re: (Score:2)
You realize that is what a trial does right? A bunch of people sit around deciding whether the prosecution or the defense is telling the truth. Life or death is often decided that way.
Yes, exactly, it's an expensive, imperfect process that can't possibly be used to pervasively regulate all speech.
It's difficult enough to use for serious crime.
Re: Free Speech? (Score:2)
Well, no. Nobody since the days of early Greece has believed in anything other than a round Earth.
Re: (Score:2)
That's not true. There have been people since then in (probably) every decade who knew that, but there've been lots of people who didn't. (For that matter, I believe that there are people right now who don't know that the Earth is approximately round. It depends on which sources you trust. *You* aren't going to do the experiments.)
Re: (Score:2)
I've done the experiment. It involves a sufficiently large lake you know the length of and a tall structure at one end (a tree works).
You measure the height using trigonometry and the known distance from close to the object and then from the other end of the lake.
Since the object hasn't shrunk, the change in measurement is due to the curvature of the Earth. You can use this to calculate the circumference. Accuracy is good.
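The calculation described above can be sketched numerically. This is a minimal illustration with made-up numbers (a hypothetical 30 m tree, sighted from 100 m and from 5 km across a lake); the function name and the specific inputs are my own, not the poster's actual measurements:

```python
import math

def circumference_from_sightings(h_near, d_near, h_far, d_far):
    """Estimate Earth's circumference from two apparent-height
    measurements of the same object.

    h_near: apparent height (m), measured from d_near metres away
    h_far:  apparent height (m), measured from d_far metres away

    The far measurement comes out smaller because the ground drops
    away from the sight line by roughly d^2 / (2R) at distance d,
    so R ~ (d_far^2 - d_near^2) / (2 * (h_near - h_far)).
    """
    drop = h_near - h_far                         # hidden height (m)
    radius = (d_far**2 - d_near**2) / (2 * drop)  # Earth radius (m)
    return 2 * math.pi * radius                   # circumference (m)

# Illustrative numbers: ~1.96 m of a 30 m tree hidden at 5 km.
c = circumference_from_sightings(30.0, 100.0, 28.04, 5000.0)
print(f"{c / 1000:.0f} km")  # comes out near 40,000 km
```

The small-angle approximation for the drop is good as long as the lake is short compared to the Earth's radius; atmospheric refraction (mentioned in a reply below) is the main real-world error source.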
Re: (Score:2)
It's quite interesting, then, that people have done that kind of experiment and come up with the opposite conclusion. I think it partially has to do with error bars. (For one thing, light is bent near the ground by the thicker atmosphere. This is probably too slight an effect to be noticeable most of the time, but consider mirages.)
IIUC, the argument about the top of a ship being the first part visible is fallacious. That's one that makes the size of the earth appear wrong. (I can't remember the de
Re: (Score:2)
Because who decides what is a lie and what is a truth.
The earth is round, a few centuries ago that was a lie that could get you burned at the stake, it is still a lie technically speaking.
None of that is true. Saying the earth is round was not a "lie". It was heresy, and heresy is not interested in honesty but dogma.
I think generally the replies to my comment confuse "being wrong" with "lying". Lying is a deliberate act and very often done specifically to cause damage to some party. I don't see any reason why a deliberate act of lying should be protected in any way from censorship or punishment, any more than any other deliberate act which causes harm.
Re: (Score:1)
Again, the person that makes it a capital punishment whether something is a lie or something is true is the one with true power (and all government decisions on that are ultimately capital punishments)
Re: (Score:2)
Again, the person that makes it a capital punishment whether something is a lie or something is true is the one with true power (and all government decisions on that are ultimately capital punishments)
Oh, we've moved on from the government not protecting lies to making it a capital offense to lie, have we?
Piss off.
Re: (Score:2)
What is the difference?
What's the difference between having to remove a falsehood from a website and being hanged? Gee. Why don't you go to China and find out?
Re: (Score:2)
In your world, if you refuse to remove the "falsehood" as the government dictates, you get shot or jailed, like China.
OK. You are still the only person here calling for the death penalty.
Going back to the "jailed" - so what? We jail people for things all the time. Why is lying so sacred to you that it should be above the law? What is it you're so keen to lie about?
Re: (Score:2)
Isn't it the death penalty when the police come knocking at your door with a SWAT team?
Why would they do that?
The ultimate remediation for the government is violence. If you don't pay your taxes, they come to your door with a police officer, if you don't agree to go with them, they will threaten you to go with them dead or alive.
You can say that about anything; it's meaningless to shout "help! help! I'm being repressed!" every time that you get into trouble for something illegal, no matter how trivial.
Same here, if you don't agree that something is a lie, then they come to your door with a police officer, if you don't agree with them, you will be taken by force.
You've skipped a bit in your urge to prove your paranoid delusion. What you're actually saying is that if you don't agree to come to a trial where they will have to prove your guilt then they will come to your door. In the US this may ultimately result in you being shot, but that's a problem with the US.
What is so hard to understand, truth and lies, as I previously demonstrated, is relative
Claiming
Re: (Score:2)
Why should [lying be treated as free speech]?
Because all speech should be considered free speech unless restricting speech would prevent significant damage to others. There are restrictions which criminalize lying, such as defamation, libel, and slander. But overall lying should still be considered protected in most cases.
In the court case referenced by the GP, the ruling wasn't that lying is protected. It held that the law restricting lying was overly broad and that the lying wasn't causing clear damage.
Re: (Score:2)
Re: (Score:2)
Are you nuts? Lying is and should remain very legal under most circumstances. The SCOTUS affirmed that [justia.com]. Explain why, given the instances in that case and your opinion, what Alvarez said should be illegal.
I don't care what a bunch of old foreign judges say about this. Lying - not "being wrong" is not something that should be protected. It just doesn't make sense.
Re: (Score:2)
If by "foreign judges" you're implying that you don't live in the U.S., then I can understand where you're coming from. Until you've had freedom of speech like I do in the U.S., you don't know what you're missing. And yes, I'm an immigrant from Iran.
But I do want to make sure you understand what you're advocating. In the U.S., there are criminal violations, and then there are non-criminal violations. The government making something illegal equates to it being a criminal violation. Any violation that's
Re: (Score:2)
Why should it not?
One would start with the premise that ALL speech is free...and only limit as needed from there.
Besides...who is the arbiter of truth?
Re: (Score:2)
Why should it not?
One would start with the premise that ALL speech is free...and only limit as needed from there.
Yes. And we need to not protect lying.
Besides...who is the arbiter of truth?
I'm not talking about truth; I'm talking about lying - the deliberate statement of something the speaker knows not to be true.
Re: (Score:2)
What am I missing?
That lying should not be treated as free speech.
Why should it?
Because it's impossible to establish on a society-wide, pervasive basis who or what is "lying".
Unless you want a Minitru. Which could totally work and not go wrong, I'm sure ...
Re: (Score:2, Interesting)
So a deepfake is supposed to be indistinguishable from reality. So the real first question would be: how do you spot it?
Then the answer to your question might then first be: Would you consider it slander or defamation of character if someone were to insert your likeness into graphic gay sex porn, or some extreme kink porn that is highly deviant even if legal; and then distribute it on the internet? I'll cut to the chase, in 23 states there are still laws against criminal libel. Yes you can go to jail. Person
Re: (Score:2)
Try "indistinguishable from reality under casual observation". Certainly current deep-fakes can be detected, but they can also fool people who don't do things like check for watermarks, look for irregular pixel shading, and sometimes even just count eye-blinks.
Re: (Score:2)
FTFY
The technology that has improved deep-fakes - GAN - can, with sufficient processing power, create an image that is undetectable. Omitting watermarks is obvious. Pixel shading is a matter of adequate rendering. Number and rate of eye-blinks can be randomized to become indistinguishable from the real thing. Asymmetry is already included. Changing light sources - and their effect on eyes - is already at the point where some PCs can render something undetectable
Re: (Score:2)
It is criminal to post pictures of others naked on the internet.
Is that in the UK (I guess; or, what country, or "since when")? Not sure what law you are referring to.
Re: (Score:2)
https://nypost.com/2021/03/18/... [nypost.com]
Re: (Score:2)
https://www.legalvoice.org/non... [legalvoice.org]
Re: (Score:2)
Criminal? Maybe not. Unauthorized use of personal likeness is a copyright thing. Or related anyway:
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
or would some justices in the Supreme Court consider blatant lying a form of free speech like they did in 2012 (US vs. Alvarez)?
The majority opinion did not claim blatant lying could not be outlawed. It ruled that the Stolen Valor Act specifically was too broad. They also felt the government couldn't show the general perception of military awards was diluted by false claims, like those Alvarez made.
There are a number of free speech restrictions related to lying, so I'm not sure what your point is other than complaining about a single ruling you didn't like.
Re: (Score:2)
I think it may fall under the same types of logic as libel and slander laws, where public figures will have a harder case to claim that they are a victim than us normies.
For example a Deep Fake of Scarlett Johansson could be illegal, while a similar deep fake of Scarlett Johansson as Black Widow would be legal. As one would be a misleading show of a real person, while the other is based on a fictional character. However the line would be complex, and probably will have a long legal battle.
However if a d
Re: (Score:2)
Re: (Score:1)
Better yet, Trump on Clinton deepfake action!
yech
Re: (Score:2)
You need help. Seek professional advice.
What about generating? (Score:2)
taking and sharing of intimate imagery without consent
So what about generating such porn, is that illegal?
The problem is, eventually we will get programs powerful enough to allow anyone to generate deepfake porn on their home PC, using some porn material on one hand, and enough video/pictures from a certain person which you wish to be the porn star. Then everybody can home generate porn of any celebrities or anyone you can get hold of enough video of. By then it will be too late to stuff the genie back into the bottle.
Re: (Score:2)
Re: (Score:2)
Create an animation showing, say, an adult diddling a child, and that's CSAM in UK, despite no C being SA'd.
Re: (Score:1)
taking and sharing of intimate imagery without consent
So what about generating such porn, is that illegal?
The problem is, eventually we will get programs powerful enough to allow anyone to generate deepfake porn on their home PC, using some porn material on one hand, and enough video/pictures from a certain person which you wish to be the porn star. Then everybody can home generate porn of any celebrities or anyone you can get hold of enough video of. By then it will be too late to stuff the genie back into the bottle.
The problem that we don't want to stuff the genie into the bottle, but the bottle into the genie.
Re: (Score:1)
So what about generating such porn, is that illegal?
I came here with the exact same question.
Will they eventually make it illegal to distribute an app or website where entering in any known names or images of public figures is allowed?
It's like saying you are going to ban the distribution of hot earl grey tea in the world of Star Trek, where anyone can consume it in their chambers on demand.
Re: (Score:2)
We do this all the time. We have laws against shouting "FIRE!" unnecessarily in crowded theatres: anyone CAN yell it though. We have laws against murder, but anyone CAN murder someone. We have trade laws that prevent the import and sale of lots of different goods, to poke holes in your Star Trek analogy.
You can already use Photoshop to make images of Emma Watson nude doing lewd things, but we aren't banning Photoshop, that would be silly.
There's already been rulings about this sort of thing in the U.S.
https [towardsdatascience.com]
"Affects women and girls" (Score:1)
Re: "Affects women and girls" (Score:2, Troll)
Re: (Score:2)
I wonder: does that really make it worse?
What makes it worse for women is that it's typically perpetrated by men against women. It doesn't make the problem a worse problem. But since women are generally already suffering from a lot of other things men are doing to them, it does mean it might be worth spending more effort on it.
Re: (Score:1)
I keep hearing that a lot: "disproportionately affects women and girls", whether it's about climate change, poverty, road rage, shrinkflation or apparently deepfake porn. While in this case I don't doubt that it is true, I wonder: does that really make it worse? Does that statement add anything to the article?
Decision makers tend to be men of some privilege, so when a problem affects another group (women, minorities, etc) they tend to downplay its significance.
Think of statements like that as a reminder the victims still exist even though the reader isn't individually impacted.
cctv cameras (Score:1)
...The installation of equipment, such as hidden cameras, to take or record images of someone without their consent.
And all cctv cameras in public places are now illegal.
Re: cctv cameras (Score:2)
From TFS:
the taking and sharing of intimate imagery without consent
Intimate is the key word here. Unless you are walking down the street buck nekkid, no laws have been violated. In fact, if you are showing it to the general public in such a manner, you'd have a difficult time claiming any sort of intimacy.
CCTV surveillance now illegal ? (Score:3, Interesting)
"and the installation of equipment, such as hidden cameras, to take or record images of someone without their consent."
that could be construed as : all semi-visible security cameras are now illegal
Re: (Score:2)
That was my first thought too. You can bet the bill will be so badly worded that some enterprising lawyer will use this as a defence to get some low life off a burglary charge.
Re: (Score:1)
I'm sure there will be a bunch of clauses similar to "except where such imagery is captured as part of lawful surveillance for the purposes of crime reduction".
I'm also sure that it will be worded in such a way to ensure that it will be illegal for you to film the police while they are freely filming you.
Re: (Score:2)
Re: (Score:2)
CCTV is already covered by data protection laws; as long as you make it visible, have a warning sign, and register as an information holder, you're fine.
"Deepfake Porn Sharing Without Consent " (Score:2)
Does anyone envision a scenario where anyone would consent to deepfakes of them being publicly released? So what the hell does "consent" mean?
Re: (Score:2)
Re: (Score:2)
James Earl Jones just licensed his voice for future work. That would be consenting to Deep Fakes. If you, a porn actor, were faced with any specific reason to leave the industry, but were offered a percentage residual for the option to deep fake your likeness into movies, that would probably be an effort worth considering. This also gives protections to industry workers who would otherwise appear in works that they never agreed to.
Beyond shaming and revenge porn, there's absolutely use cases for this law.
Re: (Score:1)
James Earl Jones just licensed his voice for future work. That would be consenting to Deep Fakes. If you, a porn actor, were faced with any specific reason to leave the industry, but were offered a percentage residual for the option to deep fake your likeness into movies, that would probably be an effort worth considering. This also gives protections to industry workers who would otherwise appear in works that they never agreed to.
Beyond shaming and revenge porn, there's absolutely use cases for this law.
But JEJ (or his estate) did not license it to that Eastern European company which did it first.
Re: (Score:2)
Does anyone envision a scenario where anyone would consent to deepfakes of them being publicly released
It depends, in this deepfake, am I fat?
Re: (Score:2)
I mean in Hollywood the concept of a body-double is somewhat common thing. Sometimes people thought they'd seen a nude scene with an actress but it had actually been a body-double.
It could be that the actor or actress has someone else play a scene in a movie nude and gives permission to deepfake their face onto it so that they don't have to do the scene themselves.
Re: (Score:2)
Sure. Some actors have signed contracts with studios allowing the studios to use artificially generated images of them. Those are clearly deep-fakes.
Future of porn should all be deepfaked by default (Score:2)
I'd like to see a porn site that finds a reliable, programmatic way to replace all faces in submitted video with AI-generated deepfakes of people that don't actually exist, to protect privacy and prevent victimization. It would solve the problem even for a submitted video that is itself deepfaked.
Re: (Score:2)
Re: (Score:2)
Are they out of copyright yet? You might need to wait a decade.
It kind of makes sense. (Score:2)
There's a lot of hate directed at this new legislation in the comments above, but this legislation makes rational sense. How would you feel if your face was used in porn without your consent or knowledge?
Re: (Score:2)
The UK needs this because they don't have a "right of publicity". They do have something called "passing off", where you can't pretend to be someone you aren't, but that doesn't cover all the same activity.
Re: (Score:1)
There's a lot of hate directed at this new legislation in the comments above, but this legislation makes rational sense. How would you feel if your face was used in porn without your consent or knowledge?
I'd spam it to every TV station on the planet.
Re: (Score:2)
Obviously making simulated porn of someone and distributing it should be covered by general harassment laws, and it probably is. So it looks like a moral panic to me - at best. At worst, it could be the first step to forbid things that could be used for this sort of thing, a.k.a the war on general purpose computing.
Or maybe someone wants a tool to get those people posting photoshopped pictures of their alleged dead pig activity [wikipedia.org].
Re: (Score:3)
Re: (Score:2)
OHMIGOD! Someone is thinking sexual thoughts about me without my consent.
They may even make a lifelike pic of me in a compromising position without my consent (already covered in photoshoped images, in which case the argument isn't about the media, but the methods used).
Nevermind the multitudes of porn parodies from Star Wars to Edward Penis Hands; will those become illegal too? Exactly how realistic does it have to be before it becomes an affront to my character?
They may curb aspects of this if money is changing hands (then it b
genuine question (Score:1)
What level of exposed cleavage implies that they want you to look at their tits?
Or are we judicially rationalizing the difference between "looking" and "LOOKING" now? Is a 1.5 second look offensive? 3 seconds? What if we only use one eye?
I'm asking because we seem to be carving into actionable, citeable legislation what used to be simply considered rude and impolite, therefore specific definitions are now relevant.
Because the most absurd thing would be to suggest that they never want you to notice their ti
Re: (Score:2)
That being said, this post has a point. Cleavage images are another matter. Pictures of cleavage happen all the time, often deliberate but often not. While the MRA-tone of the post is trolly, its true that many women often deliberately display their secondary sexual characteristics, for their own benefit and enj
Re: (Score:2)
What are you two on about, this is an article about deep fakes.
If a person posts a picture of themselves that's their choice, if you post a faked picture of them that's not their choice, extent of cleavage shown isn't material, it's about consent.
Re: (Score:2)
While the idea of making cleavage photos illegal to post without consent certainly sounds great, in practice it means that posting just about any photo of a crowd would be illegal, and any picture taken at a beach, a swimming pool, dance hall or coffee shop would be a sex crime.
Better to separate that from the revenge porn and deepfake issue, so we can make those things VE
Criminalizing ones and zeroes (Score:1)
One of the arguments in favor of criminalizing "deepfakes" is that we're saving the women and the children. Unfortunately this is just a precursor to banning "other ones and zeroes" which make up, among other things, encryption, authentication, and digital restrictions management (DRM). The UK would like to make ones and zeroes criminal but only in a particular order.
It's a sure thing that if The Matrix was produced in 2022 the UK would claim that the green characters floating down the screen could also r
So the UK says you need consent to video someone? (Score:1)
Consent is important! (Score:3)
Guy1 - (Rings up mate) What's up mate?
Guy2 - I'm bloody tired mate. I just spent 18 hours making the perfect deepfake porn of Kate Middleton being made airtight by 3 large Great Danes. Sending it your way in a sec. You'll love it!
Guy1 - Matey; Haven't you heard?! UK's criminalized sharing deepfake porn without consent.
Guy2 - WTF?!
Guy1 - Yeah. They broadened the scope of current image offenses so that more violators will actually face jail time!
Guy2 - Bloody hell!! Thanks for letting me know mate! I bloody well don't want to get nicked for sharing it without consent!
Guy2 - So...do you consent to receiving a copy or not?!
Guy1 - Send it mate!
How is this supposed to work in practice? (Score:2)
First of all, I'd encourage folks to read the bill itself [parliament.uk] directly.
Putting aside some (valid, interesting) questions on free speech here to discuss something else -- how can the State prove that a given image or video is a deepfake? The output of an AI model may resemble a given person, but it seems hard to prove that it qualifies as "manufactured" that way unless the creator discloses the training data & prompt used to seed it -- assuming they even have it anymore by the time the CPS comes asking about
think I read this book (Score:2)
Missing the real risks of deep fakes (Score:2)
This could throw elections. Cause wide scale economic disruption. Create panic, violence etc. Imagine right before an election, a video that appears to be a hot-mike of a candidate saying something strong
Re: (Score:2)
You don't have copyright for your likeness in the U.K.
A much better solution - clear, mandatory labeling (Score:2)
If I make (Score:2)
A deep pr0n video of Trump and Putin [censored], will the UK put me in jail?