WhatsApp, Signal and Encrypted Messaging Apps Unite Against UK's Online Safety Bill (bbc.com) 69
WhatsApp, Signal and other messaging services have urged the UK government to rethink the Online Safety Bill (OSB). From a report: They are concerned that the bill could undermine end-to-end encryption - which means a message can be read only on the sender's and the recipient's apps and nowhere else. Ministers want the regulator to be able to ask the platforms to monitor users, to root out child abuse images. The government says it is possible to have both privacy and child safety. "We support strong encryption," a government official said, "but this cannot come at the cost of public safety. "Tech companies have a moral duty to ensure they are not blinding themselves and law enforcement to the unprecedented levels of child sexual abuse on their platforms. "The Online Safety Bill in no way represents a ban on end-to-end encryption, nor will it require services to weaken encryption." End-to-end encryption (E2EE) provides the most robust level of security because nobody other than the sender and intended recipient can read the message information. Even the operator of the app cannot unscramble messages as they pass across systems - they can be decrypted only by the people in the chat. "Weakening encryption, undermining privacy and introducing the mass surveillance of people's private communications is not the way forward," an open letter warns.
Government's Side Of The Argument (Score:5, Funny)
But think of the children...
But is Government always right? Sometimes it's Left...and sometimes it's just plain wonky.
Re:Government's Side Of The Argument (Score:4, Interesting)
There's three ways to do anything: The right way, the wrong way, and the government way.
The problem is that the government has powers to make your life very, very miserable if you don't comply.
Re: (Score:1)
There's three ways to do anything: The right way, the wrong way, and the government way.
The problem is that the government has powers to make your life very, very miserable if you don't comply.
The 4th way to do things is the Disney way...make the State of Florida and Gubernor Ron the Saint miserable.
Re: (Score:3)
DeSantis, putting the goober in Gubernatorial since 2019
Re: (Score:3)
“Conservative” Ron DeSantis is spending taxpayer money at a rate of $1300 an hour to pick a fight with Disney. https://www.businessinsider.co... [businessinsider.com]
Re: (Score:2)
Ron DeSantis is spending taxpayer money at a rate of $1300 an hour to pick a fight
What a Mickey Mouse way to fight Disney.
Re: (Score:1)
But is Government always right?
The government is made of politicians who, as shown here, can't help but lie, even when there is no need to and lying actively harms their goals.
So no, government is never right; they can't seem to help it.
All they have to do is throw the choice out there.
Do you want communications protected at the expense of anyone seeing those communications?
Or do you want to enforce laws on those communications by eliminating protected communications?
I have a feeling a huge number of people would be i
Re: Government's Side Of The Argument (Score:1)
Giving the majority what it wants is arguably why it's almost impossible to buy something that isn't a piece of shit, broken before it even leaves the store - the majority wants cheap at the expense of everything else - despite the fact that 'cheap is expensive'.
Extrapolate to explain all the ills of the world.
Re: Government's Side Of The Argument (Score:1)
Clearly no one can argue with preventing child exploitation (presumably why this benefit is highlighted).
Can't they come up with anything else?
* We're super-nosy and want to eavesdrop on everyone's private time
* We want to go on a fishing expedition so we can determine how to get the best ROI on future plans to make X illegal
?
Re: (Score:2)
I'd be willing to give them at least SOME credit if they were actually trying to prevent child abuse (sexual or otherwise) nearly as actively as they pursue the distribution of CSAM.
It's been decades of interwebs with the same old tired "for the children" rallying cry when they're not actually doing anything tangible for them. Kind of reminds me of how some folks are approaching abortion...but I digress.
You hear so little about them going after the CREATION of CSAM where a child is abused, and so much about the follow-on
Re: Government's Side Of The Argument (Score:1)
What should happen? And why isn't it?
Re: (Score:2)
Re: (Score:1)
Ministers want the regulator to be able to ask the platforms to monitor users, to root out child abuse images.
They can already ask for transparency.
The government says it is possible to have both privacy and child safety.
It is possible to have opacity, and transparency for child safety.
"We support strong encryption," a government official said, "but this cannot come at the cost of public safety.
We support opacity, except for transparency for "public safety".
"Tech companies have a moral duty to ensure they are not blinding themselves and law enforcement to the unprecedented levels of child sexual abuse on their platforms.
We don't support opacity. We want transparency and we want services to provide transparency.
"The Online Safety Bill in no way represents a ban on end-to-end encryption, nor will it require services to weaken encryption."
We want opacity, we don't want services to provide transparency.
Re: (Score:1)
The British government lost all credibility when they protected one of the most prolific sex offenders of all time, Jimmy Savile. https://en.m.wikipedia.org/wik... [wikipedia.org]
Re: (Score:2)
You might have also heard of this small cult-like organization that's been around for a bit and is known for harboring sex offenders.
They call themselves the catholic church or something like that.
Re: (Score:3)
Maybe it will be like those scanners and printers that refuse to scan/print money.
Re: (Score:2)
Maybe it will be like those scanners and printers that refuse to scan/print money.
Depending on how the law is worded, this might actually be sufficient. If a regulator asks a platform to monitor its users for child sexual abuse, the platform turns on a switch in the app which causes every image to be compared to images of child sexual abuse provided by the regulator. If there is a match, the app refuses to send the image.
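A minimal sketch of that switch, assuming the regulator distributes a plain set of SHA-256 digests (the list contents and function names here are invented for illustration):

import hashlib

# Hypothetical block list of hex digests supplied by the regulator;
# a real deployment would download and update this set.
REGULATOR_DIGESTS: set[str] = set()

def may_send(image_bytes: bytes) -> bool:
    # Refuse to send when the image's digest is on the block list.
    return hashlib.sha256(image_bytes).hexdigest() not in REGULATOR_DIGESTS

Note that an exact digest only matches bit-identical files, which is where the objections further down about single-pixel changes come in.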
Re: (Score:2)
which, of course, means no end-to-end encryption, because your comms are first sent to the tech giant's approval centre.
The government also quietly adds a clause saying "child porn, and other government-requested data for specified users" and thus the surveillance state is created as the government checks up on anything anyone might send whenever they like, perhaps even with clauses saying the tech companies cannot inform anyone that they are even checking your comms.
Meanwhile, the real child por
Re: (Score:2)
which, of course, means no end-to-end encryption, because your comms are first sent to the tech giant's approval centre.
No, the image to be tested would stay on the sender's computer and be compared there with images downloaded from the regulator.
Re: (Score:2)
I don't think you've thought this through. You seem to be suggesting the regulator will send you illegal images, all of them, burying you in data and automatic crime.
The way things like this are typically done is you send them a hash of your image and they perform a hash check and respond with a Yes/No. Ignoring the huge potential for misuse here, all it takes is a single p
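For reference, that hash-and-ask flow would look roughly like this; the regulator's service is stubbed out locally here because no such public API exists:

import hashlib

# Local stand-in for the regulator's lookup service; in the described
# scheme this lookup happens on the regulator's server, which answers
# only Yes or No.
_REGULATOR_DB: set[str] = set()  # hex digests of known illegal images

def regulator_says_yes(digest: str) -> bool:
    return digest in _REGULATOR_DB

def is_blocked(image_bytes: bytes) -> bool:
    return regulator_says_yes(hashlib.sha256(image_bytes).hexdigest())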
Re: (Score:2)
It hardly matters either way. Criminals will always find a way to break the law. That and any kind of database of existing media to watch for won't stop anyone from sending newly created content.
Re: (Score:2)
There's no reason that the local app couldn't have the hashes if all you really care about is not being able to send copies of existing material that's been identified. There are also plenty of algorithms that produce hashes which don't depend on single pixel or other common types of changes that are designed to handle these types of common manipulations.
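One common example of such an algorithm is a difference hash (dHash): the image is shrunk to a tiny grayscale grid and the hash records which neighbouring pixels are brighter, so a single-pixel edit usually leaves most bits unchanged. A minimal sketch, assuming Pillow is available:

from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    # Shrink to a (size+1) x size grayscale grid and hash the
    # left-to-right brightness gradients.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")  # small Hamming distance => near-duplicate

Matching then becomes a distance threshold rather than an equality test, which is exactly the "what counts as near" question raised later in the thread.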
I would be interested in evaluating a hash that will correctly flag the vast majority of test images as belonging to the database, and will not incorrectly flag any images that do not belong to the database.
It hardly matters either way. Criminals will always find a way to break the law. That and any kind of database of existing media to watch for won't stop anyone from sending newly created content.
Apparently, new child sexual abuse images are expensive to produce. Most trading in such images is done with existing material, which is in the database.
Re: (Score:2)
Apparently, new child sexual abuse images are expensive to produce. Most trading in such images is done with existing material, which is in the database.
Perhaps for now, recent examples of machine learning 'art' suggest this may not hold for long. In which case you might not want to publish any picture at all of your kids, anywhere.
snake
Re: (Score:3)
Apparently, new child sexual abuse images are expensive to produce. Most trading in such images is done with existing material, which is in the database.
Perhaps for now, recent examples of machine learning 'art' suggest this may not hold for long. In which case you might not want to publish any picture at all of your kids, anywhere.
snake
Interesting. There is no way to avoid publishing pictures of one's children, short of living in a cave. I wonder what the law would do about a picture that was generated from a prompt like "Make a picture of a child being abused". The law forbidding pictures of child abuse gets around the First Amendment by saying that such pictures are evidence of child abuse, but a synthesized picture isn't evidence of anything. This is an area that the law will struggle with, I think.
Re: client-side scanning (Score:1)
They just need to update the laws to reflect their intention more clearly:
* Thou shalt not engage in thoughts which would lead others to believe you are a danger.
I'm waiting for this same discussion once the use of assistive brain implants is widespread.
Re: (Score:2)
"No, the image to be tested would stay on the sender's computer and be compared there with images downloaded from the regulator."
I don't think you've thought this through. You seem to be suggesting the regulator will send you illegal images, all of them, burying you in data and automatic crime.
The images can be encrypted with the regulator's public key. The sender's computer would encrypt each of its images using the regulator's public key before comparing it to the downloaded encrypted images. Having these encrypted images would not be illegal because without the regulator's private key they cannot be viewed.
Storage is cheap. It should be possible to store an encrypted copy of every image in the regulator's illegal image database on a standard cell phone. If not, the regulator can pay the extra cost.
Re: (Score:1)
The images can be encrypted with the regulator's public key. The sender's computer would encrypt each of its images using the regulator's public key before comparing it to the downloaded encrypted images. Having these encrypted images would not be illegal because without the regulator's private key they cannot be viewed.
Storage is cheap. It should be possible to store an encrypted copy of every image in the regulator's illegal image database on a standard cell phone. If not, the regulator can pay the extra cost.
You think all phones should have an encrypted folder containing all known CSAM images/videos?! Constantly updated, I assume. You say storage is cheap, but we'd be talking about many, many terabytes of space. This is the most asinine thing I have ever heard.
Re: (Score:2)
The images can be encrypted with the regulator's public key. The sender's computer would encrypt each of its images using the regulator's public key before comparing it to the downloaded encrypted images. Having these encrypted images would not be illegal because without the regulator's private key they cannot be viewed.
Storage is cheap. It should be possible to store an encrypted copy of every image in the regulator's illegal image database on a standard cell phone. If not, the regulator can pay the extra cost.
You think all phones should have an encrypted folder containing all known CSAM images/videos?! Constantly updated, I assume. You say storage is cheap, but we'd be talking about many, many terabytes of space. This is the most asinine thing I have ever heard.
I started using computers in 1963, when core memory was a dollar a word. Storage prices have fallen off a cliff since then. Even if the database does occupy terabytes of space, that amount of storage will soon be cheap. I just bought a 16 TB disk through eBay for $125 plus shipping.
Re: client-side scanning (Score:1)
Clearly the way to go is to have a trained machine learning model local to the app vet each image prior to acceptance. Benefits:
* No dodgy images on your device
* Probably stable use of space on your device
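A rough shape of that idea; the model, input format, and threshold are all hypothetical, and (as the reply below points out) no reliably accurate model of this kind is known to exist:

import numpy as np

def vet_image(pixels: np.ndarray, model, threshold: float = 0.5) -> bool:
    # `model` stands in for a hypothetical on-device classifier that
    # returns a probability that the image is abusive material.
    score = float(np.asarray(model.predict(pixels[np.newaxis, ...])).ravel()[0])
    return score < threshold  # True => the app accepts the image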
Re: (Score:2)
Clearly the way to go is to have a trained machine learning model local to the app vet each image prior to acceptance. Benefits: * No dodgy images on your device * Probably stable use of space on your device
The problem with this solution is that we don't know how to reliably distinguish images of child abuse from other images. It is similar to recognizing pornography and harder than recognizing spam. Create a model that will reliably distinguish spam from non-spam, and we can talk about distinguishing images of child abuse from fine art.
Re: client-side scanning (Score:1)
No need. Here's a post from over four years ago claiming 99.9% spam detection (presumably the model is even better now):
* https://workspace.google.com/b... [google.com]
Am I misunderstanding? If so, please enlighten me...
Re: (Score:2)
No need. Here's a post from over four years ago claiming 99.9% spam detection (presumably the model is even better now): * https://workspace.google.com/b... [google.com]
Am I misunderstanding? If so, please enlighten me...
I don't think you are misunderstanding, but their statistics don't correspond to reality. I have a gmail account which gets lots of spam messages each day, and only about 75% of them are automatically routed to the spam folder. Perhaps the spammers have gotten better at avoiding Google's filters in the last four years.
A filter for child abuse images that failed to catch 25% of the images would be inadequate, in my opinion.
Re: client-side scanning (Score:1)
Really? That's not my experience. I've been using it since 2005 and could probably count all spam messages on two hands
Re: (Score:2)
Really? That's not my experience. I've been using it since 2005 and could probably count all spam messages on two hands
Perhaps I'm doing something wrong, or perhaps I am being targeted by an especially gifted class of spammers. I am not aware of any gmail settings which control the strength of spam detection--am I missing something? I access gmail through IMAP, so I see spam in a folder.
There are a few people with my name who use johnsauter@gmail.com as a "throwaway" email address: they write it down when they don't want to give their real e-mail address. Perhaps that is what attracted the spammers to me.
Re: client-side scanning (Score:1)
Sorry to hear that. I can only vaguely remember <many> variations of:
* Britney Spears Naked
* Enlaaaaaaarge ur p3..N_!_5
Etc.
Re: client-side scanning (Score:1)
Mmm. No preview from mobile
Re: (Score:2)
That brings absolutely no useful value. A simple hash would give the same result and be much smaller. Encryption only has value if there is decryption; without that, all you need is a hash. Why in the world would you want to send millions of regulator images versus one user image? That's crazy!
"It is possible to do a series of Fuzzy Hashes to determine near matches but that begs the question of exactly what is near."
"I don't think
Re: (Score:2)
" .... The images can be encrypted with the regulator's public key."
That brings absolutely no useful value. A simple hash would give the same result and be much smaller. Encryption only has value if there is decryption; without that, all you need is a hash. Why in the world would you want to send millions of regulator images versus one user image? That's crazy!
I suggested encryption rather than hashing because I want to avoid hash collisions. I agree that a hash of the image is good enough, provided the hash is long enough to provide a very low probability that a legal image will have the same hash as an illegal image.
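For scale, the birthday bound puts the chance of any accidental collision among N random n-bit hashes at roughly N^2 / 2^(n+1), which for a 256-bit hash is negligible:

# Birthday-bound estimate of an accidental collision among N hashes.
def collision_probability(n_images: int, hash_bits: int) -> float:
    return n_images ** 2 / 2 ** (hash_bits + 1)

print(collision_probability(10 ** 9, 256))  # ~4.3e-60 for a billion images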
"It is possible to do a series of Fuzzy Hashes to determine near matches but that begs the question of exactly what is near."
"I don't think it is possible to create a fuzzy hash for illegal images that will not either miss some illegal images or incorrectly identify legal images as illegal."
You apparently don't understand that you are arguing by repeating the same statement rephrased.
I was disagreeing with the premise that such fuzzy hashing is possible, without addressing the question of how to determine if an image is "near enough" to an illegal image to also be regarded as illegal.
Re: (Score:2)
Would that mean transmitting every known child porn image to the suspect's phone?
Re: (Score:2)
Would that mean transmitting every known child porn image to the suspect's phone?
Yes, though as noted in a previous response they could be encrypted.
Re: (Score:2)
That might take quite a while, and necessitate expanding the phone's storage a bit. If they encrypt, changing even a single pixel in the image to be sent would make the comparison fail, or they would have to use encryption so weak that the system might itself legally constitute distribution of child porn.
If they're serious about stopping the problem, they're just going to have to put their coffee or tea down and go check on the wellbeing of children.
Of course, they could also try following up on the evidence they already have from the whole Epstein debacle.
Re: (Score:3)
That might take quite a while, and necessitate expanding the phone's storage a bit. If they encrypt, changing even a single pixel in the image to be sent would make the comparison fail, or they would have to use encryption so weak that the system might itself legally constitute distribution of child porn.
If they're serious about stopping the problem, they're just going to have to put their coffee or tea down and go check on the wellbeing of children.
Of course, they could also try following up on the evidence they already have from the whole Epstein debacle.
I suspect combating child porn is just an excuse--what they really want is to be able to monitor all communications. If that is true, a proposal like this one, that does not allow them to snoop on everyone, will be rejected, though they might have to think a bit to come up with a reason.
Re: (Score:2)
... the platform turns on a switch in the app which causes every image to be compared to images of child sexual abuse provided by the regulator.
How many gigabytes of child sexual abuse images have to be included in the app download to make this work?
Re: (Score:2)
... the platform turns on a switch in the app which causes every image to be compared to images of child sexual abuse provided by the regulator.
How many gigabytes of child sexual abuse images have to be included in the app download to make this work?
The download would happen only when the switch was turned on, and of course it would have to be kept up to date as the database of illegal images changes. I don't know how large the database is, but storage is cheap: it should be possible to store it on a cell phone. Perhaps the regulator could subsidize cell phones and forbid those without enough storage.
Re: (Score:2)
Depending on how the law is worded, this might actually be sufficient. ... If there is a match, the app refuses to send the image.
You can bet the government will want to be informed about it.
Re: (Score:2)
Depending on how the law is worded, this might actually be sufficient. ... If there is a match, the app refuses to send the image.
You can bet the government will want to be informed about it.
I am sure you are right. However, they will have to come up with a reason for why refusing to transmit the image doesn't solve the problem.
Proof governments don't care about security! (Score:4, Interesting)
Yes, illegal stuff happens, and the internet makes it easier in a lot of cases, but remove the internet from the equation. Laws exist, so, to protect everyone else, should we monitor everyone, at all times, under the government's strict control and surveillance? Basically, if you want to go for a walk, you should first register the route with the government, just to make sure they approve.
That's the level of violation we're talking about; frankly, it's getting to the point where they'll want to monitor your bedroom to make sure your sheets consent to being slept in.
Re: Proof governments don't care about security! (Score:1)
I suspect they've got serious invasion-of-privacy envy:
"WE'RE the Government! If anyone should be tracking the thoughts of every citizen, it should be us goddamit! Not some nerd who lives under a red bridge! Find a way to make it happen or you'll be re-provisioned as Secretary for The Arts. Pro-Tip: mention child abuse, noone can object, even if you are using it to justify setting their grandmother on fire. Get to work biatch!"
Re: (Score:2)
Comparing problems (Score:5, Insightful)
Re: (Score:3)
This is really the key.
Child abuse has been around forever and is not new with technology. Let's not forget the actual crime is the child being abused. If it's done in secret and only 10 perverts gathered in a basement see it on a projector, versus 1,000 via WhatsApp... the actual abuse of the child has not changed.
While possession of child porn is a crime, we should always remember that the metric for success of a policy should be reducing the number of children being abused.
I guess one really valid possibility is that tracking images might lead to the actual abusers, assuming they download/share as well. Although my hunch would be they probably don't use WhatsApp or Signal, as they're not really sharing platforms like, say, torrents.
Re: (Score:2)
This is really the key.
Child abuse has been around forever and is not new with technology. Let's not forget the actual crime is the child being abused. If it's done in secret and only 10 perverts gathered in a basement see it on a projector, versus 1,000 via WhatsApp... the actual abuse of the child has not changed.
While possession of child porn is a crime, we should always remember that the metric for success of a policy should be reducing the number of children being abused.
I guess one really valid possibility is that tracking images might lead to the actual abusers, assuming they download/share as well. Although my hunch would be they probably don't use WhatsApp or Signal, as they're not really sharing platforms like, say, torrents.
Abstract arguments can be made about increasing demand or such things which may or may not be valid (I have no idea), but I've not seen a whole lot of actual 'save the children' action from most governments.
The idea of maybe sending a message, or a hash of a message, to a scanner before it is E2E encrypted might be viable technically. It would basically break whatever trust is in the system, because now you're having to trust that whatever government monitoring system is there won't change its criteria...
If they had the technology they would make imagining child porn illegal. They'd make fantasising about committing a crime illegal. Thus annihilating much of literature and cinematography. But it's for the children, so meh.
Re: (Score:2)
How much abuse would this stop? Probably none at all. This is about _pictures_ of child abuse being _sent_. It is not concerned with the creation of those pictures. And children that have their abuse not being documented (probably the vast majority) are apparently not a concern at all. The whole thing is a big, fat lie.
Re: (Score:2)
How much abuse would this stop? Probably none at all. This is about _pictures_ of child abuse being _sent_. It is not concerned with the creation of those pictures. And children that have their abuse not being documented (probably the vast majority) are apparently not a concern at all. The whole thing is a big, fat lie.
The theory is that if there is a market for pictures of child abuse, there will be an incentive to create more, and that creation involves abusing children.
Re: Comparing problems (Score:1)
Redundant - it's from the government.
Corporate arguments are blatantly stupid (Score:1)
They are concerned that the bill could undermine end-to-end encryption - which means a message can be read only on the sender's and the recipient's apps and nowhere else
Why would any company oppose the above? And if they oppose it, what's their alternative? It's like they're literally asking for permission to steal messages. And the way they word their argument is like they're doing the world a favor.
Re: (Score:2)
Really - not the problem and a bad solution (Score:3)
Think of the children - but REALLY THINK! (Score:3)
The best thing we can do for our kids - beyond ensuring the survival of our species and our civilization - is to say no to institutionalized, legalized, normalized privacy violation. So it's absurdly ironic that the Brits are using a "think of the children" argument here in an attempt to justify their authoritarian-bordering-on-dictatorial spying.
Any government that insists on the right to routinely examine and read its citizens' private communications is an authoritarian regime in the making. It would be best to nip this shit in the bud right now - that's the thing to do if we REALLY care about "the children".
Re: Think of the children - but REALLY THINK! (Score:1)
'Think of the children' is really 'We are coercing you to do what we say by leveraging your humanity against you' - the fact that children are involved is incidental. If most people agreed that custard pies should be mandatory in every film scene (ProTip: The way cigarettes are), that would be the focus. It's about:
* Coverage (What prop
Here we go again (Score:2)
Worthless law (Score:2)
It is not that hard to have a separate app that encrypts a message, then copy/paste the result into an email or other messaging app.
In fact that concept pre-dates end-to-end encryption. If an app does not provide end-to-end encryption, you can easily provide your own encryption by hand.
Trying to stop end-to-end encryption will just result in casual/honest people having bad encryption while serious/dishonest people find other ways to get real encryption.
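For what it's worth, the encrypt-then-paste approach takes a handful of lines today, e.g. with the Python cryptography package (sharing the key out of band is the hard part and is assumed here):

from cryptography.fernet import Fernet

key = Fernet.generate_key()  # both parties must share this key out of band
f = Fernet(key)

token = f.encrypt(b"meet at noon")  # paste this ciphertext into any messenger
assert f.decrypt(token) == b"meet at noon"  # the recipient recovers the message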
Online safety is pretty bad now (Score:1)
The Apps could strike back ... (Score:2)
After all, I'm sure that Signal, Telegram, Facebook etc. would love to sell "high security" versions of their software to governments, and it's clear that govern