How Facial Recognition Tech Is Being Used In London By Shops - and Police (bbc.co.uk)
"Within less than a minute, I'm approached by a store worker who comes up to me and says, 'You're a thief, you need to leave the store'."
That's a quote from the BBC by a wrongly accused customer who was flagged by a facial-recognition system called Facewatch. "She says after her bag was searched she was led out of the shop, and told she was banned from all stores using the technology."
Facewatch later wrote to her and acknowledged it had made an error — but declined to comment on the incident in the BBC's report: [Facewatch] did say its technology helped to prevent crime and protect frontline workers. Home Bargains, too, declined to comment. It's not just retailers who are turning to the technology... [I]n east London, we joined the police as they positioned a modified white van on the high street. Cameras attached to its roof captured thousands of images of people's faces. If they matched people on a police watchlist, officers would speak to them and potentially arrest them...
On the day we were filming, the Metropolitan Police said they made six arrests with the assistance of the tech... The BBC spoke to several people approached by the police who confirmed that they had been correctly identified by the system — 192 arrests have been made so far this year as a result of it.
Lindsey Chiswick, director of intelligence for the Met, told the BBC that "It takes less than a second for the technology to create a biometric image of a person's face, assess it against the bespoke watchlist and automatically delete it when there is no match."
"That is the correct and acceptable way to do it," writes long-time Slashdot reader Baron_Yam, "without infringing unnecessarily on the freedoms of the average citizen. Just tell me they have appropriate rules, effective oversight, and a penalty system with teeth to catch and punish the inevitable violators."
But one critic of the tech complains to the BBC that everyone scanned automatically joins "a digital police line-up," while the article adds that others "liken the process to a supermarket checkout — where your face becomes a bar code." And "The error count is much higher once someone is actually flagged. One in 40 alerts so far this year has been a false positive..."
Thanks to Slashdot reader Bruce66423 for sharing the article.
Did Sara sue for slander? (Score:5, Insightful)
Facewatch admitted their mistake. Sara suffered as a result (cried, etc). She is due compensation.
Mr Thompson might also want to sue the police. Did they destroy the fingerprints and the copy of his passport that they had taken?
Re: (Score:1)
People should boycott stores that use that garbage
Re: (Score:2)
People should boycott stores that use that garbage
It's Home Bargains, people with any self respect already avoid it.
Re: (Score:1)
Why would the police destroy a passport? They don't own it. If they seize one (with a court order, or brought in as lost property, or from a search of a 3rd party) they return it to the Home Office - their supervising ministry, whose property it always has been and alwa
Re: (Score:2)
Re: (Score:2)
There's probably a claim under the Equality Act too, given these things rarely work very well for dark skin.
Re: Did Sara sue for slander ? (Score:2)
Re: (Score:2)
Facewatch admitted their mistake. Sara suffered as a result (cried, etc). She is due compensation.
Mr Thompson might also want to sue the police. Did they destroy the fingerprints and the copy of his passport that they had taken?
In the UK there is hardly any need... The conclusion is so foregone they'll look to compensate without a lawsuit.
Over here, if your flight arrives more than 2 hours late you're entitled to compensation (4 hours if it's a long haul flight).
GDPR means they would have had to destroy all Personally Identifiable Information, they'd be risking huge fines if they didn't.
Re: Unlawful search and seizure? (Score:2)
I will vehemently oppose this if it shifts from crime deterrence to advertising - that is unacceptable. But for safety, it is a tricky situation. As someone who fits into the law-abiding citizens group, I have to accept that there is way too much crime, and that my being recorded is a consequence of much larger problems - maybe to be solved at the next election. Or the next, or the next, or we just learn to live in a new reality because folks cannot see the big picture and keep making and/or supporting wrong decisions.
Re: (Score:3)
but would the Fourth Amendment to the United States Constitution protect Americans from this sort of invasion of privacy?
No. The 4th Amendment only applies to the government. Thus, the need for warrants by law enforcement. Also, there is no "search" of your person when entering a business which uses facial recognition. Everyone can see you.
I say that because having a machine process my identity against my authorization is a form of identity thef
No, it's not. Nothing is being stolen. As stated above, eve
Re: (Score:2)
...outright banning it wouldn't be in anyone's interest.
I vehemently disagree. I think banning it would be in the interests of everyone who values their privacy.
In the first place, I take issue with The Met's assertion that "It takes less than a second for the technology to create a biometric image of a person's face, assess it against the bespoke watchlist and automatically delete it when there is no match." What assur... no, what proof do we have that the images are permanently deleted? Even if they are being deleted, what assurances are there that they won't
Re: (Score:2)
I think banning it would be in the interests of everyone who values their privacy.
How can it be about privacy when you're out in the open? You are in a public business where everyone can see you.
Back in the day, casinos had literal books of pictures of people. They would compare those pictures to people who came into the casino to see if they had been banned, previously caused an issue, "won" too much, and so on. Those pictures were permanent. They never got rid of the pictures.
At the very least stores
Re:Unlawful search and seizure? (Score:4, Insightful)
How can it be about privacy when you're out in the open? You are in a public business where everyone can see you.
The privacy invasion comes subsequently, when your stored facial data allows you to be tracked everywhere there are cameras. (Keeping in mind that AFAIC all that data will be stored and used to track people). In England, there are a LOT of cameras, and the number is growing at an alarming rate.
Back in the day, casinos had literal books of pictures of people. They would compare those pictures to people who came into the casino to see if they had been banned, previously caused an issue, "won" too much, and so on. Those pictures were permanent. They never got rid of the pictures.
Did they keep photographs of every patron, or just the problematic ones? I rather suspect it was the latter. Also, "back in the day" it was pretty much impossible to make an unlimited number of copies of a photo in seconds, and it was certainly impossible to spread them almost instantaneously around the entire globe.
At the very least stores could be directed to fully delete all pictures after a certain time frame (7 - 30 days) and not store those images with any permanency.
Your trust that those images will ever be deleted strikes me as almost willfully naive.
Target is being sued for that very reason [bloomberglaw.com]. They never informed customers they were storing facial recognition information, or what they were doing with it.
Big companies seldom care about such lawsuits - they're a cost of business. And I'll say it again - personal data is the new currency, and it's precious. Will Target stop collecting and keeping the data? My guess is they won't. Will they inform customers in a meaningful way? Maybe. But what are customers going to do if they object to the images being collected, when soon every retail outlet will be doing the same thing? The law shouldn't be prescribing how to handle that data - it should be forbidding its collection in the first place.
The Illinois Biometric Information Privacy Act requires businesses to provide written notice when biometric data is collected, along with information on how it’s being used, how long it’s being retained, and deletion policies.
Again, informing customers when and how their privacy is being invaded is meaningless when all stores are doing it and customers have to choose between shopping and protecting their privacy. In my view, the disclosure you're heralding as a remedy is in fact equivalent to a peeping Tom who avoids arrest by telling people he's spying on them through their windows. This is the point where you raise the issue of who is on whose property, and this is yet another point on which you and I will inevitably disagree.
So there you go. A limitation, but not prohibition, on using facial recognition.
A limitation which is unenforced and probably unenforceable, on an activity which is becoming nearly impossible to avoid, is hollow and meaningless.
Re: (Score:3)
>"How can it be about privacy when you're out in the open? You are in a public business where everyone can see you."
Legally that is true.
But the reality is something else, entirely. Back in the old days, we COULD be in public and have SOME reasonable amount of expected privacy. You would only be observed by a small number of dissociated people who actually happen to look at you for a few seconds. You rarely left a paper trail anywhere, either, paying in cash.
Now that has all changed- cameras every
Re: (Score:2)
So when people use the framework of "public place", I ask them how they would feel if a group of police officers followed them around, every second, everywhere they go, the moment they step out of their houses, carrying connected laptops with information about everyone, recording everything you do, everyone you associate with, and storing it for who knows how long. Seems a bit creepy?
No, not to me.
Film me as much as you want. Heck, you can follow me around while filming me, and even approach me for a discussion. If I have the time (usually yes), we can talk about stuff.
This is my expectation when in public. I'm fine with everyone filming everyone else while in public. I'm also fine with storing that shit for eternity, at least there's a chance I will be remembered after kicking the bucket, so to speak.
The advantages heavily outweigh the disadvantages, in my opinion. Imagine: no jaywalki
Re: (Score:2)
Back in the old days, we COULD be in public and have SOME reasonable amount of expected privacy.
VHS security cameras have been common since the 1970s.
Re:Unlawful search and seizure? (Score:4, Interesting)
>"VHS security cameras have been common since the 1970s."
Right. But having occasional cameras and being recorded temporarily isn't the real issue. That has been around most of my life. What video recording was then, and what it has become (and will become) is a different matter. Compared to then, it is the number and spread of cameras (thousands of times more), and the fact they retain such resolution/clarity and can store everything digitally for such a long time, and can be connected to networks, and then to analyzing computers (which are also connected to many other aspects of our lives) and ultimately AI.
Yes, it sounds like the "slippery slope" argument, and I suppose it is. But there is a lot of truth to warnings about anything taken to extremes.
A good comparison is car safety. Everyone wants more safety, and we have made TREMENDOUS improvements in my lifetime, to the point that it is actually very unusual to have death or serious injury in bad accidents, instead of being very common. But that just keeps continuing to the extreme... each new addition now has a diminishing return and increased negative externalities, but we "mandate" them, anyway. Cars become ever more expensive, to the point where many people can't afford them anymore. They become so complicated, they get harder and harder (and more expensive) to repair. They annoy and nag, and invade our privacy, autonomy, and enjoyment, in the name of chasing ever-more little bits of safety.
Many don't realize a perfectly safe world is a much more dystopian world than they would imagine. Safety and freedom are always directly at odds. An extremely safe world is one without risk, adventure, freedom, autonomy, invention, enjoyment, aspiration, responsibility, value; one filled with people who cannot adapt, change, or grow.
Re: (Score:2)
How would you feel if you were stopped upon entering every business in your city, and required to show ID? ... You have papers, yes?
This is automating that process. At the very least, there should be a prominent sign on the front door saying "Facial Recognition is in use on these premises".
Re:Unlawful search and seizure? (Score:4, Funny)
How would you feel if you were stopped upon entering every business in your city, and required to show ID?
I'd feel like a Costco customer.
Re: (Score:2)
How would you feel if you were stopped upon entering every business in your city, and required to show ID?
I'd feel like a Costco customer.
Yep, people literally sign up for this treatment.
I always find it odd that Americans get hot and bothered when they read this kind of thing happening in the UK and fail to realise that thanks to far more lax laws in the US the same thing happens over there... It's just that errors don't get picked up and reported, definitely not nationally (or on the international publisher, the BBC). At the very least my bank is expressly forbidden from tracking what I purchase with my card, let alone sharing it with al
Re: (Score:2)
How would you feel if you were stopped upon entering every business in your city, and required to show ID?
I'd feel like a Costco customer.
Yep, people literally sign up for this treatment.
Actually we pay an annual fee for this treatment. Thank you Madame Costco may I have another. :-)
Just wait until you realise that casinos have a private network of shared data of people they monitor entering their premises. Likely run by a 3rd party... and it's not like the US has a law saying this data cannot be shared with a different 3rd party.
And we have private companies with people who have license plate reading cameras on their cars. All four directions so they can run through parking lots and pick up all the cars. Supposedly it started with private bail enforcement companies. When they are driving around looking for one person they collect data, that helps them collect data on some future person they will be looking for, and their friends and fam
Re: (Score:2)
Re: (Score:1)
Re: Unlawful search and seizure? (Score:2)
Treating an innocent as a criminal should be a crime similar to treating a black person as a criminal. All liabilities for racial and gender discrimination should be transposed onto (inaccurate) facial recognition.
Then let the monetary penalty for misuse handle the regulation of its use.
Re: (Score:3)
Unauthorized use of facial imagery may also be a violation of personal copyright.
Re: (Score:2)
If there's money to be made off an individual's image or likeness, then no I'm not:
https://rinckerlaw.com/name-im... [rinckerlaw.com]
Re: (Score:2)
You must live in a very, very dark cave. Learn about the right of publicity.
Re: Unlawful search and seizure? (Score:2)
Err, we have a different system of rights, checks & balances to the US, but they certainly exist. In this case, the European Convention on Human Rights (which is an instrument of the Council of Europe, of which we are founding members, and nothing to do with the European Union) is foremost amongst them, specifically articles 8 & 11 (rights to privacy and freedom of assembly). Then there is the living case law that makes up our unwritten constitution, into which this sort of thing falls as an active
Re: (Score:2)
Re: (Score:1)
The Birthday Problem kicks their ass (Score:5, Insightful)
It's basically impossible to build a system like this that works, and likely always will be.
The core problem is an interesting little math problem called the "Birthday Problem" or the "Birthday Paradox". In its original form, it's about the probability that some two people at a party have the same birthday. Suppose you're at a party where you and your buddy know no one (so neither of you knows anyone's birthdays), and your buddy says "I'll bet you $100 that at least two people at this party share a birthday". Should you take that bet? If there are more than 23 people in the room, you should not, because odds are >50% that at least two people share a birthday. This is counterintuitive because we think "366 possible birthdays, 30 people, that's a lot more birthdays than people, so odds of a shared birthday are low." But what you actually need to think about is how many pairs of people are at the party, because that's the number of possibilities for a match. With 30 people that's 30*28 = 870 pairs.
How does this relate to face recognition or other biometric systems? Biometric matching algorithms are threshold algorithms, not binary-output. That is, they compute how close face A is to face B, according to some complicated metric, and if the distance is below a threshold, we call them a match. But this fuzziness in matching means that the algorithm effectively partitions the space of all faces into a bunch of similarity pigeonholes (this is oversimplified, but a useful approximation), where the number of pigeonholes roughly corresponds to the false accept rate.
Likewise, the birthday question partitions people into 366 pigeonholes. The birthday "false accept rate" (FAR) is roughly 1 in 366 (not quite that because Feb 29 is less likely than other days, but you get the idea).
For a false accept rate (FAR) of 1 in n, you can estimate the number of entries in the system at which you'll have a >50% probability of a false accept for any given entry at about sqrt(n). This means that if you have an FAR of 1:50,000 (which is at the better end of what commercial systems do), you can only put just over 200 people in the database before false accepts happen with more-than-even odds. 200.
In this case, their actual database of faces is probably fairly small, but the space of faces that can be matched against those faces is very large. Tens of millions. Supposing they have 200 thieves in the database and 10M non-thieves, the probability that any random non-thief matches a thief is one minus the probability that the system doesn't get a false match on any of the 200, so 1-(1-p)^200, where p is the FAR. For p = 1/50,000, about 0.4% of non-thieves will match some thief. Since the base rate is 200 thieves out of 10M non-thieves (0.002%), this means that false matches will outnumber true matches by 200:1.
If this system is seeing only one false alert in 40 (far better than the 200:1 predicted above), it's actually doing really well! Likely because their existing thief database is very small, and the system will get worse and worse as more thieves are added.
Bottom line, this kind of thing just doesn't work. What could work, kinda sorta, is if the system showed the store employee the photo and personal info of the known thief, so they could check whether the person matched is actually the same one. This is how stuff like law enforcement fingerprint database queries work: every search returns a bunch of results and then the police first have a human expert look at the matches to throw out some of the false positives, and then they have some legwork to do to figure out if any of the remaining matches are the real deal.
Of course, I said "kinda sorta" because such a system would result in lots of legitimate customers being harassed every time they go to a store. That will be bad for business.
Big biometric matching systems without some sort of additional filter just don't work. Can't work.
Re: (Score:2)
With 30 people that's 30*28 = 870 pairs.
30*29 = 870, obviously.
Also, damn, I should have previewed. Apparently I failed to close a <i>.
Re: (Score:2)
Also, damn, I should have previewed. Apparently I failed to close a <i>.
Don't worry about it. Your comment was both informative and insightful, thereby justifying all that extra emphasis... ;-)
Re: (Score:2)
Big biometric matching systems without some sort of additional filter just don't work. Can't work.
The additional filter can be that the system provides the first name of the supposed thief.
System: detects "Pete" -- a thief!
Staff: Good morning sir, I'm Franck from customer support, how should I call you?
Customer: Dave
Staff: Nice to meet you Dave, can I quickly check your loyalty card or credit card?
Customer: shows something issued to their name, e.g. David X (which is not Pete)
Staff: Thank you Dave, enjoy your shopping.
Re: (Score:2)
What would they do? Show Dave's loyalty card or credit card? Pull it out of their arseholes?
Re: (Score:2)
If they later come back with a loyalty card and a credit card on the name of Dave, they're going to use that to pay, not steal. That's all you need to know.
You won't be able to tell if they really are Dave, if Dave let them use it, or if they murdered Dave for the loyalty card, but that's not your part of the problem.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
That's valid right now.
Technology is advancing, though, and in a few years it will become more precise, to the point where false positives will fall dramatically.
And, yes, filters are being used and polished continuously.
Re: (Score:2)
That's valid right now. Technology is advancing, though, and in a few years it will become more precise, to the point where false positives will fall dramatically.
Very unlikely. The level of precision that's needed for large populations is insane. If you want to be able to get a false match rate of, say, 0.1% in a database of 100M people, you'd need a FAR of about 1:500,000,000,000.
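As a rough check on that figure: the per-comparison FAR required for a target false-match probability p against a database of N entries solves p = 1 - (1 - FAR)^N, which for small values is approximately p/N. A sketch using the 0.1% and 100M figures from the comment above:

```python
def required_far(target_prob: float, db_size: int) -> float:
    """Per-comparison FAR such that one probe against db_size entries
    false-matches with probability target_prob.
    Solves target_prob = 1 - (1 - far)**db_size for far."""
    return 1.0 - (1.0 - target_prob) ** (1.0 / db_size)

far = required_far(0.001, 100_000_000)
print(f"need a FAR of about 1 in {1 / far:.1e}")  # on the order of 1 in 10^11
```

This lands at roughly 1 in 10^11, the same astronomically-small ballpark as the hundreds-of-billions figure quoted above: far beyond what commercial face matchers achieve.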
Re: (Score:2)
I think that's what they also said about fingerprints, way back when they were a novelty.
And you don't need a database of 100M people, you need a list of people who are most likely to be in that specific place, at that specific time. I'd venture to say it's going to be less than 10K people, and that's a generous number, more valid for a large city such as London, rather than, say, Dorchester in Dorset.
John the local thief would be in that list, but Jack the convicted thief who is in prison 5K miles away wou
Re: (Score:2)
I think that's what they also said about fingerprints, way back when they were a novelty.
What makes you think that?
When fingerprints were a novelty, automated matching systems didn't exist (computers didn't exist), so matching against databases of significant size was infeasibly labor-intensive and the implications of the Birthday Paradox didn't come up at all.
And you don't need a database of 100M people, you need a list of people who are most likely to be in that specific place, at that specific time. I'd venture to say it's going to be less than 10K people, and that's a generous number, more valid for a large city such as London, rather than, say, Dorchester in Dorset.
You can filter by likely presence, sure. But you have to get the possible-match database size down, and it has to be much smaller than you might think.
A few years ago, I worked on building a fingerprint-based authentication system for
Re: (Score:2)
There's a certain distance threshold which, if passed, would make thieving from a certain remote area not worthwhile.
A general store would therefore be able to have its distance threshold reduced to walking distance, for example, whereas a jewelry store would have a different filter set, e.g. filter out pickpockets, etc.
It's all about HOW the system is implemented.
Re: (Score:2)
Sure, it's possible that there exists some store for which this is useful.
Surveillance works (Score:1)
London's homicide rate is the lowest it has ever been in history. It may not seem like that due to social media, but it's true. Ironically, the fact that crime is filmed and propagated on social media makes it seem like it's happening a lot more, when it's simply a result of more cameras and people wanting to peddle some narrative.
Re: (Score:2)
Buddy, the homicide rates for nearly all wealthy countries have been going down, down, down for decades, centuries. London is not special in this regard. Even the US has a much lower homicide rate than it used to. It doesn't have anything to do with British people's incredible cowardice leading them to give up all their privacy and rights. That's just a weird British thing.
Re: (Score:3)
AP’S ASSESSMENT: False. Hospital industry officials and public health experts confirm the federal government provides hospitals with enhanced payments for treating COVID-19 patients, but the payments are only currently applicable to those on Medicare. The enhanced payments, which are slated to end in May, also aren’t contingent on a patient’s death but on the treatment or services provided to the patient, they said.
Re: (Score:2)
None of that was ever true:
AP’S ASSESSMENT: False. Hospital industry officials and public health experts confirm the federal government provides hospitals with enhanced payments for treating COVID-19 patients, but the payments are only currently applicable to those on Medicare. The enhanced payments, which are slated to end in May, also aren’t contingent on a patient’s death but on the treatment or services provided to the patient, they said.
Sorry, your quote (without citation) says otherwise. At best, it restricts the practices I described to government-paid programs like Medicare. Your quote does NOT say that false reporting did not happen.
Re: (Score:2)
OK, let's assume that were true. How about the 2019 statistics? Or 2023/2024 when there's no covid.
Re: (Score:2)
OK, let's assume that were true. How about the 2019 statistics? Or 2023/2024 when there's no covid.
Then the financial incentive would be missing wouldn't it? Companies do what gov't policy rewards.
Re: (Score:2)
Re: (Score:2)
read again, they were paid for treating live patients with covid, not for dead ones. They got zero money for dead covid patients, and your whole shtick was that they got money for reporting covid deaths.
No. The claim was that people were reported as covid patients to collect monies even if they were not manifesting any symptoms. That a positive test was a revenue source.
The AP itself misrepresents the claims, saying that it was that hospitals let people die to collect a death benefit. Never heard that. What I recall was that a positive test would result in covid listed among the causes of death regardless of whether it contributed. That there was a financial incentive to do so. Note that such financi
Re: (Score:2)
Re: (Score:2)
so let's quote your post verbatim: "Of course, hospitals were given a cash reward for deaths labeled as COVID". You are moving the goalposts. And no, the dead are not being treated in the sense that generates federal funds.
I'm not saying my original statement is entirely correct. I am saying your response and your citation are also erroneous.
Re: (Score:2)
Hospitals do not receive extra funds when patients die from COVID-19. They are not over-reporting COVID-19 cases. And, they are not making money on treating COVID-19.
The truth is, hospitals and health systems are in their worst financial shape in decades due to COVID-19. In some cases, the situation is truly dire. An AHA report estimates total losses for our nation's hospitals and health systems of at least $323 billion in 2020. There is no windfall here.
Further, hospitals and health systems adhere to strict coding guidelines, and use of the COVID-19 code for Medicare claims is reserved for confirmed cases. Coding inappropriately can result in criminal penalties and exclusion from the Medicare program altogether.
According to the Centers for Disease Control and Prevention, there have been as many as 299,028 more deaths this year compared to a typical year (as of Oct. 15, 2020). There have been 216,025 deaths due to COVID-19. There is no reasonable explanation for the increased deaths other than COVID-19.
Re: (Score:2)
Hospitals do not receive extra funds when patients die from COVID-19.
The controversy was about "cases" not "deaths". About ill-conceived policy that created unexpected incentives. Unintended consequences are hardly something new; it's one of the reasons that policy making is hard.
They are not over-reporting COVID-19 cases.
Simplistic. There are many variables with respect to over/under reporting, and a poorly conceived policy is but one variable.
And, they are not making money on treating COVID-19.
That is simplistic. I recall some sort of controversy where some hospitals were making money and some losing, one of the variables being the ratio of privately insured to medic
Re: (Score:2)
Re: (Score:2)
"You misrepresent the original claim" - I quoted AHA verbatim.
LOL. OK, you quoted a misrepresentation.
Re: (Score:2)
Re: (Score:2)
says the guy that thinks the entire pharma industry, all the health care workers in the world and all the world's governments are all in on a conspiracy....
LOL. And now you are, personally, grossly misrepresenting, downright lying actually.
Re: (Score:2)
Re: (Score:2)
So you are the single person in the world that simultaneous believe that covid was a hoax and real at the same time?
Please keep digging deeper proving you are a liar. Your claims have nothing to do with what I actually said.
Or what on earth is your goal here with spreading the misinformation about hospitals getting paid for counting up the death toll for covid.
"Deaths" may be an overstatement; however, it remains true that a positive diagnosis would financially reward a hospital. I.e., there was a financial incentive to test non-symptomatic patients so that they could be recorded as covid patients also.
"It is true, however, that the government will pay more to hospitals for COVID-19 cases in two senses: By paying an additional 20% on top of traditional Medicare
Re: (Score:2)
"Right now Medicare has determined that if you have a COVID-19 admission to the hospital, you’ll get paid $13,000. If that COVID-19 patient goes on a ventilator, you get $39,000, three times as much. Nobody can tell me after 35 years in the world of medicine that sometimes those kinds of things impact on what we do."
https://www.factcheck.org/2020... [factcheck.org]
"Even as hospitals and governors raise the alarm about a shortage of ventil
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
This is still not what you claimed, and my entire thread of comments is only about your claim that they were inflating the death count in order to get money.
Apparently you missed the several places where I agreed this seems to be an exaggeration, and that the payments were tied to treatments. Either way, the problem is a focus on positive tests rather than symptomatic behavior. More specifically, policy errors have shown over and over that you get what you reward, as in the possible overuse of ventilators mentioned in the other response.
And you have the audacity to claim that I'm the one that is lying...
Well, you are. I never made covid hoax comments. Or claimed some massive conspiracy between government, pharma and health care.
Re: (Score:2)
Zero people are claiming that hospitals are not getting money in order to cure people. What you claimed was specifically that they got money for inflating the death count. Stop moving the goalposts.
Again, you mischaracterize. The point is that you get what you reward: reward hospitals for covid patients and you get positive tests treated as evidence of covid patients, and eligibility for those additional payments, regardless of any symptomatic behavior. The only person thinking this is restricted to deaths is you.
Again, the citations presented show legit concerns regarding over treatment to collect additional gov't funds.
Re: (Score:2)
London's homicide rate is the lowest it has ever been in history.
That could be due to better medical care. Look at attacks rather than fatalities to adjust for that variable.
1:40 seems really good (Score:1)
The summary cites the 1:40 false positive rate, suggesting that it's a reason to stop. While not negligible, 1:40 actually seems really good.
I would expect police officers approaching people looking for suspect X to be wrong most of the time; a 2.5% error rate is a huge improvement.
Race factors are always a concern in systems like this, but it's probably being monitored and addressed. It's also replacing a human system that is known to have substantial racial biases.
Re: (Score:3)
Re: (Score:2)
how do you catch and deal with false positives?
With diplomacy and staff training. First, the system should provide a name for the potentially identified thief, say "Pete" (my example above). A staff member goes over, calls them by that name, and sees what happens. If they say "sorry, I'm not Pete, I'm Dave" and show any proof of not being Pete, you just say "oh, my bad, have a good day" and nothing happened: they don't even know why you called them Pete, and were never shamed in public or falsely accused.
What should not happen is to treat the face recognition
Re: (Score:3)
Thieves are pretty brassy as a rule; your average clerk won't be able to tell a thief from a false positive. And you might be exposing them to potential violence by asking them to confront people.
What you will certainly get is good customers turning into former customers when you treat them like criminals.
Re: (Score:2)
And you might be exposing them to potential violence asking them to confront people.
Darn, mate, the place you live in must be very bleak.
Around here, I can hardly imagine someone having the potential to become violent if being politely confronted - like asking for name or ID. And I live in a third world country.
Re: (Score:2)
Normal people, no. But a thief trying not to get caught? Most likely they're going to run, but they might assault the clerk in their path. And you have to ask if the risk is worth it for minimum wage.
Most stores already have a 'do not engage' policy: give the video to the cops and let them deal with it after the fact.
Re: (Score:2)
Ah, I now understand what you meant.
Still, thieves who rely on stealth would still be less likely to be aggressive.
Re: (Score:2)
The purpose of the system is to detect thieves, and it is apparently correct 39 times out of 40. One solution is to monitor suspects silently using the cameras. But if the shop manager does not want to accept the 39 thieves in the shop and decides to send them away, the staff will have to confront them. In TFA, the situation you describe is what happened: someone was rudely confronted, called a thief, and asked to leave. Unfortunately, this person was the 1-in-40 false positive.
I propose an improvement of the workflow
"It takes less than a second... (Score:2)
... for the technology to create a biometric image of a person's face, assess it against the bespoke watchlist and automatically delete it when there is no match."
Which is why they want everybody's face, so there's always a match. That match, including time and place, is never deleted, same as any time your details are taken for whatever spurious reason. So there's a few false positives, who fucking cares? They don't.
I love this! (Score:4, Funny)
Just like the SciFi stuff I read as a kid.
Can we have more please? Can we have drones with miniguns and this recognition tech zooming around instantly eliminating people? How amazing can it get!!!!!
Right! (Score:2)
Bullshit. Once scanned, it is in their database forever, just waiting for the "thief" bit to be set. She was scanned, it was accidentally set, and will likely happen again to her. Because it is in their software, and the police have it as well.
Re: (Score:2)
Oh, look, another one of those "I don't believe it!" people.
Going to keep posting this... (Score:3)
I've posted this before [slashdot.org], and I'm posting it again, and I'll continue to post it until everyone here understands the danger of facial recognition software.
Let's pretend for a moment that facial recognition software is 99.9% accurate. It's not, but for sake of argument, let's go with it.
Now, pretend you walk into a store, the software flags you as a criminal who shoplifted from the store earlier, and you get arrested. If that store received 5,000 visitors that day, what are the odds you were a false positive, i.e. you were flagged as a criminal but weren't actually the one who committed the crime?
Answer: 80%.
Why? The average Joe wrongly assumes a 99.9% accuracy rate means that 99.9% of the time the system will identify a criminal correctly. It does not mean this. It means the system will correctly classify -anybody- 99.9% of the time; flagging a face as "not the criminal" counts toward that accuracy score just as much as "is the criminal" does. So, out of 5,000 people, on average 0.1%, or about five people, will be flagged by the system as "criminal". This means that if you are flagged as the criminal, and you are one of 5,000 visitors, there's roughly a 4-in-5 chance, or 80%, that you are not the criminal.
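The base-rate arithmetic above can be sketched directly. This is a minimal illustration using the comment's own assumptions (5,000 visitors, exactly one real criminal, a 99.9% accuracy figure treated as a symmetric 0.1% error rate, and the criminal always flagged):

```python
# Base-rate fallacy sketch: probability a flagged visitor is innocent.
# Assumptions (from the comment above): 5,000 visitors, exactly 1 real
# criminal, a 0.1% rate of falsely flagging innocent people, and the
# actual criminal always flagged.
visitors = 5000
criminals = 1
error_rate = 0.001  # a "99.9% accurate" system

false_positives = (visitors - criminals) * error_rate  # ~5 innocents flagged
true_positives = criminals * 1.0                       # the actual criminal

p_innocent_given_flagged = false_positives / (false_positives + true_positives)
print(f"{p_innocent_given_flagged:.0%}")  # prints 83%
```

Done exactly, the chance a flagged visitor is innocent comes out near 83%, in line with the roughly 80% figure the comment arrives at; the point either way is that almost every flag in a large crowd is a false alarm.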
But here's the kicker: Facial recognition software is being used to convict people. This is the real crime. This technology is an emperor wearing no clothes, but everyone, from prosecutors to juries, pretends that it is authoritative. This technology is not 99.9% accurate; at best, it's about 99% accurate, as long as the individual is Caucasian and male. At worst, it's about 65% accurate. [aclu-mn.org] We are endangering the stability of society by taking something this inaccurate and using it to convict people of crimes.
Re: (Score:2)
But here's the kicker: Facial recognition software is being used to convict people.
No, it is not.
It is SOMETIMES used in conjunction with many, many other pieces of evidence, but it is not the main prosecution piece, at all.
Stop spewing nonsense.
i smell a class action (Score:2)
Bwahahahahaha (Score:2)
"It takes less than a second for the technology to create a biometric image of a person's face, assess it against the bespoke watchlist and automatically delete it when there is no match"
Yeah, I believe this, never-ever has a system with a capability to collect data been abused. No image is deleted, everything is used to train whatever models they developed, and so on.
And, this being the brexit shithole, you know that law enforcement has access to this, too.
Hey, Winston, do your physical jerks properly, ffs!
Big Brother is growing...... (Score:3)
Goodbye London (Score:2)
I can't imagine why anyone would want to live in or visit such a police state.
My god, it's appalling!
I'm sure that there are plenty of 'law and order' safety babies, but ten million of them?
tech (Score:1)