Detroit Police Chief: Facial Recognition Software Misidentifies 96% of the Time (vice.com) 80
Detroit police have used highly unreliable facial recognition technology almost exclusively against Black people so far in 2020, according to the Detroit Police Department's own statistics. From a report: The department's use of the technology gained national attention last week after the American Civil Liberties Union and New York Times brought to light the case of Robert Julian-Borchak Williams, a man who was wrongfully arrested because of the technology. In a public meeting Monday, Detroit Police Chief James Craig admitted that the technology, developed by a company called DataWorks Plus, almost never brings back a direct match and almost always misidentifies people. "If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time," Craig said. "That's if we relied totally on the software, which would be against our current policy ... If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify."
Maybe they should vote Democrat (Score:3, Insightful)
If it weren't for the huge number of white Republicans running everything in Detroit, I'm sure this could have been prevented.
Proportional to population. (Score:5, Insightful)
The population of Detroit is 80% black. {...} crime {...} mostly committed by blacks.
Yeah, and if Detroit had a high percentage of Martians in its population, a high percentage of crime would be committed by Martians.
And the police would be complaining that their surveillance cameras' AI, trained on human faces, also gives crappy results in a population mostly full of Martians.
What was your point, again?
Re: (Score:1)
Re: (Score:1)
Retraining a device is not such a tremendous hurdle after you have built the entire training pipeline. The reason dark people are harder to identify is *gasp* that they are dark. Darkness obscures the finer details in a face, the kind used to delineate its components and thus separate one person from another.
The reason black people were classified as gorillas was not that the classifier was racist; it was for the very simple reason that fewer delineating features were identified.
Re: (Score:2)
Just use a damn thermal camera. Their skin will glow just like anyone else's.
The only people that won't show up are cold-blooded reptile people. (And since those aren't real, there won't be a real problem.)
Re: (Score:1)
Not to mention that African features tend to not have much variance in them, and most black men keep their hair very short. It's a hard-to-swallow pill, but it's hard to visually tell black people apart in many cases unless they have a very distinctive habit, like how Will Smith has that perpetual smile on his face.
Re: (Score:2, Insightful)
Exactly. They're still recovering from their last Republican mayor in 1962. It takes generations to recover from electing a Republican.
Re: (Score:2)
Re: (Score:2)
Detroit (Score:1)
Detroit's population is 80% black so it shouldn't be a surprise. It's also no surprise that such sensationalism is the lede and nowhere in the article does it mention the demographics of the city.
Re: (Score:2)
Also, wouldn't you look at the photo of the person and the actual person and say "no, it's not them"? Or are police really so stupid as to believe what they see on TV and in movies is accurate, even when they keep getting false matches?
If so, I suggest they expose themselves to radioactive waste to get superpowers. Really, DON'T; that was sarcasm. You will die, and I don't want anyone to die or to take my statement as a serious suggestion. My favorite clip illustrating this is https://ww [youtube.com]
Re: (Score:3)
They do say in the headline that it's not their policy to rely entirely on the software (so yes, they'd be using real eyes).
The aim of the software is to find an interesting signal that you may want to look at more closely. From the given stats, one image in 20 that it provides you for a closer look is pretty interesting (especially if you count the sheer number it simply filters out as not likely of interest at all). That's the bit that's missing in the narrative; it makes it sound as though yo
Re: (Score:2)
one image in 20 that it provides you for a closer look is pretty interesting.
Indeed. Those are actually pretty good results.
So you run the match, get 25 results. 24 of them have no connection to the victim. The 25th is the victim's ex-boyfriend with a domestic violence restraining order.
Maybe the police can figure it out from there.
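To make that triage step concrete, here's a minimal sketch in Python; the function, names, and data are all hypothetical, not any vendor's actual API:

```python
# Hypothetical sketch of the triage step described above: cross-reference
# the software's candidate list against people already connected to the
# case. All names and structures are illustrative.

def triage(candidates, known_associates):
    """Return only the candidates who also appear in the case file."""
    return [c for c in candidates if c in known_associates]

# 25 raw matches from the software; one is the victim's ex.
candidates = ["stranger_%02d" % i for i in range(24)] + ["ex_boyfriend"]
known_associates = {"ex_boyfriend", "landlord", "coworker"}

print(triage(candidates, known_associates))  # ['ex_boyfriend']
```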
Re:Detroit (Score:4, Insightful)
You have 25 cases in front of you. The computer gives you a list of 25 likely suspects, one for each case. In only one of those cases is the computer right.
Re: (Score:2)
So... the suspect only has to commit 25 crimes and you'll get them.
They'll all be behind bars within a couple of weeks!
Re: (Score:2)
Re: (Score:2)
Not the headline, but the summary (that's me being pedantic); but also, in the summary it says:
a man who was wrongfully arrested because of the technology
That is not true, or at least very deceptive: he was arrested because a police officer wrongly identified him, after a computer system that is known for misidentifying people gave a suggestion. I would be very surprised if misidentification hasn't happened without facial recognition. The headline is designed to spark outrage over a very mundane thing.
Re: (Score:2)
Re: (Score:2)
Do you want to know the real fault? The bullshit used to sell the system, plus one word missing from the results screen: 'POSSIBLE', right before 'match'. Then people will visually check the claim, since it only claims a possible match. Not that idiots will read the screen all the time, but make it big enough and it might sink in: do not just obey what the computer outputs; stop and think about it first.
Re: (Score:2)
I bet if you called them out on this they would try to play the common knowledge card. I knew Detroit was a very black city but I didn't know it was 80% black.
Anecdotal evidence? (Score:2)
If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify.
Sir, if you have to actually do work to verify that the software is correct, I don't see a problem with your process. You SHOULD be manually checking the software's results before you start rounding up the usual suspects and hauling them off to jail, because YOU are going to be liable for the wrongful-arrest suits, not the computer. Yeah, the software seems to be less than marginally helpful, but that's between you and the vendor you purchased it from, and you should take that up with them. My guess is that
Re: (Score:2)
ANY software, I don't care what it is, that produces an inaccurate result 90%+ of the time, is simply worthless IMO. Would you accept a calculator that gives you the wrong answer 90%+ of the time, that you have to verify? Or an online encyclopedia? WebMD? etc, etc.
I sure as hell wouldn't.
Re: (Score:2)
ANY software, I don't care what it is, that produces an inaccurate result 90%+ of the time, is simply worthless IMO. Would you accept a calculator that gives you the wrong answer 90%+ of the time, that you have to verify? Or an online encyclopedia? WebMD? etc, etc.
I sure as hell wouldn't.
SOME software solutions are not 100% deterministic and may not always return the correct answers. A primary example is machine-learning processes like neural nets, where sometimes a 90% or lower correct rate is actually pretty good given the available training data. I've seen case studies where the operators were tickled pink to get a 70% detection rate, because their hand-coded solution was struggling to get a 30% detection rate and was months behind. (They were detecting fraudulent bank transact
Re: (Score:1)
If it's a case where they have it deliver the top 10 matches, and let's ASSUME that one of those 10 is always the right person, then yes, it's still a 90% failure rate. Or, depending on how they play the numbers, it could be a "100% success rate" (of our group of 10 including the suspect).
I took the article at face value (yes, always dangerous in this era of low-quality media reporting) to mean that 96% of the time it was "matching" the face to a person, and doing so inaccurately. As in, "Yes, comp says J
Re: (Score:2)
ANY software, I don't care what it is, that produces an inaccurate result 90%+ of the time, is simply worthless IMO.
If you were trying to find one person out of a million, would you consider it worthless to get ten suggestions, one of which was the correct result?
Re: (Score:1)
If I was looking for one person in a million, and the system is wrong 96% of the time as the chief said, then that software would give me 960,000 incorrect answers. lol
Really though, I get the point. I just think that something designed to find WHO a person is isn't worth the cost when it's wrong 96% of the time, assuming his numbers are correct.
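For what it's worth, the two framings in this exchange can be put side by side with some back-of-envelope arithmetic; the 96% figure is the chief's, while the database and shortlist sizes are assumptions for the example:

```python
# Illustrative arithmetic only: the 96% figure is the chief's; the
# database size and shortlist size are assumptions for the example.

database_size = 1_000_000
shortlist_size = 10
top1_error = 0.96  # chance the single best match is the wrong person

# Framing 1: per query, the top hit misidentifies 96% of the time.
print(f"Wrong top-1 answers per 100 queries: {top1_error * 100:.0f}")

# Framing 2: a 10-face shortlist that usually contains the right
# person still shrinks the manual search by five orders of magnitude.
print(f"Faces to review without the tool: {database_size:,}")
print(f"Faces to review with the tool:    {shortlist_size}")
```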
Re: (Score:2)
Re: (Score:3)
Would you accept a calculator that gives you the wrong answer 90%+ of the time, that you have to verify
Type this on a calculator: 1 / 3 * 3 =. Then verify your result is correct.
Type into your calculator some math with very small or large numbers; say you want to calculate the RC constant of an electronic circuit and have values at the nano, micro, and milli scales. Will you not check whether the order of magnitude of the number that comes out is correct?
Or in general, when you make a calculation on a calculator, you do a ballpark calculation in your head so you can spot (typing) errors, which are easy to make a
Re:Anecdotal evidence? (Score:5, Interesting)
*If* the software gives you dozens of "possible matches" that probably includes the real match, then yes, it's potentially very useful. Like an incompetent intern that searches through the mug shot database looking for possible matches - you absolutely shouldn't trust their work, but they can still save you a lot of effort.
If it gives you one or two matches which are probably wrong, it's worse than useless. Which it does depends entirely on the software - do they mention it anywhere?
If it gives you dozens of possible matches, and officers usually just assume the first match is correct... it's still worse than useless. Through no fault of its own, but if cops prove chronically incapable of using their tools correctly, they should have them taken away. The same goes for rubber bullets (which you're supposed to fire at the ground, not people's faces), flash-bang grenades (which are designed to temporarily render victims mostly deaf, blind, and confused - and thus incapable of following orders), no-knock raids (which are designed to make targets panic so that they're easier to kill, and should never be used unless that's the goal), etc., etc., etc.
There's an appallingly long list of tools cops seem incapable of using correctly, and citizens end up paying the price. If they can't use the tools correctly, I say take them away.
Re: (Score:2)
Uh... phrasing?
Wow... evidence all black people look alike, rofl (Score:1, Troll)
And? (Score:3)
And the reason why they use it at all is?
Facial recognition is really only good for identifying white people, and not just any white people: white men with short hair (since men rarely change their hairstyle). Facial recognition is only good at picking out unique features, and since white skin shows every scar, mole, tattoo, or skin imperfection, it is easy to tell one white person from another.
People who have tattoos anywhere near their face are easily identified; it doesn't even need to be facial recognition. This is why there's a negative bias against tattoos in white cultures: they're often seen as a sign of involvement in a cult or gang, and they stick out significantly. That doesn't make having tattoos bad, but it makes you easily identified, so you have to keep your nose clean a lot harder than someone with none, to avoid being the target in a group of otherwise unidentifiable people involved in mischief.
Tool (Score:2, Interesting)
I had some preconceived notions before reading the article based on the title and summary. However, after reading, I can see that the facial recognition software is not an end-all. It's simply a tool for detectives to use while working on cases. They know the failure percentages, but the daily reporting, which is digested by the media, makes it appear that people are instantaneously arrested if there is a match. That's simply not the case.
Re: (Score:2)
They know the failure percentages, but the daily reporting, which is digested by the media, makes it appear that people are instantaneously arrested if there is a match. That's simply not the case.
Actually, that is EXACTLY the case. https://www.npr.org/2020/06/24... [npr.org]
Re: (Score:2)
Re: Tool (Score:5, Insightful)
but the daily reporting, which is digested by the media, makes it appear that people are instantaneously arrested if there is a match. That's simply not the case.
Except we're talking about this because the police did exactly that [nytimes.com].
It clearly wasn't what the police were *supposed* to do. It said clearly on the printout that the match was not reliable enough to use as a basis for an arrest. But they did it anyway, like many other things individual cops do that they're not supposed to: racially profiling drivers, planting drugs [usatoday.com], or putting a knee on the neck of a non-resisting suspect.
The number one thing we need isn't better police rules, it's better police. The problem is that not enough people are willing to do the things you'd have to do to get better police: raise hiring and training standards, and hold officers accountable for unprofessional conduct.
While we're at it, we should look at disciplining the judge who issued the warrant based on a computer printout that said in plain English that it was not probable cause. Going to a judge for a warrant is supposed to act as a check on police overreach.
Re: (Score:2)
Once (Score:3)
The first sentence from your link:
"In what may be the first known case of its kind, a faulty facial recognition match led to ..."
So yeah that happened once. Somebody did something really dumb. Sometimes I do really dumb stuff.
> The number one thing we need isn't better police rules, it's better police. The problem is that not enough people are willing to do the things you'd have to do to get better police: raise hiring and training standards, and hold officers accountable for unprofessional conduct.
Agree
Re: (Score:2)
hey! says:
"The number one thing we need isn't better police rules, it's better police."
Right. But the problem is that anyone who wants to be a cop should probably be disqualified. In a larger sense, anyone who seeks power over other people is suspect. So that includes cops, preachers, teachers, judges, politicians, CEOs, and people with mod points. They need careful scrutiny and a short leash.
Re: (Score:3)
That's really not true. I have two nephews (on different sides of the family) who changed careers to become cops out of an interest in public service. One had moved to New Orleans to work for Habitat for Humanity post-Katrina; the other was an inner-city middle-school guidance counselor.
The thing is the culture isn't supportive of guys like that.
Re: (Score:2)
Did the judge see the printout? A request for a warrant contains statements made under penalty of perjury, and the correctness of those statements is not necessarily verified by the judge. The judge, for example, does not verify the BAC reading stated in an arrest warrant by running their own tests; they rely on the sworn statements of the requestor. Nor do they look at a DNA match to verify the lab reported it correctly; they rely on the lab, and possibly just on a statement by the requestor tha
Re: (Score:2)
Assuming that's not the case, and given the purported failure percentages, is this really a good use of resources? The defund-the-police campaign is more about law enforcement as a government bureaucracy, with so-called people of color getting tied in due to their correlation with urban poverty. There's some irony when the left is wont to paraphrase RR: "The nine most terrifying words in the English language are: I'm from the [police] and I'm here to help."
"The guy" (Score:2)
Is the software programmed to just give the single best match, like on a CSI TV show, or does it allow the police to see all the similar results?
Because in the second case the tool would be quite a bit more useful.
It gives you 30 faces; then you do the final recognition and use other factors to exclude similar-looking people.
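A minimal sketch of that "give me 30 faces" behavior, assuming a face-embedding model whose outputs are compared by cosine similarity; the random vectors stand in for real embeddings, and nothing here is a vendor's actual API:

```python
import numpy as np

# Sketch of a top-k matcher: return the k nearest gallery faces by
# cosine similarity and let a human make the final call. The random
# 128-dim vectors below stand in for real face embeddings.

def top_k_matches(query, gallery, k=30):
    """gallery: dict mapping person_id -> embedding vector."""
    qn = query / np.linalg.norm(query)
    scores = {pid: float(qn @ (emb / np.linalg.norm(emb)))
              for pid, emb in gallery.items()}
    # Highest similarity first; never a single "this is him" verdict.
    return sorted(scores, key=scores.get, reverse=True)[:k]

rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
print(top_k_matches(rng.normal(size=128), gallery, k=5))
```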
Re: (Score:3)
In terms of buck-passing, having the computer tell you who to arrest feels like the safest bet. Getting a list of 12 means you're part owner of the decision.
Re: (Score:2)
Seems like a great way to botch investigations.
Re: (Score:2)
Yeah, it should work like a line-up: the software should be programmed to NEVER give only one match, because that creates bias and false confidence. "The computer says it's him."
Forcing them to actually look and compare with their human eyes for the final result is safer for the public.
Re: (Score:2)
Why use it? (Score:2)
If it's so bad, then why are you paying for it?
Re: (Score:3)
Because it vastly cuts down on the number of images to identify. Say you have a crowd of 10,000 passing through a given point.
This software would churn out maybe 100 shots of people it thought you may be interested in checking against criminal records, as it's flagging a "this looks interesting" condition.
What happens then is that someone goes over the info presented, has a good look at the shot and a mugshot, and asks "Does this look like the right person?" Mostly the answer is no, that would be silly, so you disca
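The funnel described above, as arithmetic; the crowd and flag counts are the comment's hypotheticals, and the 96% false-positive share is the chief's figure:

```python
# Back-of-envelope version of the funnel above; every number is an
# assumption taken from the comment, not measured data.

crowd = 10_000               # faces passing the camera
flagged = 100                # "this looks interesting" hits
false_positive_rate = 0.96   # share of flagged faces that are wrong

true_hits = flagged * (1 - false_positive_rate)
print(f"Human reviews {flagged} faces instead of {crowd:,}")
print(f"Expected genuine matches among them: {true_hits:.0f}")  # 4
```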
Same with most biometrics at scale... (Score:5, Interesting)
If you're at a small business, and using one of those hand-scanny wall things - with the claims of blood vessel detection and whatnot - sure, it'll work fine most of the time.
But it only 'works' because of how lossy it has been programmed to be. Require too much precision, and someone who starts doing too many push-ups and grows a few new blood paths will invalidate their test.
All biometrics testing has a large bullshit layer to it - it's all loose heuristics with a layer of technobabble on top for sales conferences and websites - and again, that's fine most of the time - but none of it is terribly good alone as a security apparatus.
Scale up enough and it all breaks beyond a small office. That same lossy nature ruins its effect; even the "like a fingerprint" analogy is garbage, since fingerprints get mixed up in identification all the time, and the heuristics just don't scale very far as if they were some unique ID.
If you ever want a fun read, read anything about the life of Richard Feynman, a young physicist at the time who worked on the atomic bomb in WW2; a 'highly secure' position if there ever was one.
Not really a security engineer or anything - he constantly got through locks and safes and security measures just as a self-taught layman, just to show them how insecure their methods were. Definitely a smart-ass, but he had the goods at a time when knowledge and evidence could pay for that.
And things really haven't improved; the illusion of security has just been less challenged and made more punishing as time went on. Most 'security' doors aren't even installed into the doorway correctly and can have the latch slipped with almost no effort or skill. Most locks sold have giant bypasses that don't even require picking or security tools, just a path right through the lock where you can press on the locking mechanism itself with an easy shim.
The biometrics stuff is just the bullshit icing on the corruption cake. It's flawed, to be absolutely sure - but like all of it, it's there to shape 'good' behavior, not prevent anything much beyond that.
Ryan Fenton
Re:Same with most biometrics at scale... (Score:4, Interesting)
Look, facial recognition software is still software. If there's one thing all of us should be aware of, it's that software is usually crap. Trusting software, especially a new field of software, is just asking for it.
Statistics and damn lies (Score:2)
If your tool has a 96% failure rate then you shouldn't ever use it ever again.
If your tool has a 50% failure rate, you can flip a coin and get the same results.
If your tool has a 96% success rate, you're still going to be wrong 4% of the time.
Of the 45,000 crime reports in Detroit last year, 4% is 1,800 people misidentified.
This is not Philip K. Dick's Minority Report. This is an utter failure, and police departments everywhere should clamor for a refund.
E
Re: (Score:2)
The question is: is this a 96% failure rate, or a 96% false-positive rate?
If a tool filters out 99% of images and has a 96% false-positive rate among the rest, that reduces the number identified via this path to 450 out of 45,000; of these, circa 20 will be cases it has spot on. It's easier to resource 450 cases on a fast track than to juggle 45,000 you don't have a clue where to start with (those will go the standard route).
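Checking that arithmetic with a quick sketch; both rates are the comment's hypotheticals, not measured figures:

```python
# Reproducing the parent's arithmetic; both rates are hypotheticals
# from the comment, not measured figures.

reports = 45_000
filter_rate = 0.99            # 99% of images never reach a human
false_positive_rate = 0.96    # of those that do, 96% are wrong

flagged = reports * (1 - filter_rate)           # 450
spot_on = flagged * (1 - false_positive_rate)   # 18, i.e. "circa 20"

print(f"Flagged for fast-track review: {flagged:.0f}")
print(f"Spot-on among them: {spot_on:.0f}")
```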
Re: (Score:2)
Re: (Score:2)
If your tool has a 50% failure rate, you can flip a coin and get the same results.
That's not how it works at all. If you've got four people in a line-up, there's a 1/4 = 25% chance you'll find the right person by random pick. If I can make a tool that's right half the time, I've doubled that. Along the way we've redefined the question from how hard the underlying problem is to how successful we are at it.
If I could make a machine that gave me the lottery jackpot numbers half the time, I'd be ecstatic. Sure, I'd only win every other week, but given that it's one set of winning numbers and hundre
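The baseline-versus-tool point as a two-line calculation, using only the numbers from the comment:

```python
# The line-up arithmetic from this comment; numbers are illustrative.

lineup_size = 4
random_baseline = 1 / lineup_size   # 25% by blind guessing
tool_accuracy = 0.50                # "right half the time"

print(f"Random pick: {random_baseline:.0%}")
print(f"Tool:        {tool_accuracy:.0%}")
print(f"Lift over guessing: {tool_accuracy / random_baseline:.1f}x")
```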
That's how it works... Statistics! (Score:3)
A lot of people mistake what a 95% match chance means.
95% means that is the chance it will match with *something*, not necessarily that it will match the CORRECT something.
But for devil's advocacy, let's just say it is a 95% correct match rate. Say you have 100 people; a 95% match rate means it should correctly match 95 of the 100. Think about that for a minute: a correct match 95% of the time; how is this bad? Well, think about it like this instead: 95% of the time the match is right, but 5% of the time the match is not correct, and the police now think you are the bad guy (you really look a lot like him, after all) even though you are innocent. That means for every crime, 5 people out of the hundred are at risk of being falsely implicated by this technology. Now scale that to 100k people: that means 5,000 people will not match correctly, and if anyone knows about statistics, this means that for facial recognition to really be good, it has to reach 99.999% accuracy to make the number of false positives manageable for detectives, because if 5,000 matches come back, they don't have the resources to interview them all!
Now just imagine if you have a million souls in your database. The more you have to match against, the higher the precision has to be, and it might shock you, but you have more than one doppelganger out there! A lot of people look a lot alike, and you would not quickly notice the differences until you saw them side by side.
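A quick sketch of how the expected false-positive count scales with database size and accuracy; all figures are the poster's hypotheticals:

```python
# Base-rate arithmetic from the two paragraphs above; the accuracy
# levels and population sizes are the poster's hypotheticals.

def expected_false_positives(population, accuracy):
    return population * (1 - accuracy)

for population in (100, 100_000, 1_000_000):
    for accuracy in (0.95, 0.999, 0.99999):
        fp = expected_false_positives(population, accuracy)
        print(f"pop={population:>9,}  acc={accuracy:<7}  "
              f"expected false positives={fp:,.0f}")
```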
Re: (Score:2)
Depends: if the software returns 25 matches for a face, then 96% wrong per candidate would be an excellent match rate.
Re: (Score:2)
It wouldn't be nearly as bad if the police didn't go and arrest the false matches, often using violence.
An often-overlooked problem with new forensic techniques is that they make the police lazy. They don't bother to check; if the computer says it's a match, they send some uniforms to go make an arrest.
Lots of problems (Score:2)
It's just the 21st Century equivalent (Score:3)
End the drug war, fund social services properly so that we're not overfunding cops to do wellness checks and psychotherapy, run a universal housing program to end homelessness, and these police-state problems go away.
If you don't want to pay/do all that then you better get comfy with a militarized police state. Because that's your only other option.
They forgot to say "enhance". (Score:1)
Eyewitnesses suck too! More witnesses = more suck (Score:5, Informative)
"In the group without any actors, 32% of participants gave incorrect statements – which was put down to factors such as poor eyesight and memory. But when actors were planted in the group, 52% of the “real” participants gave an incorrect statement. And worryingly, when more than two actors were planted in a group, almost 80% of the participants ended up giving the same incorrect statement and identifying an innocent man as the culprit."
https://theconversation.com/ne... [theconversation.com]
https://www.pnas.org/content/1... [pnas.org]
Percentages (Score:2)
I'm sure they did; then again, the D is like 77% African American.
I'm sure they will be sued for this - again.
Btw: Chief Craig seems to be a decent cop for what it's worth.
What's the point of facial recognition (Score:2)
If you are out in public you should be wearing a mask.
He is an idiot (Score:3, Insightful)
The real problem is that ignoramuses like this police chief buy into technology they do not understand that is sold to them by other ignoramuses -- though a little smarter -- who also don't understand the technology. Kakistocracy at its finest.
Don't worry, slashdotters (Score:2)
It'll correctly identify most of you, esp. the supporters of the Orange Hairball, since you're white.
Where was QA?! (Score:2)
How this passed quality assurance will likely remain a mystery. The chief himself may have bought it from a political friend and thought he got a good deal. It's easy for him now, in the current climate, to reveal the crappy tech and play the innocent party, but I don't buy it. Whatever it was, there must have been some very dodgy business going on for this to end up in a police department.