Detroit Police Chief: Facial Recognition Software Misidentifies 96% of the Time (vice.com)

Detroit police have used highly unreliable facial recognition technology almost exclusively against Black people so far in 2020, according to the Detroit Police Department's own statistics. From a report: The department's use of the technology gained national attention last week after the American Civil Liberties Union and New York Times brought to light the case of Robert Julian-Borchak Williams, a man who was wrongfully arrested because of the technology. In a public meeting Monday, Detroit Police Chief James Craig admitted that the technology, developed by a company called DataWorks Plus, almost never brings back a direct match and almost always misidentifies people. "If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time," Craig said. "That's if we relied totally on the software, which would be against our current policy ... If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify."

  • by Tailhook ( 98486 ) on Wednesday July 01, 2020 @04:50PM (#60251770)

    If it wasn't for the huge number of white Republicans running everything in Detroit I'm sure this could have been prevented.

    • Re: (Score:2, Insightful)

      by greenwow ( 3635575 )

      Exactly. They're still recovering from their last Republican mayor in 1962. It takes generations to recover from electing a Republican.

    • Comment removed based on user account deletion
  • Detroit's population is 80% black so it shouldn't be a surprise. It's also no surprise that such sensationalism is the lede and nowhere in the article does it mention the demographics of the city.

    • Also, wouldn't you look at the photo of the person and at the actual person and say "no, it's not them"? Or are police really so stupid as to believe that what they see on TV and in movies is accurate, even when they keep getting false matches?

      If so, I suggest they expose themselves to radioactive waste to get superpowers. Really, DON'T: that was sarcasm. You will die, and I don't want anyone to die or to take my statement as a serious suggestion that they or someone else should. My favorite clip illustrating this is https://ww [youtube.com]

      • by malkavian ( 9512 )

        They do say in the headline that it's not their policy to rely entirely on the software (so yes, they'd be using real eyes).
        The aim of the software is to find an interesting signal that you may want to look at more closely. From the given stats, one image in 20 that it provides you for a closer look is pretty interesting (especially if you count the sheer number that it simply filters out that aren't likely of interest at all). That's the bit that's missing in the narrative; it makes it sound as though you'd simply arrest whoever the software spits out.

        • one image in 20 that it provides you for a closer look is pretty interesting.

          Indeed. Those are actually pretty good results.

          So you run the match, get 25 results. 24 of them have no connection to the victim. The 25th is the victim's ex-boyfriend with a domestic violence restraining order.

          Maybe the police can figure it out from there.

          • Re:Detroit (Score:4, Insightful)

            by Sique ( 173459 ) on Wednesday July 01, 2020 @06:48PM (#60252160) Homepage
            Actually, it's a little different.

            You have 25 cases in front of you. The computer gives you a list of 25 likely suspects, one for each case. Only in one of the cases is the computer right.

            • So... the suspect only has to commit 25 crimes and you'll get them.

              They'll all be behind bars within a couple of weeks!

              • by Sique ( 173459 )
                Why not round up all people and put them behind bars? Then you can be sure you have 100% of the perpetrators in prison.
        • Not the headline, but the summary (that's me being pedantic). But also, in the summary it says:

          a man who was wrongfully arrested because of the technology

          That is not true, or at least very deceptive: it is because a police officer wrongly identified him, after a computer system that is known for misidentifying people gave a suggestion. I would be very surprised if misidentification never happened without facial recognition. The headline is designed to spark outrage over a very mundane thing.

      • Comment removed based on user account deletion
    • by rtb61 ( 674572 )

      Do you want to know the real fault? The bullshit used to sell the system, and one word missing from the results screen. That word: 'POSSIBLE', placed before 'match'. Then people would visually check the claim, since the system only claims a possible match. Not that idiots will read the screen all the time, but make it big enough and it might sink in: do not just obey what the computer outputs; stop and think about it first.

    • I bet if you called them out on this they would try to play the common knowledge card. I knew Detroit was a very black city but I didn't know it was 80% black.

  • If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify.

    Sir, if you have to actually do work to verify that the software is correct, I don't see a problem with your process. You SHOULD be manually checking the software's results before you start rounding up the usual suspects and hauling them off to jail, because YOU are going to be liable for the wrongful-arrest suits, not the computer. Yeah, the software seems to be less than marginally helpful, but that's between you and the vendor you purchased it from, and you should take that up with them. My guess is that the vendor oversold what it can do.

    • by egyas ( 1364223 )

      ANY software, I don't care what it is, that produces an inaccurate result 90%+ of the time, is simply worthless IMO. Would you accept a calculator that gives you the wrong answer 90%+ of the time, that you have to verify? Or an online encyclopedia? WebMD? etc, etc.

      I sure as hell wouldn't.

      • ANY software, I don't care what it is, that produces an inaccurate result 90%+ of the time, is simply worthless IMO. Would you accept a calculator that gives you the wrong answer 90%+ of the time, that you have to verify? Or an online encyclopedia? WebMD? etc, etc.

        I sure as hell wouldn't.

        SOME software solutions are not 100% deterministic, and may not always return the correct answers. A primary example is machine learning processes like neural nets, where sometimes a 90% or lower correct rate is actually pretty good given the availability of training data. I've seen case studies where the operators were tickled pink to get a 70% detection rate, because their hand-coded solution was struggling to get a 30% detection rate and was months behind. (They were detecting fraudulent bank transactions.)

        • by egyas ( 1364223 )

          If it's a case where they have it deliver the top 10 matches, and let's ASSUME that one of those 10 is always the right person, then yes, it's still a 90% failure rate. Or, depending on how they play the numbers, it could be a "100% success rate" (of our group of 10 including the suspect).

          I took the article at face value (yes, always dangerous in this era of low-quality media reporting) to mean that 96% of the time it was "matching" the face to a person, and doing so inaccurately. As in, the computer confidently names the wrong person as the match.

      • ANY software, I don't care what it is, that produces an inaccurate result 90%+ of the time, is simply worthless IMO.

        If you were trying to find one person out of a million, would you consider worthless getting ten suggestions, one of which would be the correct result?

        • by egyas ( 1364223 )

          If I was looking for one person in a million, and the system is wrong 96 percent of the time as the chief said, then that software would give me 960,000 incorrect answers. lol

          Really though, I get the point. I just think that something designed to find WHO a person is isn't worth the cost when it's wrong 96 percent of the time, assuming his numbers are correct.

          • But in that case it would misidentify a person 99.999% of the time. Or maybe I misunderstood what "misidentify someone" meant? It's not clear from the article what specific type of error it refers to.
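            A quick sketch of the two possible readings, in Python (the 25-candidate list size is hypothetical, borrowed from the 25-results example upthread):

              # Two readings of "misidentifies 96% of the time" (numbers assumed).
              candidates_per_search = 25   # hypothetical list size

              # Reading 1: per-candidate error. One true hit among 25 candidates
              # means 24/25 = 96% of returned faces are wrong, even if every
              # search actually surfaces the right person.
              per_candidate_wrong = (candidates_per_search - 1) / candidates_per_search
              print(f"Wrong candidates per list: {per_candidate_wrong:.0%}")   # 96%

              # Reading 2: per-search failure. 96% of searches return no usable
              # lead at all; only 4 in 100 surface the actual person.
              per_search_failure = 0.96
              print(f"Searches with a real hit: {1 - per_search_failure:.0%}")  # 4%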
      • by xonen ( 774419 )

        Would you accept a calculator that gives you the wrong answer 90%+ of the time, that you have to verify

        Type on a calculator: 1 / 3 * 3 =, and verify that the result is correct.

        Type into your calculator some math with very small or large numbers; say you want to calculate the RC constant of an electronic circuit and have values at the nano, micro, and milli scales. Won't you check whether the order of magnitude that comes out is correct? In general, when you make a calculation on a calculator, you do a ballpark calculation in your head so you can spot (typing) errors, which are easy to make.

        • by Immerman ( 2627577 ) on Wednesday July 01, 2020 @09:05PM (#60252432)

          *If* the software gives you dozens of "possible matches" that probably includes the real match, then yes, it's potentially very useful. Like an incompetent intern that searches through the mug shot database looking for possible matches - you absolutely shouldn't trust their work, but they can still save you a lot of effort.

          If it gives you one or two matches which are probably wrong, it's worse than useless. Which one it does depends entirely on the software; do they mention it anywhere?

          If it gives you dozens of possible matches, and officers usually just assume the first match is correct... it's still worse than useless. Through no fault of its own, but if cops prove chronically incapable of using their tools correctly, they should have them taken away. The same goes for rubber bullets (which you're supposed to fire at the ground, not people's faces), flash-bang grenades (which are designed to temporarily render victims mostly deaf, blind, and confused - and thus incapable of following orders), no-knock raids (which are designed to make targets panic so that they're easier to kill, and should never be used unless that's the goal), etc., etc., etc.

          There's an appallingly long list of tools cops seem incapable of using correctly, and citizens end up paying the price. If they can't use the tools correctly, I say take them away.

  • by Kisai ( 213879 ) on Wednesday July 01, 2020 @05:00PM (#60251818)

    And the reason why they use it at all is?

    Facial recognition is really only good for identifying white people, and not just any white people: white men with short hair (since men rarely change their hairstyle). Facial recognition is only good at picking out unique features, and since white skin shows every scar, mole, tattoo or skin imperfection, it is easy to tell one white person from another.

    People who have tattoos anywhere near their face are easily identified; it doesn't even need to be facial recognition. This is why there's a negative bias against getting tattoos in white cultures, where it's often seen as a sign of involvement in a cult or gang, and it sticks out significantly. That doesn't make having tattoos bad, but it makes you easily identified, so you have to work a lot harder at keeping your nose clean than someone with none, to avoid being the target in a group of otherwise unidentifiable people involved in mischief.

  • Tool (Score:2, Interesting)

    by unixcorn ( 120825 )

    I had some preconceived notions before reading the article based on the title and summary. However, after reading, I can see that the facial recognition software is not an end-all. It's simply a tool for detectives to use while working on cases. They know the failure percentages, but the daily reporting, which is digested by the media, makes it appear that people are instantaneously arrested if there is a match. That's simply not the case.

    • by ttyler ( 20687 )

      They know the failure percentages, but the daily reporting, which is digested by the media, makes it appear that people are instantaneously arrested if there is a match. That's simply not the case.

      Actually, that is EXACTLY the case. https://www.npr.org/2020/06/24... [npr.org]

    • Re: Tool (Score:5, Insightful)

      by hey! ( 33014 ) on Wednesday July 01, 2020 @05:39PM (#60251960) Homepage Journal

      but the daily reporting, which is digested by the media, makes it appear that people are instantaneously arrested if there is a match. That's simply not the case.

      Except we're talking about this because the police did exactly that [nytimes.com].

      It clearly wasn't what the police were *supposed* to do. It said clearly on the printout that the match was not reliable enough to use as a basis for an arrest. But they did it anyway, like many other things individual cops do that they're not supposed to: racially profiling drivers, planting drugs [usatoday.com], or putting their knee on the neck of a non-resisting suspect.

      The number one thing we need isn't better police rules, it's better police. The problem is that not enough people are willing to do the things you'd have to do to get better police: raise hiring and training standards, and hold officers accountable for unprofessional conduct.

      While we're at it, we should look at disciplining the judge who issued the warrant based on a computer printout that said in plain English that it was not probable cause. Going to a judge for a warrant is supposed to act as a check on police overreach.

      • Comment removed based on user account deletion
      • The first sentence from your link:

        "In what may be the first known case of its kind, a faulty facial recognition match led to ..."

        So yeah that happened once. Somebody did something really dumb. Sometimes I do really dumb stuff.

        > The number one thing we need isn't better police rules, it's better police. The problem is not enough people are willing to do the things you'd have to do to get a better police: raise hiring and training standards, and hold officers accountable for non-professional conduct.

        Agree

      • by swell ( 195815 )

        hey! says:
        "The number one thing we need isn't better police rules, it's better police."

        Right. But the problem is that anyone who wants to be a cop should probably be disqualified. In a larger sense, anyone who seeks power over other people is suspect. So that includes cops, preachers, teachers, judges, politicians, CEOs, and people with mod points. They need careful scrutiny and a short leash.

        • by hey! ( 33014 )

          That's really not true. I have two nephews (on different sides of the family) who changed careers to become cops out of an interest in public service. One had moved to New Orleans to work for Habitat for Humanity post-Katrina; the other was an inner-city middle school guidance counselor.

          The thing is, the culture isn't supportive of guys like that.

      • by uncqual ( 836337 )

        Did the judge see the printout? A request for a warrant contains statements made under penalty of perjury, and the correctness of those statements is not necessarily verified by the judge. The judge, for example, does not verify the BAC reading stated in an arrest warrant by running their own tests; they rely on the sworn statements of the requestor. Nor do they look at a DNA match to verify that the lab reported it correctly; they rely on the lab, and possibly just on a statement by the requestor that the lab reported a match.

    • by g01d4 ( 888748 )

      That's simply not the case.

      Assuming it's not the case, and given the purported failure percentages, is this really a good use of resources? The defund-the-police campaign is more about law enforcement as a government bureaucracy, with so-called people of color getting tied in due to their correlation with urban poverty. There's some irony when the left is wont to paraphrase Ronald Reagan: "The nine most terrifying words in the English language are: I'm from the [police] and I'm here to help."

  • Is the software programmed to just give the best match, like on a CSI TV show, or does it allow the police to see all the similar results?
    Because in the latter case the tool would be quite a bit more useful: it gives you 30 faces, then you do the final recognition and use other factors to exclude similar-looking people.

    • The fingerprint software just gives the best match, no matter what. Which is how a lawyer in Oregon got arrested for the train bombing in Spain back in the aughts. (He was not actually involved.)

      In terms of buck-passing, having the computer tell you who to arrest feels like the safest bet. Getting a list of 12 means you're part owner of the decision.

      • by Z80a ( 971949 )

        Seems like a great way to botch investigations.

      • by Kaenneth ( 82978 )

        Yeah, it should work like a line-up: the software should be programmed to NEVER give only one match, because a single match creates bias and false confidence. "The computer says it's him."

        Forcing them to actually look and compare with their human eyes for the final result is safer for the public.

        • The software may be able to show any number of matches. The problem in the Oregon case was that the "best" match was not very good. This is sort of the opposite of a line-up, where you have one actual suspect and several randos: the software gives you several randos who become suspects.
  • If it's so bad, then why are you paying for it?

    • by malkavian ( 9512 )

      Because it vastly cuts down on the images a human has to identify. You have a crowd of, say, 10,000 passing through a given point.
      This software would churn out maybe 100 shots of people it thought you might be interested in checking against criminal records, as it's flagging a "this looks interesting" condition.
      What happens then is that someone goes over the info presented, has a good look at the shot and a mugshot, and asks "does this look like the right person?" Mostly the answer is no, so you discard it and move on.
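      Back-of-the-envelope, with the (assumed) numbers above, the triage looks like this in Python:

        # Triage arithmetic from the comment above (all numbers assumed).
        crowd = 10_000   # people passing the camera
        flagged = 100    # shots the software flags as "interesting"

        review_fraction = flagged / crowd
        print(f"A human reviews {flagged} shots, i.e. {review_fraction:.0%} of the crowd")
        # Even if most of the 100 flags are wrong, a person now compares 100
        # shots against mugshots instead of 10,000: filtering, not identifying.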

  • by RyanFenton ( 230700 ) on Wednesday July 01, 2020 @05:22PM (#60251888)

    If you're at a small business using one of those hand-scanner wall things, with the claims of blood-vessel detection and whatnot, sure, it'll work fine most of the time.

    But it only 'works' because of how lossy it has been programmed to be. Require too much precision, and someone does too many push-ups, grows a few new blood paths, and invalidates their own test.

    All biometrics testing has a large bullshit layer to it - it's all loose heuristics with a layer of technobabble on top for sales conferences and websites - and again, that's fine most of the time - but none of it is terribly good alone as a security apparatus.

    Scale up enough and it all breaks beyond a small office. That same lossy nature ruins its effect; even the "like a fingerprint" analogy is garbage, since fingerprints get mixed up in identification all the time, and the heuristics just don't scale very far like they're some unique ID.

    If you ever want a fun read, read anything about the life of Richard Feynman, a young physicist at the time who worked on the nuclear bombs in WW2 - a 'highly secure' position if there ever was one.

    Not really a security engineer or anything - he constantly got through locks and safes and security measures just as a self-taught layman, just to show them how insecure their methods were. Definitely a smart-ass, but he had the goods at a time when knowledge and evidence could pay for that.

    And things really haven't improved; just the illusion of security has been less challenged and made more punishing as time went on. Most 'security' doors aren't even installed into the doorway correctly, and can have the latch slipped with almost no effort or skill. Most locks sold have giant bypasses that don't even require picking or security tools, just a path right through the lock where you can press on the locking mechanism itself with an easy shim.

    The biometrics stuff is just the bullshit icing on the corruption cake. It's flawed, to be absolutely sure - but like all of it, it's there to shape 'good' behavior, not prevent anything much beyond that.

    Ryan Fenton

  • If your tool has a 96% failure rate then you shouldn't ever use it ever again.

    If your tool has a 50% failure rate, you can flip a coin and get the same results.

    If your tool has a 96% success rate, you're still going to be wrong 4% of the time.
    Of the 45,000 crime reports in Detroit last year, 4% is 1,800 people misidentified.

    This is not Philip K. Dick's Minority Report. This is an utter failure, and police departments everywhere should clamor for a refund.


    • by malkavian ( 9512 )

      The question is: is this a 96% failure rate, or a 96% false-positive rate?
      If a tool filters out 99% of cases up front, with a 96% false-positive rate among the rest, that reduces the cases identified via this path to 450 out of 45,000; of those, circa 20 will be ones it has spot on. It's easier to resource 450 cases on a fast track than to triage 45,000 where you don't have a clue where to start (and those will go the standard route).
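      Spelled out in Python (the filter and false-positive rates are assumptions, as above):

        # The arithmetic above, spelled out (rates are assumed).
        reports = 45_000
        filter_out = 0.99            # fraction discarded up front
        false_positive_rate = 0.96   # of the remainder, fraction that are wrong

        fast_track = reports * (1 - filter_out)            # 450 cases
        genuine = fast_track * (1 - false_positive_rate)   # ~18, "circa 20"
        print(f"Fast-tracked: {fast_track:.0f}, likely genuine: {genuine:.0f}")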

    • Comment removed based on user account deletion
    • by Kjella ( 173770 )

      If your tool has a 50% failure rate, you can flip a coin and get the same results.

      That's not how it works at all. If you've got four people in a line-up, there's a 1/4 = 25% chance you'll find the right person by coin toss. If I can make a tool that's right half the time, I've doubled that. Along the way we've redefined the question from how hard the underlying problem is to how successful we are at it.

      If I could make a machine that'd give me the lottery jackpot numbers half the time, I'd be ecstatic. Sure, I'd only win every other week, but given that it's one set of winning numbers against hundreds of millions of possible combinations, that would be a spectacular hit rate.
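      A toy simulation of the baseline point (the line-up size is hypothetical):

        # "50% right" only equals a coin flip when the baseline is 50-50.
        # With a 4-person line-up, random guessing is right ~25% of the time,
        # so a tool that's right half the time doubles the baseline.
        import random

        TRUE_SUSPECT = 0
        line_up = [0, 1, 2, 3]
        trials = 100_000

        hits = sum(random.choice(line_up) == TRUE_SUSPECT for _ in range(trials))
        print(f"Random guessing: {hits / trials:.0%}")   # ~25%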

  • by SirAstral ( 1349985 ) on Wednesday July 01, 2020 @05:33PM (#60251946)

    A lot of people mistake what a 95% match chance means.

    95% means that is the chance it will match with something... not necessarily that it would match the CORRECT something.

    But for devil's advocacy, let's just say it is a 95% correct match rate. Incoming example: say you have 100 people. A 95% match rate means it should correctly match 95 of the 100. Think about it for a minute... that is a correct match 95% of the time. How is this bad? Well, think about it like this instead: 95% of the time the match is right, but 5% of the time the match is not correct, and the police now think you are the bad guy (you really do look a lot like him, after all) but you are innocent. That means for every crime, 5 people out of the hundred are at risk of being falsely implicated by this technology. Now adjust that for 100k people: that means 5,000 people are not going to match correctly, and if anyone knows about statistics... this means that for facial recognition to really be good, it has to get to 99.999% accuracy to make the number of false positives manageable for detectives, because we already know that if 5,000 matches come back, they don't have the resources to interview them all!

    Now... just imagine you have a million souls in your database. The more you have to match against, the higher the precision has to be... and it might shock you, but you have more than one doppelganger out there! There are a lot of people who look a lot alike, and you would not quickly notice the differences until you see them side by side!
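    The scaling argument as a Python sketch (the accuracy figures are the ones above):

      # False positives grow linearly with database size at fixed accuracy,
      # which is why the required precision climbs as the database grows.
      def expected_false_positives(db_size: int, accuracy: float) -> float:
          # Assumes each non-matching record is wrongly flagged with
          # probability (1 - accuracy), independently.
          return db_size * (1 - accuracy)

      for db_size in (100, 100_000, 1_000_000):
          for accuracy in (0.95, 0.99999):
              fp = expected_false_positives(db_size, accuracy)
              print(f"db={db_size:>9,}  acc={accuracy:.5f}  ~{fp:,.0f} false hits")
      # 100 people at 95% -> ~5; 100k -> ~5,000; a million at 99.999% -> ~10.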

    • by Ksevio ( 865461 )

      Depends. If the software returns 25 matches for a face, then 96% would be an excellent match rate.

    • by AmiMoJo ( 196126 )

      It wouldn't be nearly as bad if the police didn't go and arrest the false matches, often using violence.

      An often overlooked problem with new forensic techniques is that they make the police lazy. They don't bother to check; if the computer says it's a match, they send some uniforms to go make an arrest.

  • Problem 1: There's a theory that about one in a million people looks so similar to you that your friends and family wouldn't be able to tell you apart if you stood side by side; closer than the average identical twin. So if your photo database is big enough (a million people) and you have a photo of a criminal, you will find someone in your database who is an exact match. If the criminal is in the database, you will find two :-) Since photos of black people are notoriously difficult, it may be more like one in 100,000.
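    The premise as arithmetic, in Python (the lookalike rates are the theory's, not measured):

      # Expected number of "indistinguishable" lookalikes in a photo database,
      # under the one-in-a-million theory (one in 100,000 for photos that are
      # harder to distinguish). Purely illustrative.
      for db_size in (100_000, 1_000_000, 10_000_000):
          for lookalike_rate in (1e-6, 1e-5):
              expected = db_size * lookalike_rate
              print(f"db={db_size:>10,}  rate={lookalike_rate:.0e}  "
                    f"expected lookalikes: {expected:.1f}")
      # Once the expectation reaches ~1, a convincing false match is likely.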
  • by rsilvergun ( 571051 ) on Wednesday July 01, 2020 @05:42PM (#60251972)
    It's the software equivalent of a drug-sniffing dog: it's just there to establish probable cause so the cops can poke their nose where they don't belong.

    End the drug war, fund social services properly so that we're not overfunding cops to do wellness checks and psychotherapy, do a universal housing program to end homelessness, and these police-state problems go away.

    If you don't want to pay for all that, then you'd better get comfy with a militarized police state, because that's your only other option.
  • That usually fixes it.
  • by mveloso ( 325617 ) on Wednesday July 01, 2020 @07:06PM (#60252212)

    "In the group without any actors, 32% of participants gave incorrect statements – which was put down to factors such as poor eyesight and memory. But when actors were planted in the group, 52% of the “real” participants gave an incorrect statement. And worryingly, when more than two actors were planted in a group, almost 80% of the participants ended up giving the same incorrect statement and identifying an innocent man as the culprit."

    https://theconversation.com/ne... [theconversation.com]

    https://www.pnas.org/content/1... [pnas.org]

  • "Detroit police have used highly unreliable facial recognition technology almost exclusively against Black people so far in 2020"

    I'm sure they did; then again, the D is like 77% African American.

    I'm sure they will be sued for this - again.

    Btw: Chief Craig seems to be a decent cop for what it's worth.
  • If you are out in public you should be wearing a mask.

  • He is an idiot (Score:3, Insightful)

    by groobly ( 6155920 ) on Thursday July 02, 2020 @12:15PM (#60254346)
    The police chief is an idiot. For starters, apparently he never got past high school, but he's an expert on facial recognition now. Most likely the problem is that most perps are not already in the mug shot database, so when the software picks the closest n candidates, none of them is the actual perp. So what?

    The real problem is that ignoramuses like this police chief buy into technology they do not understand that is sold to them by other ignoramuses -- though a little smarter -- who also don't understand the technology. Kakistocracy at its finest.
  • It'll correctly identify most of you, esp. the supporters of the Orange Hairball, since you're white.

  • How this passed quality assurance will likely remain a mystery. The chief himself may have bought it from a political friend and thought he got a good deal. It's easy for him now, in the current climate, to reveal the crappy tech and play the innocent party, but I don't buy it. Whatever it was, there must have been some very dodgy business going on for this to end up in a police department.
