UK Police Say 92 Percent False Positive Facial Recognition Is No Big Deal (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: A British police agency is defending its use of facial recognition technology at the June 2017 Champions League soccer final in Cardiff, Wales -- among several other instances -- saying that despite the system having a 92-percent false positive rate, "no one" has ever been arrested due to such an error. New data about the South Wales Police's use of the technology obtained by Wired UK and The Guardian through a public records request shows that of the 2,470 alerts from the facial recognition system, 2,297 were false positives. In other words, nine out of 10 times, the system erroneously flagged someone as being suspicious or worthy of arrest.

In a public statement, the SWP said that it has arrested "over 450" people as a result of its facial recognition efforts over the last nine months. "Of course, no facial recognition system is 100 percent accurate under all conditions. Technical issues are normal to all face recognition systems, which means false positives will continue to be a common problem for the foreseeable future," the police wrote. "However, since we introduced the facial recognition technology, no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained." The agency added that it is "very cognizant of concerns about privacy, and we have built in checks and balances into our methodology to make sure our approach is justified and balanced."

  • by hcs_$reboot ( 1536101 ) on Monday May 07, 2018 @05:43PM (#56570310)
    Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.
    • by ShanghaiBill ( 739463 ) on Monday May 07, 2018 @05:57PM (#56570388)

      Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.

      Indeed. If you are looking for a suspect in a city of a million people, and this system flags 10 people, and upon double checking you find that one of the ten is the suspect, then that is pretty darn good.

      The false positive rate, by itself, tells you nothing about the usefulness of a test.
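
      To put rough numbers on that point (illustrative assumptions, not figures from the article): suppose 200 wanted faces in a crowd of 65,000, an assumed 87% chance of flagging a wanted face, and an assumed 3.5% chance of a false alarm per innocent face. A minimal C sketch of the arithmetic:

        #include <stdio.h>

        /* Illustrative numbers only: when targets are rare, most alerts
           are false even though the per-face error rates look modest. */
        int main(void) {
            double crowd = 65000.0, targets = 200.0;
            double sensitivity = 0.87;  /* assumed P(alert | wanted face)   */
            double false_alarm = 0.035; /* assumed P(alert | innocent face) */

            double true_pos  = targets * sensitivity;
            double false_pos = (crowd - targets) * false_alarm;

            printf("alerts: %.0f, real matches: %.0f\n",
                   true_pos + false_pos, true_pos);
            printf("share of alerts that are false: %.1f%%\n",
                   100.0 * false_pos / (true_pos + false_pos));
            return 0;
        }

      With those assumed inputs the sketch prints about 2,442 alerts, roughly 93% of them false -- close to the reported figure -- even though only 3.5% of innocent faces trigger an alarm.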

      • by Anonymous Coward

        Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.

        Indeed. If you are looking for a suspect in a city of a million people, and this system flags 10 people, and upon double checking you find that one of the ten is the suspect, then that is pretty darn good.

        The false positive rate, by itself, tells you nothing about the usefulness of a test.

        The real problem is that most violent criminals are black and facial recognition has a harder time with black faces because of lower contrast. Check the crime stats for the US and any other nation with a significant black minority population. Blacks commit violent crimes far in excess of their percentage of the population. This includes nations like Sweden which never had slavery or Jim Crow. These are facts.

        • The real problem is that most violent criminals are black and facial recognition has a harder time with black faces because of lower contrast.

          I'm pretty sure that a lot of people will be happy with this anti-profiling affirmative action.

        • by jez9999 ( 618189 )

          Also, British politics is extremely PC, so burkas are allowed. Which means criminals just need to dress up in a black tent and nobody can make them show their face.

        • Wrong. We recently had an article here that informed us it's not contrast or training data, it's racist white male programmers who are to blame for facial recognition accuracy. Add it to the list.
      • Comment removed based on user account deletion
      • by jdharm ( 1667825 )

        ...If you are looking for a suspect in a city of a million people, and this system flags 10 people, ...

        You're quite right, flagging 10 people out of a million would be pretty darn good.

        The problem is that the article said that it flagged 2470 people at the UEFA Champions League Final which had an attendance of just over 65k. The system flagged 3.8% of the population as suspect.

        So using your example of a city with a million people, and assuming as you have that the rates hold no matter the population size, the system would flag 38,000 people, not ten. Of those 38k, nearly 35k are completely innocent.

        In no world could I represent those numbers as "pretty darn good".
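
        For what it's worth, the scaling arithmetic here checks out; a quick sketch (assuming, as the parent does, that the observed rates hold at any population size):

          #include <stdio.h>

          /* Assumes the rates observed at the stadium scale linearly. */
          int main(void) {
              double population = 1000000.0;
              double flag_rate  = 2470.0 / 65000.0; /* ~3.8% of faces flagged */
              double false_rate = 2297.0 / 2470.0;  /* ~93% of flags wrong    */

              double flagged = population * flag_rate;
              printf("flagged: %.0f, of which innocent: %.0f\n",
                     flagged, flagged * false_rate);
              return 0;
          }

        That prints 38,000 flagged, about 35,338 of them innocent -- matching the numbers above.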

        • In no world could I represent those numbers as "pretty darn good".

          It's pretty darn good when you consider the fact that human beings couldn't possibly cross-reference even a tiny fraction of those 1 million faces against a database of tens or hundreds of thousands of known criminals, whereas they can fairly easily check 38,000 specific matches picked up by a computer. I'd go beyond "pretty darn good"; it's downright revolutionary.

      • by eth1 ( 94901 )

        Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.

        The thing is, if the false positive rate is that bad, I would argue that a "match" isn't good enough to constitute reasonable suspicion/probable cause for an arrest. If they stopped me for an "intervention", based on a system that bad, I'm either free to go, or under arrest. If they won't let me leave, I'm under arrest, and would consider filing a suit over it (mainly to make horribly inaccurate systems like this less attractive to LE).

    • by Anonymous Coward

      They are clever lads. They can figure something out.

      01001001 01100110 00100000 01111001 01101111 01110101 00100000 01100011 01101111 01101110 01110110 01100101 01110010 01110100 01100101 01100100 00100000 01110100 01101000 01101001 01110011 00100000 01110100 01101111 00100000 01110100 01100101 01111000 01110100 00101100 00100000 01110100 01101000 01100101 01101110 00100000 01111001 01101111 01110101 00100000 01100001 01110010 01100101 00100000 01100001 00100000 01100110 01100001 01100111 01100111 01101111 0

    • Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.

      That is positively bad.

      • Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.

        That is positively bad.

        That is bad, IF the police are in the habit of just shooting suspects. While there are cities in the USA that I would expect to do that sort of thing, I've never heard that the Brits are all that big on "shoot first, question the corpse"...

        On the other hand, if all they do is pass the pictures on to a human for follow-up (which follow-up does not include "shoot them, then ask questions"), then it's not that big a deal.

        • Rate of 8% successful, meaning almost 1 in 10 people are correctly identified. Not that bad.

          That is positively bad.

          That is bad, IF the police are in the habit of just shooting suspects. While there are cities in the USA that I would expect to do that sort of thing, I've never heard that the Brits are all that big on "shoot first, question the corpse"...

          On the other hand, if all they do is pass the pictures on to a human for follow-up (which follow-up does not include "shoot them, then ask questions"), then it's not that big a deal.

          Doood! The whole thread is "let's be positive", so I wrote this is positively bad. You know, like a play on words? He said be positive, so I used it in a sentence. Positively. Wow, tough crowd tonight.

        • It's not just passing on a photo. People are temporarily detained and ID-checked. Their response to this is "there's been no complaints". I guess the British have more tolerance for being stopped by the police, but I find that entirely unacceptable.
          • Comment removed based on user account deletion
            • by Wulf2k ( 4703573 )

              And what if you did miss your tram? Still not too bad?

              And your face is obviously one the system has a problem with, so what if you get stopped again the next day? And the one after?

              And each day you miss your tram, they check your id, and let you go home.

              No biggy, right?

              What if you get a little lippy the 5th time it happens?

              Have some proper respect for authority, will you? Because it seems you're standing 0.01 meters too close to the tracks, which is a fineable offense under subsection j paragraph 2 of th

  • by larryjoe ( 135075 ) on Monday May 07, 2018 @05:55PM (#56570372)

    despite the system having a 92-percent false positive rate, "no one" has ever been arrested due to such an error

    I may have concerns about the civil liberty impact of broad-net surveillance systems in general, but the algorithmic deficiencies of this particular system are portrayed incorrectly in this article. That is, the front-end of the system (the facial recognition component) has a 92% false positive rate, but together with the post-processing in the back-end, the total system has a false positive rate of 0%. This is similar to saying that the object-detection failure probabilities of an ADAS system need to be viewed in the context of the entire system; it's the performance of the total system that is significant.
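
    A minimal sketch of that composition (alert counts from the article; the officer error rate is an assumption for illustration): the total system's false interventions are the front-end false positives times the probability that the human reviewer also gets it wrong.

      #include <stdio.h>

      /* Two-stage model: the camera flags faces, then an officer reviews
         each flag. Alert counts from the article; officer_err is assumed. */
      int main(void) {
          double alerts      = 2470.0;
          double false_flags = 2297.0;
          double officer_err = 0.0; /* assumed: review catches every bad match */

          printf("front-end false alert share: %.1f%%\n",
                 100.0 * false_flags / alerts);
          printf("expected false interventions after review: %.1f\n",
                 false_flags * officer_err);
          return 0;
      }

    Under that (generous) assumption the total system produces zero false interventions, which is exactly the police's claim; the sketch also shows how completely the claim depends on officer_err being zero.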

    • Yeah, let's say it's run in New York City with 8.5 million people. They put out a facial ID profile of a suspect, and out of 8.5 million people, 13 possible suspects are returned. That's a lot better than stopping and questioning every "black guy around 6 feet tall who is wearing shoes in a 2-mile radius".

    • The article is supposed to cast doubt on that 0% figure by suggesting that the police are lying about their claim of no one falsely arrested. That is why they put "no one" in quotes.
    • by Zocalo ( 252965 )
      Agreed; the part of the summary where it says "no individual has been arrested where a false positive alert has led to an intervention" implies that some of these false positives are resolved by an officer doing further checks, which might just be comparing the CCTV image with a mugshot and deciding it's not a match. Privacy issues of the surveillance state aside, as far as the member of the public that was incorrectly flagged is concerned, I guess that's no harm, no foul because they are none the wiser.
      • > how many of the 173 people that were arrested as a result of the system does the South Wales Police dept. think might have otherwise been overlooked in the crowds? If that's a significant fraction of those 173 arrests

        Based on my experience, I'd estimate that approximately zero would have just been noticed by a cop saying "hey that guy looks like someone who has a warrant". There are a LOT of wanted criminals, far too many to memorize each face. Very few fugitives are caught that way.

        They do tend to get

    • by jrumney ( 197329 )

      the total system has a false-positive rate of 0%

      A quick fact check on that assertion:

      of the 2,470 alerts from the facial recognition system, 2,297 were false positives... the SWP said that it has arrested "over 450" people as a result of its facial recognition efforts ...

      saying that despite the system having a 92-percent false positive rate, "no one" has ever been arrested due to such an error.

      I think someone is not being totally honest with the facts here, and experience tells me it might be the guy in uniform on a power trip.

      • by Muros ( 1167213 )

        92% of positive results being incorrect is not a 92% false positive rate. Without knowing the total number of images checked, the numbers in the article are worthless for making any judgement.
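
        To make the distinction concrete (alert counts from the article; the total scanned is an assumption, roughly the stadium attendance): the quoted 92% is a false discovery rate -- wrong alerts over all alerts -- while a false positive rate divides the wrong alerts by all the innocent faces scanned. A rough C sketch:

          #include <stdio.h>

          int main(void) {
              double alerts    = 2470.0;  /* from the article                    */
              double false_pos = 2297.0;  /* from the article                    */
              double scanned   = 65000.0; /* assumed: roughly stadium attendance */
              double innocents = scanned - (alerts - false_pos);

              printf("false discovery rate: %.1f%%\n",
                     100.0 * false_pos / alerts);
              printf("false positive rate:  %.1f%%\n",
                     100.0 * false_pos / innocents);
              return 0;
          }

        With those inputs the false discovery rate is about 93% while the false positive rate is about 3.5% -- two very different-sounding numbers from the same data, which is the point: without the total scanned, the second number can't be computed at all.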

      • by Wulf2k ( 4703573 )

        To phrase this another way, since it seems your point was missed....

        450+ people were arrested due to facial recognition.

        2470 alerts were generated, and 2297 were false positives. This means 173 alerts were legitimate.

        This means 'at least' 277 arrests were based on false positives.

        • No, people are just mixing up numbers. The detection rate refers to matches made during one specific event (a football game); the arrests refer to a much longer period of time. How many of those arrests originated from data gathered at the actual game is not stated.

    • Much like some types of cancer tests. The false positive rate is very high, but it is the false negative rate that is the concern, i.e., you don't want to give someone with cancer a clean bill of health. They follow up with more accurate (and expensive) tests for the final diagnosis. It makes a lot of sense to give everyone a cheap test even if the FP rate is 90%, as long as the large numbers of negative results are accurate.
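
      A sketch of that screening logic with made-up numbers (assumed: 1% prevalence, 99% sensitivity, 9% of healthy patients incorrectly flagged): the positives are mostly false, yet the negatives are almost all correct, which is what a cheap first-pass test needs.

        #include <stdio.h>

        /* Made-up screening numbers, not from any real test. */
        int main(void) {
            double n = 100000.0, prevalence = 0.01;
            double sensitivity = 0.99, false_pos_rate = 0.09;

            double sick = n * prevalence;
            double tp = sick * sensitivity;          /* sick and flagged    */
            double fp = (n - sick) * false_pos_rate; /* healthy but flagged */
            double fn = sick - tp;                   /* sick but missed     */
            double tn = (n - sick) - fp;             /* healthy and cleared */

            printf("positives that are false: %.0f%%\n",
                   100.0 * fp / (tp + fp));
            printf("negatives that are right: %.2f%%\n",
                   100.0 * tn / (tn + fn));
            return 0;
        }

      Here 90% of positives are false, but 99.99% of negatives are correct -- the clean bills of health can be trusted even though the alarms mostly can't.
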
  • For police work, identifying suspects, false positives only affect the overhead portion - rejecting someone incorrectly identified. If however it had false negatives, then it would be an issue, as it would let people who should be suspects walk away free. For the moment, as long as they aren't looking for too many people, false positives just allow them to remind the LEO-fearing folk that there is law and order in the land.

    What is dangerous is that if the rate does not improve, and you have 10% of their population doin

    • by HiThere ( 15173 )

      I basically agree, but I will note that you are making one false assumption. You're assuming that the false positives are uniformly distributed over the population.

      • by I4ko ( 695382 )

        Indeed, thanks for catching that. I was also assuming that the criminals are distributed equally among all facial phenotypes.

        • by HiThere ( 15173 )

          Well, not precisely. Since we're going into analysis, what's being assumed is that convicted criminals are evenly distributed. There's a lot of evidence that indicates that the laws are unevenly enforced against groups based on phenotype. (Which phenotypes are significant varies.)

          So this could be another self-fulfilling prophecy.

    • Given the number of false convictions, it seems unlikely in the extreme that such false positives have never led to an erroneous conviction. I'm afraid that conviction for drug charges, with mandatory sentencing, has been particularly problematic in the USA for decades. There is also a strong racial trend toward convicting black men, innocent black men, of drug crimes. I'm afraid this will be exacerbated by facial recognition systems that do not differentiate well among black people's faces.

  • but still being used for the same purpose: to justify an illegal fishing expedition.
    • I'm mystified. Which law did they break?
      • He seems to think that investigating is illegal. I'm not sure why.

  • we have built in checks and balances into our methodology

    "We"? Don't checks and balances typically require outside stakeholders to be directly involved?

    • While outside agencies are a plus, they are not strictly necessary as long as you come up with appropriate cross-incentives. Similar to a business requiring multiple approvers in different chains of command to spend certain amounts or change policies.
  • Bad maths or fishing (Score:3, Interesting)

    by Anonymous Coward on Monday May 07, 2018 @06:13PM (#56570470)

    2,470 alerts - 2,297 false positives = 173 true positives.
    >450 people arrested from "facial recognition efforts".

    Either that means there were >277 false arrests due to facial recognition, or they are counting arrests due to "facial recognition efforts" as also including the results of things they found when they searched people based on those false positives.

    Since they claim "no one has ever been arrested due to such an error", this means both that the number of successful arrests has been inflated to make the system look more useful, and that the system's primary function is to justify illegal searches.

  • are they being buried in data and information? And by the time they sort through it all, the intelligence may be meaningless. Having too much chaff mixed in with the grain you're looking for can be a bigger problem overall.

    Just my 2 cents ;)
  • Here is a proposal for a simplified facial recognition algorithm. It features a 0% false negative rate, at the price of an acceptable false positive rate:

    #include <stdbool.h> /* for bool   */
    #include <stddef.h>  /* for size_t */
    bool is_suspect(char *picture, size_t picture_len) { return true; } /* everyone matches */

  • "That's much better than officers!"

  • by GrumpySteen ( 1250194 ) on Monday May 07, 2018 @06:43PM (#56570612)

    Catching criminals is a side effect. The main purpose is to create justification to investigate anyone they want.

    • Current system: supervisor tells subordinates "We got an anonymous tip that GrumpySteen had sex with a goat. Go investigate him."

      New system: supervisor tells subordinates "The computer says GrumpySteen matches the photo of a guy who had sex with a goat. Go investigate him."

      What's the difference, exactly? Have you actually thought this through? Or is it just your knee-jerk reaction to scream "uhrmaghurd conspiracy" every time someone develops some new technology?

      • IMO, the difference has nothing to do with the face recognition system; the difference is all about the citywide police CCTV network, which enables everyone out in public (or at home with insufficiently closed blinds) to be monitored and recorded all the time.
    • Before you can make that claim you need to show where they said they investigated any of the false positives, as opposed to: "Huh? Nah that's not him! Keep moving."

    • Catching criminals is a side effect. The main purpose is to create justification to investigate anyone they want.

      No, the main purpose is to catch criminals. The means to that end is creating justification to investigate anyone they want.

      There's no reason to assume ill intent here. The police really do just want the best possible tools to do their jobs with maximal effectiveness. It's rarely in the interest of society as a whole to give police everything they want, but that doesn't mean the police are wrong to want it.

      Completely hamstringing the police and allowing crime to run unopposed is bad for society. Giving

      • IMO, having CCTV cameras everywhere is a step too far.

        I agree with pretty much the entirety of your very well thought out comment, but I'm curious on what basis you've formed this particular opinion. I myself am rather undecided on how I feel about CCTV surveillance, and to what extent it is or isn't acceptable ... so I would appreciate hearing your thoughts on the matter.

        • IMO, having CCTV cameras everywhere is a step too far.

          I agree with pretty much the entirety of your very well thought out comment, but I'm curious on what basis you've formed this particular opinion. I myself am rather undecided on how I feel about CCTV surveillance, and to what extent it is or isn't acceptable ... so I would appreciate hearing your thoughts on the matter.

          Heh. I wish I had something crisp, clear and well thought out to say on it, but I don't. It just seems like it gives government too much information, too much scope for abuse. Even though there are lots of cameras recording public places in every city, it seems better that the data is fragmented and hard to assemble except upon demonstrable need. Failing that, I'd want to see a very strong anti-abuse infrastructure put in place.

  • Another system in current use for doing similar police work is to make public calls for information that might be helpful to a case or broadcast sketches or grainy videos of suspects and ask for the public to call in. What percentage of those calls are false positives? My bet is it is vastly higher than 92%.

  • I'm more interested in their claim that no one has been arrested due to a false positive. That's nearly impossible to completely avoid even without the use of a facial recognition system. Has the UK found a new system that allows them to only arrest guilty people, without the need for a costly legal system?

    • by Wulf2k ( 4703573 )

      "Arrest" is a specific procedure.

      You can "detain" somebody and go through all their stuff without officially "arrest"ing them.

      And if you find a little bag of sunshine during their detainment and upgrade it to a full arrest, well, that's how justice works these days, isn't it?

  • The way investigative policing works is you have numerous leads, and you follow up on them, and most end up as dead ends, but hopefully some bear fruit.

    A false positive is a lead that didn't work out.

    • by Wulf2k ( 4703573 )

      And it's remarkably easy to turn a false positive into a true positive if they happen to be carrying anything illegal when you randomly search them.

      The system's so amazing that you can find people you didn't even know you were looking for.

  • by jma05 ( 897351 ) on Monday May 07, 2018 @10:13PM (#56571330)

    A percentage, without the context of use, is meaningless.

    They might be using them for screening, to focus human evaluation. If so, that means that it is ultimately the cop that makes the decision, not the system. This is how today's AI is meant to be used - as a cognitive aid.

    It is fairly common for screening tests in medicine to have high false positive rates. That is OK. They are just meant to narrow down the search space for more expensive/invasive confirmatory tests. Given that criminal targets will always be a tiny percentage of the corpus, it is very difficult for any broad screening test to have a high positive predictive value; that is quite normal for such tests.

    The questions that are relevant are:

    1. Are the police able to solve crime better with these aids?
    2. Is the test too expensive for the improvement it delivers?
    3. What are the rates of negative outcomes (like a wrongful arrest), and...
    4. What do we, as a society, consider to be acceptable thresholds?

  • 92% false positives is not a problem IF they require all matches to be checked by a human being before taking any action against the person matched. And don't have the AI matches confirmed by somebody with face blindness (prosopagnosia) like me, either!
  • This is exactly using technology for something it is completely unsuited for.

    Facial recognition is useful as second- or third-factor authentication for a small and clearly defined user base. Like checking the face of a person wanting to pass a security door whilst the same person is in possession of an RFID badge. Not only do you match against a smallish set of people who "shall pass", but against the very small set of people who may pass with that specific RFID badge; exactly one, that is. And in this case, secur

    • This is exactly using technology for something it is completely unsuited for.

      That's hilarious. To me it seems like they're using it for something it's incredibly well suited to.

      It scanned 65,000 people in a public place. Of those it flagged some 2,500 for further assessment. Human officers, who were monitoring the crowd already, then examined the matches and discarded ones which were clearly wrong. The rest were investigated further.

      How exactly is that using technology for something it's completely unsuited for? Do you honestly think it would be better to have hundreds of cops st

      • Well I for one don't think mass facial recognition should be used at all, because anyone who thinks it won't be abused just hasn't been paying attention.
        • Any tool we invent will be abused. You may as well say "I for one don't think fire should be used at all, because anyone who thinks it won't be abused just hasn't been paying attention".

          If you're thinking of it in those terms then your opinion on the subject is irrelevant. The question isn't whether it will be abused; the question is whether the benefits outweigh the drawbacks.

  • by sad_ ( 7868 )

    Just like a monitoring system that beeps every few seconds and alerts on everything (which is in some cases important), it will be ignored by its users.
    In that case, why keep it running? Just turn it off.