Education Software

Proctorio Is Using Racist Algorithms To Detect Faces (vice.com) 366

An anonymous reader quotes a report from Motherboard: Students of color have long complained that the facial detection algorithms Proctorio and other exam surveillance companies use fail to recognize their faces, making it difficult if not impossible to take high-stakes tests. Now, a software researcher, who also happens to be a college student at a school that uses Proctorio, says he can prove the Proctorio software is using a facial detection model that fails to recognize Black faces more than 50 percent of the time. Akash Satheesan, the researcher, recently published his findings in a series of blog posts. In them, he describes how he analyzed the code behind Proctorio's extension for the Chrome web browser and found that the file names associated with the tool's facial detection function were identical to those published by OpenCV, an open-source computer vision software library. Satheesan demonstrated for Motherboard that the facial detection algorithms embedded in Proctorio's tool performed identically to the OpenCV models when tested on the same set of faces. Motherboard also consulted a security researcher who validated Satheesan's findings and was able to recreate his analysis. [...]
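For readers curious what such a comparison looks like in practice, here is a minimal sketch in Python using OpenCV's standard API. It is illustrative only: the extension-file path, image name, and choice of the stock Haar cascade are assumptions, not details from Satheesan's write-up. The idea is simply that a byte-identical model file, or identical detections on the same image, point to an unmodified off-the-shelf model.

    # Minimal sketch: check whether a model bundled in a browser extension is a
    # stock OpenCV model. All paths below are hypothetical placeholders.
    import hashlib

    import cv2

    EXTENSION_MODEL = "extension_dump/models/haarcascade_frontalface_default.xml"
    OPENCV_MODEL = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"

    def sha256(path: str) -> str:
        """Hash a file so byte-identical models are easy to spot."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    print("extension:", sha256(EXTENSION_MODEL))
    print("opencv:   ", sha256(OPENCV_MODEL))

    # Behavioral check: run both models on the same image and compare detections.
    gray = cv2.cvtColor(cv2.imread("test_face.jpg"), cv2.COLOR_BGR2GRAY)
    for name, path in [("extension", EXTENSION_MODEL), ("opencv", OPENCV_MODEL)]:
        faces = cv2.CascadeClassifier(path).detectMultiScale(
            gray, scaleFactor=1.1, minNeighbors=5)
        print(name, "detected:", [tuple(f) for f in faces])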

Satheesan tested the models against images containing nearly 11,000 faces from the FairFace dataset, a library of images curated to contain labeled images representative of multiple ethnicities and races. The models failed to detect faces in images labeled as including Black faces 57 percent of the time. Some of the failures were glaring: the algorithms detected a white face, but not a Black face posed in a near-identical position, in the same image. The pass rates for other groups were better, but still far from state-of-the-art. The models Satheesan tested failed to detect faces in 41 percent of images containing Middle Eastern faces, 40 percent of those containing white faces, 37 percent containing East Asian faces, 35 percent containing Southeast Asian or Indian faces, and 33 percent containing Latinx faces.
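As a rough sketch of how per-group failure rates like these can be tallied, the following Python loops a detector over a labeled dataset and counts, per race label, the images in which no face is found. The CSV column names and file paths are assumptions modeled loosely on FairFace's label files, not Satheesan's actual test harness.

    # Sketch: per-group detection-failure rates over a labeled face dataset.
    # The column names ("file", "race") and paths are assumptions.
    import csv
    from collections import Counter

    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    totals, failures = Counter(), Counter()
    with open("fairface_labels.csv", newline="") as f:
        for row in csv.DictReader(f):
            image = cv2.imread("fairface/" + row["file"])
            gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            totals[row["race"]] += 1
            if len(faces) == 0:  # no face found in an image known to contain one
                failures[row["race"]] += 1

    for race in sorted(totals):
        print(f"{race}: {100 * failures[race] / totals[race]:.0f}% undetected")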

  • by Anonymous Coward on Thursday April 08, 2021 @07:06PM (#61253260)

    *keeps scrolling*

    • by niftydude ( 1745144 ) on Thursday April 08, 2021 @08:31PM (#61253486)
      Yup. Hollywood has known since color film was invented over a century ago that Black actors and white actors need different lighting to be filmed well.

      But now if students sitting at home don't light their face appropriately, the company trying to film them is the racist, or its algorithms are...

      Cool story Beau.
      • by sjames ( 1099 ) on Thursday April 08, 2021 @08:40PM (#61253498) Homepage Journal

        Racist might not be quite the right term, since the word implies at least some malice, which may not be present. It is, however, fair to say that its use disadvantages black people. Continuing to use it now that the failure has been pointed out might qualify as racist.

        Of course, with the failure rates for all ethnicities, it's fair to say it's unfit for purpose.

        • None of the exam proctoring software whipped up for the covid pandemic actually works. The closest is the one that has a human invigilator watching video streams of the students sitting the exam. But that is ridiculously expensive since it needs an invigilator for every five students or something.

          Problem is Universities don't actually have any other options.
          • by sjames ( 1099 )

            Why can't they just have a couple TAs watch videos? That's good enough for in-person.

    • by gl4ss ( 559668 ) on Thursday April 08, 2021 @09:39PM (#61253644) Homepage Journal

      It's just harder to detect. Less contrast in the usual photographed range. It's not intentional and it's not racism.

      Solutions? Shoot the photos with settings that bring the faces closer to mid-range values. But that's also racist...

      Anyway, the whole article is written as an attack on a company, while actually it's just an observation about OpenCV's facial detection functionality and the default models available for it (which kinda suck).
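      For the curious: the kind of exposure/contrast normalization the parent suggests is a standard preprocessing trick. A minimal sketch in Python with OpenCV, using CLAHE (adaptive histogram equalization) to pull an underexposed frame toward mid-range values before detection; the input file name is just a placeholder:

      # Sketch: boost local contrast in a dim frame before face detection.
      import cv2

      gray = cv2.imread("webcam_frame.jpg", cv2.IMREAD_GRAYSCALE)
      clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
      boosted = clahe.apply(gray)  # equalize contrast in underexposed regions

      detector = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
      print("before:", len(detector.detectMultiScale(gray)))
      print("after: ", len(detector.detectMultiScale(boosted)))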

      • by XXongo ( 3986865 )

        It's just harder to detect. Less contrast in the usual photographed range. It's not intentional and it's not racism.

        Just because it's not intentional does not mean it's not racist. This is systematic use of an algorithm which disadvantages people solely on the basis of race. That's racism. By definition.

        A lot of racism can be unintentional. When the programmers said "we will train this algorithm using photos of white people" (and by implication "it's not necessary to train it on black people, because black people don't matter"), that's unintentional racism. When the company said "this software meets minimal standards wh

        • by otuz ( 85014 )

          So, you're also saying that the people who made the webcams in the computers implemented settings that were optimized for white faces, and didn't care-- probably didn't even bother checking-- whether the settings work on black faces.

          No, webcams are just really shitty little cameras and for any faces you need proper lighting, and faces with less contrast need even more. If you read the numbers, it's also racist against white people and in general the algorithm fails on roughly half the faces, racial differences are like rounding errors.

        • Re: (Score:3, Insightful)

          by narcc ( 412956 )

          Most racism is unintentional. I suspect that's why it's been so damn hard to stomp out.

          This idea that all racism must be overt and intentional seems to exist only in this thread.

  • by SirSpanksALot ( 7630868 ) on Thursday April 08, 2021 @07:09PM (#61253266)
    Proctorio's camera requirement is a gross invasion of student privacy, and is completely ineffective at preventing cheating anyway.
    • by AmiMoJo ( 196126 )

      It's time to ditch exams and move to coursework. Real work is like coursework: you don't have to sit at your desk with no books or internet access and try to figure out your job.

  • Darker faces? (Score:4, Informative)

    by Anonymous Coward on Thursday April 08, 2021 @07:13PM (#61253270)
    Darker colors reflect less light, so any detector is going to get less data from a dark face. I guess physics is racist.
    • I'm curious how a redneck with dark suntan would do.

      • a redneck IS the dark suntan ya dork.

        • Actually, the origin of the term has to do with the red bandannas worn around the neck by the Presbyterian Covenanters of the 17th century. Descendants of these became Appalachian "hillbillies".

          So I say let's put some sun-darkened white trash cracker honkies in front of the camera and see if the algorithm ignores them.

    • Darker faces have a much higher contrast against a light background (most walls), though, so it's not obvious that they're a harder classification problem.

      Plus, you have the fact that most webcams were designed with lighter complexions in mind, so again, it's hard to say how much is physics vs. bias.

      I think there are only two things we can be certain of.

      1) Black students encounter greater difficulties using Proctorio than students of other ethnicities.

      2) Slashdot is apparently using flamebait headlines now.

      • by malkavian ( 9512 )

        In outline yes, in actual contrast across features, no. That's a known problem in automatic detection, and not a trivial one to fix; if the data values to look for are far less contrasted, there's a much greater error bar in a match/not-match decision. And features are what you're looking for, not the outline, so yes, it is obvious it's a much harder classification problem, and it's also obvious you don't work in the field.
        Also, webcams aren't designed with ANY complexion in mind. They're simple light sen

        • by Entrope ( 68843 )

          This. However, in defense of the earlier comment, the two things alleged to be certain are not affected by those errors, and AFAIK both are true.

        • Re:Darker faces? (Score:5, Insightful)

          by quantaman ( 517394 ) on Thursday April 08, 2021 @09:40PM (#61253650)

          In outline yes, in actual contrast across features, no. That's a known problem in automatic detection, and not a trivial one to fix; if the data values to look for are far less contrasted, there's a much greater error bar in a match/not-match decision. And features are what you're looking for, not the outline, so yes, it is obvious it's a much harder classification problem, and it's also obvious you don't work in the field.

          I don't work in the field, but I do have some familiarity with image recognition. Researchers developed solutions for the problems they targeted: detection of light-skinned faces. If those techniques don't work as well for a different problem (dark-skinned faces), the automatic assumption shouldn't be that it's an insurmountable problem of physics.

          Also, webcams aren't designed with ANY complexion in mind. They're simple light sensors (that's the fact).

          For someone who just implied they work in image recognition, you might want to have a closer look at how your images are being generated. The problem of cameras/webcams being designed and calibrated for light complexions is well known [hackernoon.com].

          Now, it's perfectly plausible that if things were reversed and dark-skinned Africans were running the world economy and white Europeans were generally poor and uneducated we'd have the same problem. But if that were the case I suspect facial recognition would work much better on black people and people would be talking about the difficulty of detecting washed out white faces.

  • Some of the problem is in the technology, and some of it is in its application. Either way, one can consider a wide field of solutions [youtu.be].
  • by Riceballsan ( 816702 ) on Thursday April 08, 2021 @07:14PM (#61253282)

    I mean, I know the term in this context is a joke. Being unable to deal with dark colors is a typical difficulty, and the only good solutions require different types of cameras to work around it. Why not just say the software sucks at recognizing faces? It's not like the programmers are failing because they don't like black people; it's because it is harder to teach a computer to spot the difference between shadow and skin when the contrast is lower.

    All that being said... Proctorio should acknowledge the problem in the software, given that a 1% chance of failing someone who did everything right is way too high, let alone what would come out to at least 10%. That looks like a job the AI is certainly not ready for.

    • Re: (Score:3, Funny)

      by matthewd ( 59896 )

      The headline clearly says the algorithm is racist, not the programmers who wrote it.
      I suspect that the algorithm was raised in the Jim Crow South and is therefore a product of its environment. Clearly it needs to attend some diversity training sessions and learn to counteract its inherent white privilege.

    • Re: (Score:2, Insightful)

      by StormReaver ( 59959 )

      Why not just say the software sucks at recognizing faces?

      Because that wouldn't allow for the disingenuous virtue signaling that has been running rampant for the last several years.

    • Re: (Score:3, Insightful)

      by jeff4747 ( 256583 )

      Intent isn't required for something to be racist.

      • by tmmagee ( 1475877 ) on Thursday April 08, 2021 @10:50PM (#61253760)

        But the word racist is so loaded at this point, and used across so many contexts, that I would argue it implies intent. A white supremacist is a racist. A bad algorithm that fails for people with dark skin is racist. Do you really want to put those two things in the same basket? This is buggy software that doesn't work when used with people with dark skin, which has led to discriminatory outcomes when used for exam surveillance. I think that is a more accurate and informative assessment, and a lot less loaded, than simply calling it a "racist algorithm."

        On a related note: when I was a software developer, most companies I worked for did not think of people with disabilities when they developed software. This is a little off-topic, but in a thread like this it deserves mentioning, because I think software development more often than not leads to discriminatory outcomes for people with disabilities.

    • It's not like the programmers are failing because they don't like black people; it's because it is harder to teach a computer to spot the difference between shadow and skin when the contrast is lower.

      Aha! You are now racist by association, for defending accused racists with facts and logic! Burn the witch!

    • it's that it's being used by police to arrest people. [youtube.com] Or at least some of these facial recog systems are.
      • Yeah, across the board, algorithms that aren't accurate or ready to be more than a novelty are being put in roles where they can really screw up people's lives.
  • Including made-up nonexistent ethnicities like LatinX and SpaceX and TampaX

    Marketing surveys show that the X makes people want to buy more.

    • X gone give it to ya

      RIP dark man X

    • I've actually seen the phrase "Latinx women" used in a print magazine. I don't know who they're trying to avoid offending, because Spanish already had the word "Latina". I thought trying to learn the language was respectful. Certainly in this case, it's easier than making up some tortured structure that doesn't fit into either language. How do you pronounce it, La-Tinks?

      Another weird one getting thrown around is "Bipoc"... I'd be very interested in someone trying to explain that one, because I'd like to get

      • It's funny in the gallows humor sense.

        One set of people insist absolutely on saying "latina" in the Spanish way with the un-aspirated t in there.

        A second group will flip a shit and scream "cultural appropriation" if a non-hispanic person says it in the Spanish way.

        And yet a third will insist that both white and hispanic people speaking English say "latinX" in this bastardized combination with the accent on the second syllable, in the Spanish style, but with the aspirated t in the English style.

        The war of th

  • An algorithm that performs poorly with dark skin isn't racist, what nonsense. We need to come up with a good slur for submitter BeauHD, how about "woketard"?

    • Re: (Score:3, Insightful)

      by narcc ( 412956 )

      Why isn't it racist? Do you think that racism requires intent? Why?

      Overtly racist people, active members of white supremacist groups, claim that they're not racists. I have to wonder if racism is *ever* intentional.

  • Wha...? (Score:5, Informative)

    by ArmoredDragon ( 3450605 ) on Thursday April 08, 2021 @07:19PM (#61253300)

    The algorithm is racist? Leaving aside that algorithms can't be any more racist than a piece of dog shit, I kind of doubt the people behind the open source project deliberately built it that way. Darker-skinned faces have inherently less contrast to work with, thus facial features are inevitably going to be harder to detect.

    Nobody decided to make it that way; that's just how it is. Even commercially sold facial recognition software has this problem. Only a moron would attribute this to malice.

    • Re: (Score:3, Insightful)

      by gurps_npc ( 621217 )

      Not as malice, but as gross negligence, i.e. the difference between murder and manslaughter.

      It is not the nature of black skin, but the total lack of training on black skin.

      This is caused by morons who train the software on their own faces and the faces of the people employed by their company. The people working at their company are all white for some reason. What could that be, you ask?

      Racism. Racism is why they had not a single freaking black person in their company, so when the racists started their company using racist hiring practices, that racism got incorporated into the algorithm.

      • > Racism. Racism is why they had not a single freaking black person in their company, so when the racists started their company using racist hiring practices, that racism got incorporated into the algorithm.

        TFW you can't distinguish prime-quality satire from woke rantings.

      • Re:Wha...? (Score:4, Insightful)

        by Entrope ( 68843 ) on Thursday April 08, 2021 @08:17PM (#61253458) Homepage

        It takes a bigot to simply assume that (a) they only trained on employees at their company, (b) their employees are "all white", and (c) the reason for that is racism.

        And it takes a really moronic bigot to assert those things when the TFA says -- not even a third of the way down -- that Proctorio apparently uses nothing more than stock OpenCV models for face detection.

        They do deserve to be skewered, but for using totally off-the-shelf models and claiming to have secret sauce rather than for active racism.

      • Comment removed based on user account deletion
      • So... the white people programmed an algorithm that was best at detecting Latinos? Really? Jesus you're fucking stupid.

    • Intent is not required for something to be racist.

      Do you think a racist school segregation policy from the 1950s intended to do anything? It's a piece of paper.

      • by presidenteloco ( 659168 ) on Thursday April 08, 2021 @09:45PM (#61253668)
        The USE of the school segregation policy was racist BECAUSE the policy was INTENDED to keep races separated, because some people (white people with power, primarily) didn't like them mixing, which is a racist attitude.

        "Racism" or "racist" does imply intent. It is a pejorative term and much scorn and punishment is due to those accused of it, so it should not be used lightly or inaccurately.

        "Racially disadvantaging" or "racially inequitable" or similar would be more appropriate terms for non-sentient artifacts/systems used without malintent but having the inequity-causing effect.
        • Your attempt to shrink the definition of racism is an attempt to not deal with the enormous number of problems we have that do not have direct malice behind them.

          Just because the software developers here didn't mean it doesn't mean that this software does not cause a great deal of harm. Kids are failing tests because the software is racist. Tacking on more and more words to try and soften the blow to you does way more harm.

    • If it's built off A.I. and the data set has a bias in it, then the algorithm will reflect that bias. Worse, you get the "divining rod" effect, where cops will use software like this to establish probable cause for arrests/warrants/searches, and since these systems are likely to flag any person with dark skin (they literally all look alike to a bad algorithm), it's ripe for abuse by police departments looking to get another arrest and pad their numbers.

      TL;DR: Garbage in, garbage out.
  • Culture is not a race.

    Ethnicity is not a race.

    Language is not a race.

    Religion is not a race.

    Nothing is a race when it comes to humans. There is just ONE human race.

    When will the USA stop exporting this unscientific ridiculous divisive propaganda on the world?

    • Re: (Score:3, Informative)

      by iggymanz ( 596061 )

      Wrong. There is indeed a definition of "race"; look it up. There are several human races. They each have distinctive characteristics caused by tens of thousands of years of isolation without interbreeding. There are medical conditions unique to each race, and plenty of peer-reviewed biology and medical texts affirm this.

      Quit "virtue signalling" by spewing this wrong bullshit. The real world doesn't function according to your woke bullshit.

      • If you think race comes from genes, you don't understand race.

        • Race is a defined term. You are the one who understands nothing.

          You think you know more than the scientists at the CDC who say race is a risk marker for COVID-19? Or the economists who say that race is a risk marker for poverty? No, you don't. You are just spewing some nonsense that sounds good to you. You are ignorant.

      • about 12,000. That's when we as a species started to see a bit of genetic divergence from our African ancestors. The differences you're speaking of can often be attributed to environmental factors and selection pressures.

        e.g. the ability to digest milk that the Proud Boys are so proud of is likely due to milk (specifically preserved cheese) being the only available food for a few dozen generations and selection pressures kicking in. Do the same thing to China and watch them eat more cheese.

        I guess m
        • Adult milk drinking/lactase persistence is a great example of the gene flow effects [wikipedia.org] that I was talking about a couple of comments up. It evolved at least five times, and each variant diffused into neighbouring populations, gradually getting less common as you get further from the origin. One variant flowed between northern Europe and Central Asia, and gets less common as you go south in Europe. Another variant flowed between the Middle East and eastern Africa. Another variant flowed between eastern Afri

      • by Bert64 ( 520050 )

        So are those racially-unique medical conditions racist? Since they actively disadvantage one race more than another, they would seem to fit the definition being used here.

      They each have distinctive characteristics caused by tens of thousands of years of isolation without interbreeding.

        This is factually wrong. I have no idea how you got upvoted as "Informative". Example one: Africans carry a surprising amount of Neanderthal DNA [sciencemag.org]. Earlier research missed this because it assumed that migration was out-of-Africa only, and applied the assumption of zero Neanderthal DNA to determine European Neanderthal DNA percentages. This led to a mystery: Why did Asians have more Neanderthal DNA than Europeans did? The mystery was resolved when they compared African genomes directly to Neanderthal geno

      • Example 3: A map of lactase persistence [wikipedia.org].

        Example 4: The geography of sickle cell disease [annsaudimed.net]: "The sickle cell gene is now known to be widespread, reaching its highest incidence in equatorial Africa, but occurring also in parts of Sicily and Southern Italy, Northern Greece, Southern Turkey, the Middle East, Saudi Arabia, especially the Eastern Province, and much of Central India."

        Example 5: Distribution of blood types [palomar.edu].

        Notice how genetic variants find their way around. Notice how their prevalence gradually

  • by Anonymous Coward on Thursday April 08, 2021 @07:22PM (#61253316)

    I used to work at a place that made, among other things, porn filters. The first rev looked for "flesh tones" and yep, Black and Asian porn got through it.

    I forget if they ever got to the point of being sophisticated enough to weed out gray scale porn, but they fixed the races. There was nothing special AFAIK. I think it was just a huge Bayesian, and yes I did get to look at some porn at work. No, it wasn't fun. After Slutty Jane pops up in the test results a few hundred times, it's just business.

  • No matter your race, if you have albinism then it's going to detect you. The problem with the algorithm is that it was not tested on a wide enough selection of pigment colors, which are not race specific. Claiming it's racist is disingenuous because the software is literally incapable of detecting race. It would be more accurate to say the software is flawed in that it has difficulty detecting faces with dark features.

    This isn't the first time this has happened in face recognition software and it's not go

    • by malkavian ( 9512 )

      Wouldn't even say the software is flawed. It works as well as it can. It's just a much harder problem to detect features at lower contrast (errors creep up massively).
      I suspect the answer would lie in matching against a specific spectrum other than visible light (e.g. infrared or some other band), but that'd need far more expensive hardware.

  • If the schools are using inadequate software, call it what it is: broken software.

    Stop using visual spectrum cameras, go to infrared which doesn't need high contrast surroundings (or something something).

    Darker objects are harder to detect unless every camera is in a very well lit environment. If I get sunburn and someone else doesn't get sunburn in the same conditions, is the Sun racist?
    • is the Sun racist

      Perhaps. Western conceptions of astronomy are suspect. "Colonial science" they call it. https://m.thewire.in/article/t... [thewire.in]

      Arithmetic is probably racist. https://www.theatlantic.com/ed... [theatlantic.com]

      Newton's laws are racist. https://www.city-journal.org/t... [city-journal.org]

      I suppose it's hard to argue that gravity is racist. I mean, it Literally Pulls The Black Man Down.

    • Stop using visual spectrum cameras, go to infrared which doesn't need high contrast surroundings (or something something).

      FaceID seems to work quite well regardless of skin tone, so that might work. But there are few home computer environments equipped with the proper hardware for this, so the cost would be pretty high. Perhaps the schools could provide the equipment, but there are a lot of logistics issues to overcome before it is widely available.

  • by bjdevil66 ( 583941 ) on Thursday April 08, 2021 @07:28PM (#61253334)

    Racism requires the belief that one race is better than others. Can an algorithm believe something?

    Forgetting that, just compare the recognition failure rates:

    Black faces: 57%
    Middle Eastern: 41%
    White: 40%
    East Asian (oriental): 37%
    Southeast Asian or Indian: 35%
    Latinx: 33%

    Either the programmer's name was some racist named Juan Rodriguez, or the algorithm just kinda sucks with really dark or really light skin and needs more work.

    • Racism requires the belief that one race is better than others.

      Which definition are you using? Please post a link.

    • No, racism simply requires that you treat different races differently.

      Definition 2a: https://www.merriam-webster.co... [merriam-webster.com]

      the systemic oppression of a racial group to the social, economic, and political advantage of another

      Nothing about intent, only effect.

      So does this cheating detector systematically oppress some racial groups? Absolutely - it makes it much harder for some non-white races to pass surveillance to take their exam.

      Does doing so benefit another racial group? If the class grades on a curve, definitely (and they almost all do) - all the people who had no problems with facial recognition have th

      • ...from that exact same page [merriam-webster.com]:

        "Definition of racism
        1: a belief that race is a fundamental determinant of human traits and capacities and that racial differences produce an inherent superiority of a particular race."

        That omission of #1 wasn't on purpose, was it?

        • Someone needs a lesson in deductive reasoning.

          You stated that racism "requires" belief, then posted a link to a page with a definition that contradicts you.

          Quick lesson: just because A implies X, that doesn't mean (A or B) implies X. Those are different things.

      • There's nothing "systemic" about classifier failures other than the fact that our classifiers are imperfect. But even if you want to label failures of classifiers as systemic, that doesn't mean they only fail to identify certain people. That would be like labeling an extinction-level asteroid impact as "racist" because it wipes out all black people in the world. (Yeah, with everything else in it! Duh.)
      • Allowing for the sake of argument the inclusion of the definition of "systemic racism" under the entry for "racism" (the disingenuous sockpuppetry of asking for a change in the dictionary and then using that definition to back your argument as if it were independent confirmation notwithstanding), an inanimate object cannot oppress you. The man wielding it certainly can, but the object itself has no moral worth.

        A gun can threaten or defend.

        A knife can cut off a foot for trying to escape or cut up a roast ser

      • Yes yes, we know the woke crazies got the people at MW to change the definition to cater to their personal brand of racism.

      • by Bert64 ( 520050 )

        it makes it much harder for some non-white races to pass surveillance to take their exam.

        Why are you calling out "non-white"? Based on the results in the article, white faces are middle of the road. It actually makes it harder for whites to pass the surveillance than Asian or Latino faces.
        It actually makes it EASIER for some non-white races to pass surveillance to take their exam.

        Why is this being framed as white vs black racism, why isn't it being framed as latin vs black racism?

        Black faces: 57%
        Middle Eastern: 41%
        White: 40%
        East Asian (oriental): 37%
        Southeast Asian or Indian: 35%
        Latinx: 33%

    • by malkavian ( 9512 )

      XKCD had a lot to say about the redefinition of words. https://xkcd.com/1860/ [xkcd.com]
      I always think of that when someone calls someone else racist because they don't like tapioca or something equally inane (which is what the majority of uses amount to these days).

    • Racism requires the belief that one race is better than others. Can an algorithm believe something?

      Forgetting that, just compare the recognition failure rates:

      Black faces: 57%
      Middle Eastern: 41%
      White: 40%
      East Asian (oriental): 37%
      Southeast Asian or Indian: 35%
      Latinx: 33%

      Either the programmer's name was some racist named Juan Rodriguez, or the algorithm just kinda sucks with really dark or really light skin and needs more work.

      Yep. Exactly. It's no more "racist" than the average IQ differences documented in The Bell Curve, which, believe it or not, was not authored by "racist" Mandarin Chinese or Ashkenazi Jews.

      Facts are not "racist", because they are not beliefs.

    • It really doesn't (Score:5, Informative)

      by rsilvergun ( 571051 ) on Thursday April 08, 2021 @09:32PM (#61253622)
      not when it's systemic. In this case it's literally systemic. The system has a higher failure rate. Most of these systems err on the side of giving a match because that's what the users want, meaning they're more likely to give a false positive than a false negative. This in turn establishes probable cause, allowing cops to conduct searches/arrests/etc.

      This is basically high tech Broken Windows policing. Do some googling and you'll find the history of that is completely tied up with harassing minority neighborhoods to keep them in their place. It's the same deal.
  • Good (Score:4, Interesting)

    by reanjr ( 588767 ) on Thursday April 08, 2021 @08:29PM (#61253484) Homepage

    Good. Now we can sue schools to stop them from using this shitty software.

  • by joe_frisch ( 1366229 ) on Thursday April 08, 2021 @08:57PM (#61253526)
    Code that fails to reliably recognize dark-skinned faces is not inherently racist. Failing to test the code before release against a wide variety of people could easily be racist. How did a 50% failure rate pass any sort of quality control, unless the company chose not to include any dark-skinned people in their testing? If it were a 1% failure rate rather than 0.1%, I could see that slipping by, but 50%? You don't need a lot of test samples to catch a 50% error rate.
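    The parent's sample-size point is easy to make concrete: a model that fails on half of all images has only a 0.5^n chance of showing zero failures across n independent test images. A couple of illustrative lines of Python:

    # How likely is QA to *miss* a 50% per-image failure rate entirely?
    for n in (5, 10, 20):
        print(f"{n} test images: P(zero failures) = {0.5 ** n:.6f}")
    # 5 images: ~0.03. 20 images: ~0.000001. A failure rate this large is
    # essentially impossible to miss with even a token, diverse test set.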
  • by lordmule ( 857385 ) on Thursday April 08, 2021 @09:41PM (#61253656)
    A commercial software company is billing many academic institutions using open source software and may not be properly acknowledging or contributing royalties to the project and its authors.

    The original author had made a controversial blog post, but their sentiment was aimed at the obfuscation, lies, and deceit about the technologies being proprietary and being approved for use by those institutions. Their post summarises with a call for audits, audits...

    The Motherboard website is spinning the original post, but they did some useful work:
    "On its website, Proctorio claims that it uses “proprietary facial detection” technology. It also says that it uses OpenCV products, although not which products or what for. When Motherboard asked the company whether it uses OpenCV’s models for facial recognition, Meredith Shadle, a spokesperson for Proctorio, did not answer directly. Instead, she sent a link to Proctorio’s licenses page, which includes a license for OpenCV."
  • by Bert64 ( 520050 ) <.moc.eeznerif.todhsals. .ta. .treb.> on Thursday April 08, 2021 @10:15PM (#61253704) Homepage

    Why the unnecessary hyperbole of calling the algorithm racist?
    Racism would imply that it actually *does* detect a face of a specific race and then actively chooses to ignore it.
    If it simply fails to detect a face, then how is it being racist? In order to be racist you have to actually identify that a subject is present, then identify the race of said subject, then make an active decision to treat that subject differently depending on the identified race.
    This algorithm has not even identified the presence of a subject!

    It's not racist; it's a flawed algorithm, or one supplied with flawed training data. There is a very important distinction to make.

    "The models Satheesan tested failed to detect faces in 41 percent of images containing Middle Eastern faces, 40 percent of those containing white faces, 37 percent containing East Asian faces, 35 percent containing Southeast Asian or Indian faces, and 33 percent containing Latinx faces."

    Based on this information, if you're accusing the algorithm of racism then you're accusing it of being a latinx supremacist algorithm, since it achieves inferior results for everyone else.

  • Amazing, if they'd had that when I was going to school I totally would have enrolled with colorology as my major and also been a student of color. To my facile understanding, sure, it would be easy - like we'd talk about what happens when you mix red and blue, and like what colors are found in nature and which are never found in nature. But I bet there's some really fancy physics involved too, and probably some cool chemistry experiments involved in making different colors and shit.
  • by MSTCrow5429 ( 642744 ) on Thursday April 08, 2021 @11:18PM (#61253800)

    And it really, really, really didn't like black people.

  • Racist (Score:3, Insightful)

    by peppepz ( 1311345 ) on Friday April 09, 2021 @03:35AM (#61254190)
    Woke professionals are "extending" the term "racist" so much that they'll end up depriving it of any meaning. And this will work to the detriment of real victims of racism.
    They should try running the same algorithm against a set of Anglo-Saxon people wearing glasses. They'll find out that the algorithm is "racist" toward them too, thereby showing that the people who developed the algorithm, presumably nerds, are a self-hating group.
    • Eliminating distinctions is the goal of followers of the Frankfurt School of philosophy, renamed "Critical Theory", which claims that all distinctions are imposed by those in power to denigrate and control the masses and that no amount of evidence can prove a theory, since all evidence is collected based on the oppressive political goals of the researcher.

      It makes it difficult when science seeks to measure or predict based on distinctions, including race, age, gender, health, size, body shape, or attractive

      • You left out the stark absence of self-reflection! That is, how they are critical of everyone but themselves - which makes sense given that the entire project is about accumulating and concentrating their own power, in order to denigrate and control everyone else. You really can't be honest or un-hypocritical if you're promoting Marxism.
  • by Dirk Becher ( 1061828 ) on Friday April 09, 2021 @06:27AM (#61254450)

    Thank you, thank you, I'll be here all evening!

  • by Vapula ( 14703 ) on Friday April 09, 2021 @07:45AM (#61254606)

    Basically, if algorithms have some trouble identifying black people, it's because their pictures lack detail...

    The picture is based on the light reflected by the photographed subject. The darker the subject, the less light is reflected.

    So basically, we have

    Face recognition is racist because it fails to identify black faces.
    Algorithms fail to identify black faces because of poor-quality pictures (lack of detail).
    The picture is of poor quality because there is not enough light reflected from the face.

    Hence Light is racist...

  • by groobly ( 6155920 ) on Friday April 09, 2021 @11:06AM (#61255506)

    It's not just the algorithms. It's the very hardware. The hardware is the happy enabler of the racist algorithm. If it weren't for the hardware, the algorithms couldn't practice their racism. The hardware is just as culpable, for it is not an anti-racism ally.

  • by Oligonicella ( 659917 ) on Friday April 09, 2021 @11:20AM (#61255562)
    BeauHD's brain immediately projects an obsession with racism onto faulty algorithms.
