Proctorio Is Using Racist Algorithms To Detect Faces (vice.com)
An anonymous reader quotes a report from Motherboard: Students of color have long complained that the facial detection algorithms Proctorio and other exam surveillance companies use fail to recognize their faces, making it difficult if not impossible to take high-stakes tests. Now, a software researcher, who also happens to be a college student at a school that uses Proctorio, says he can prove the Proctorio software is using a facial detection model that fails to recognize Black faces more than 50 percent of the time. Akash Satheesan, the researcher, recently published his findings in a series of blog posts. In them, he describes how he analyzed the code behind Proctorio's extension for the Chrome web browser and found that the file names associated with the tool's facial detection function were identical to those published by OpenCV, an open-source computer vision software library. Satheesan demonstrated for Motherboard that the facial detection algorithms embedded in Proctorio's tool performed identically to the OpenCV models when tested on the same set of faces. Motherboard also consulted a security researcher who validated Satheesan's findings and was able to recreate his analysis. [...]
Satheesan tested the models against images containing nearly 11,000 faces from the FairFaces dataset, a library of images curated to contain labeled images representative of multiple ethnicities and races. The models failed to detect faces in images labeled as including Black faces 57 percent of the time. Some of the failures were glaring: the algorithms detected a white face, but not a Black face posed in a near-identical position, in the same image. The pass rates for other groups were better, but still far from state-of-the-art. The models Satheesan tested failed to detect faces in 41 percent of images containing Middle Eastern faces, 40 percent of those containing white faces, 37 percent containing East Asian faces, 35 percent containing Southeast Asian or Indian faces, and 33 percent containing Latinx faces.
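For readers curious what a per-group evaluation like the one described above actually computes, here is a minimal sketch of the arithmetic. The tallies below are illustrative placeholders chosen to reproduce the quoted percentages, not Satheesan's actual FairFace counts:

```python
# Sketch of a per-group detection-failure evaluation. The counts are
# hypothetical; only the arithmetic mirrors the reported method.

def failure_rates(results):
    """results maps group -> (images_tested, images_with_no_face_found);
    returns the fraction of images where detection failed per group."""
    return {group: failures / total
            for group, (total, failures) in results.items()}

# Hypothetical tallies matching the percentages quoted in the summary.
tallies = {
    "Black": (1000, 570),
    "Middle Eastern": (1000, 410),
    "White": (1000, 400),
    "East Asian": (1000, 370),
    "Southeast Asian/Indian": (1000, 350),
    "Latinx": (1000, 330),
}

rates = failure_rates(tallies)
print(rates["Black"])  # 0.57
```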
"Racist Algorithms" (Score:3, Funny)
*keeps scrolling*
Re:"Racist Algorithms" (Score:4, Informative)
But now if students sitting at home don't light their face appropriately, the company trying to film them is the racist, or its algorithms are...
Cool story Beau.
Re:"Racist Algorithms" (Score:5, Insightful)
Racist might not be quite the right term, since the word implies at least some malice which may not be present. It is, however, fair to say that its use disadvantages black people. Continuing to use it now that the failure has been pointed out might qualify as racist.
Of course, with the failure rates for all ethnicities, it's fair to say it's unfit for purpose.
Re: (Score:2)
Problem is Universities don't actually have any other options.
Re: (Score:3)
Why can't they just have a couple TAs watch videos? That's good enough for in-person.
Re: "Racist Algorithms" (Score:5, Insightful)
It's just harder to detect: less contrast in the usual photographed range. It's not intentional and it's not racism.
Solutions? Shoot the photos with settings that bring the faces nearer to center values. But that's also racist...
Anyway, the whole article is written as an attack on a company, while actually it's just an observation of OpenCV's facial detection functionality and the default models available for it (which kinda suck)
Re: (Score:2)
It's just harder to detect: less contrast in the usual photographed range. It's not intentional and it's not racism.
Just because it's not intentional does not mean it's not racist. This is systematic use of an algorithm which disadvantages people solely on the basis of race. That's racism. By definition.
A lot of racism can be unintentional. When the programmers said "we will train this algorithm using photos of white people (and by implication "it's not necessary to train it on black people, because black people don't matter") that's unintentional racism. When the company said "this software meets minimal standards wh
Re: (Score:3)
So, you're also saying that the people who made the webcams in the computers implemented settings that were optimized for white faces, and didn't care-- probably didn't even bother checking-- whether the settings work on black faces.
No, webcams are just really shitty little cameras and for any faces you need proper lighting, and faces with less contrast need even more. If you read the numbers, it's also racist against white people and in general the algorithm fails on roughly half the faces, racial differences are like rounding errors.
Re: (Score:3, Insightful)
Most racism is unintentional. I suspect that's why it's been so damn hard to stomp out.
This idea that all racism must be overt and intentional seems to exist only in this thread.
Re: (Score:2, Insightful)
I am sure that this was not the intent of the programmers who trained the pattern recognition. Nevertheless, it was racist by the definition of racism. Not bothering to train the system on black faces, and not bothering to care when the system doesn't work when the faces detected aren't white, is unintentional racism... but it's still racism.
Re:"Racist Algorithms" (Score:5, Informative)
> not bothering to care when the system doesn't work when the faces detected aren't white
It didn't perform much better on white faces either. Latinos were recognized the best, but in general the software just sucked and was based on some open source suite the developers didn't even write themselves, since it performed identically. If you just embed some random open source library into your app, are you racist?
Re:"Racist Algorithms" (Score:5, Insightful)
The limits of "black box" algorithms (i.e. not a closed form model that you could solve) such as this can only be probed empirically. That is exactly what the student has done, and by my lights it performs quite badly in all cases, but particularly badly for some racial groups. I applaud the student, and suggest that the headline is technically accurate: the outcome is that some racial groups are having a significantly harder (range: 57% - 33%) time than others with this shoddy product. That's the very definition of racist.
Re: (Score:3, Informative)
The models Satheesan tested failed to detect faces in 41 percent of images containing Middle Eastern faces, 40 percent of those containing white faces, 37 percent containing East Asian faces, 35 percent containing Southeast Asian or Indian faces, and 33 percent containing Latinx faces.
Actually the algorithm had a harder time detecting white faces than asian or latin faces. White faces were only slightly better detected than middle eastern faces.
when the system doesn't work when the faces detected aren't white
The whole point was that it did not detect a face at all, not that it detected a face and decided to ignore it because it wasn't white.
Since the algorithm failed to detect faces, it was not aware the race of the subject since it was not aware that a subject was present. For it to be racist, it would have to first detect the subject, second identif
Re:"Racist Algorithms" (Score:5, Interesting)
Actually the algorithm had a harder time detecting white faces than asian or latin faces. White faces were only slightly better detected than middle eastern faces.
Right, because actually white faces have less contrast. The issue with dark skin is not the lack of contrast, it's that the algorithm ignores dark parts of the image entirely. In fact sometimes the image is pre-processed to remove the darkest parts before the face detection algorithm even gets started.
That is why phones and computers (e.g. Windows Hello) use IR illumination for face recognition. Works with all skin tones in all lighting conditions. Proctorio couldn't be bothered to supply a suitable camera.
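The claim above, that some pipelines discard the darkest pixels before detection even begins, can be illustrated with a toy pre-pass. The cutoff value and the function itself are assumptions for illustration, not Proctorio's actual pipeline:

```python
# Illustrative-only sketch of why discarding dark pixels before
# detection hurts low-luminance faces: any feature whose contrast
# lives entirely below the cutoff vanishes.

def preprocess(pixels, cutoff=40):
    """Toy pre-pass that zeroes out grayscale pixels (0-255) darker
    than `cutoff`, as some pipelines are said to do."""
    return [0 if p < cutoff else p for p in pixels]

# A dark face patch: features exist (the values vary), but every
# value sits below the cutoff, so all detail is erased.
dark_patch = [10, 25, 35, 15, 30]
print(preprocess(dark_patch))    # [0, 0, 0, 0, 0]

# A bright patch passes through unchanged, structure intact.
bright_patch = [120, 180, 90, 200]
print(preprocess(bright_patch))  # [120, 180, 90, 200]
```

An IR camera sidesteps the problem entirely because reflected IR contrast does not track visible skin tone, which is the point the comment above makes about Windows Hello and FaceID-style hardware.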
Re: (Score:3)
It's doesn't matter if it's racist (Score:5, Insightful)
Re: (Score:3)
It's time to ditch exams and move to coursework. Real work is like coursework, you don't have to sit at your desk with no books or internet access and try to figure out your job.
Darker faces? (Score:4, Informative)
Re: (Score:2)
I'm curious how a redneck with dark suntan would do.
Re: (Score:3)
a redneck IS the dark suntan ya dork.
Re: (Score:2)
Actually, the origin of the term has to do with red bandanna around the neck, of Presbyterian Covenanters of the 17th century. Descendants of these became Appalachian "hillbillies".
So I say let's put some sun-darkened white trash cracker honkies in front of the camera and see if the algorithm ignores them.
Re: (Score:2)
Though darker faces have a much higher contrast against a light background (most walls) so it's not obvious that they're a harder classification problem.
Plus, you have the fact that most webcams were designed with lighter complexions in mind, so again, it's hard to say how much is physics vs. bias.
I think there are only two things we can be certain of.
1) Black students encounter greater difficulties using Proctorio than students of other ethnicities.
2) Slashdot is apparently using flamebait headlines now.
Re: (Score:2)
In outline yes, in actual contrast across features, no. That's a known problem in automatic detection, and not a trivial one to fix; if the data values to look for are far less contrasted, there's a much greater error bar in a match/not match decision. And features are what you're looking for, not the outline, so yes, it is obvious it's a much harder classification problem, and it's also obvious you don't work in the field.
Also, webcams aren't designed with ANY complexion in mind. They're simple light sen
Re: (Score:2)
This. However, in defense of the earlier comment, the two things alleged to be certain are not affected by those errors, and AFAIK both are true.
Re:Darker faces? (Score:5, Insightful)
In outline yes, in actual contrast across features, no. That's a known problem in automatic detection, and not a trivial one to fix; if the data values to look for are far less contrasted, there's a much greater error bar in a match/not match decision. And features are what you're looking for, not the outline, so yes, it is obvious it's a much harder classification problem, and it's also obvious you don't work in the field.
I don't work in the field but I do have some familiarity with image recognition. Researchers developed solutions for the problems they targeted, detection of light skinned faces. If those techniques don't work as well for a different problem (dark skinned faces) the automatic assumption isn't that it's an insurmountable problem of physics.
Also, webcams aren't designed with ANY complexion in mind. They're simple light sensors (that's the fact).
For someone who just implied they work in image recognition, you might want to have a closer look at how your images are being generated. The problem of cameras/webcams being designed and calibrated for light complexions is well known [hackernoon.com].
Now, it's perfectly plausible that if things were reversed and dark-skinned Africans were running the world economy and white Europeans were generally poor and uneducated we'd have the same problem. But if that were the case I suspect facial recognition would work much better on black people and people would be talking about the difficulty of detecting washed out white faces.
Column A, column B (Score:2)
Horrible, but not what I'd call racist (Score:5, Insightful)
I mean, I know the term in this context is a joke. Being unable to deal with dark colors is a typical difficulty, one whose only good solutions require different types of cameras to work around. Why not just say the software sucks at recognizing faces? It's not like the programmers are failing because they don't like black people; it's because it is harder to teach a computer to spot the difference between shadow and skin when the contrast is lower.
All that being said... Proctorio should acknowledge the problem with the software, given that a 1% chance of failing someone who did everything right is way too high, let alone what comes out to at least 10%. This looks like a job the AI is certainly not ready for.
Re: (Score:3, Funny)
The headlines clearly says the algorithm is racist, not the programmers who wrote it.
I suspect that the algorithm was raised in the Jim Crow South and is therefore a product of its environment. Clearly it needs to attend some diversity training sessions and learn to counteract its inherent white privilege.
Re: Horrible, but not what I'd call racist (Score:2)
Look up the word racism, and don't stop till you get to the end.
Re: (Score:2, Insightful)
Why not just say, the software sucks at recognizing faces[?]
Because that wouldn't allow for the disingenuous virtue signaling that has been running rampant for the last several years.
Re: (Score:3, Insightful)
Intent isn't required for something to be racist.
Re:Horrible, but not what I'd call racist (Score:4)
But the word racist is so loaded at this point and used across so many contexts that I would argue it implies intent. A white supremacist is a racist. A bad algorithm that fails for people with dark skin is racist. Do you really want to put those two things in the same basket? This is buggy software that doesn't work when used with people with dark skin, which has led to discriminatory outcome when used for exam surveillance. I think that is a more accurate and informative assessment, and is lot less loaded, than simply calling it a "racist algorithm."
On a related note: when I was a software developer, most companies I worked for did not think of people with disabilities when they developed software. This is a little off-topic, but in a thread like this it deserves mentioning, because I think software development more often than not leads to discriminatory outcomes for people with disabilities.
Re: (Score:2)
It's not like the programmers are failing because they don't like black people, it's because it is harder to teach a computer to spot the difference between shadow and skin when the contrast is lesser.
Aha! You are now racist by association, for defending accused racists with facts and logic! Burn the witch!
The problem isn't the software (Score:2)
Re: (Score:2)
So it's racist against everyone (Score:2, Funny)
Including made-up nonexistent ethnicities like LatinX and SpaceX and TampaX
Marketing surveys show that the X makes people want to buy more.
Re: So it's racist against everyone (Score:2)
X gone give it to ya
RIP dark man X
Re: (Score:2)
I've actually seen the phrase "Latinx women" used in a print magazine. I don't know who they're trying to avoid offending, because Spanish already had the word "Latina". I thought trying to learn the language was respectful. Certainly in this case, it's easier than making up some tortured structure that doesn't fit into either language. How do you pronounce it, La-Tinks?
Another weird one getting thrown around is "Bipoc"... I'd be very interested in someone trying to explain that one, because I'd like to get
Re: So it's racist against everyone (Score:2)
It's funny in the gallows humor sense.
One set of people insist absolutely on saying "latina" in the Spanish way with the un-aspirated t in there.
A second group will flip a shit and scream "cultural appropriation" if a non-hispanic person says it in the Spanish way.
And yet a third will insist that both white and hispanic people speaking English say "latinX" in this bastardized combination with the accent on the second syllable, in the Spanish style, but with the aspirated t in the English style.
The war of th
"racist algorithms" oh my sides (Score:2, Insightful)
An algorithm that performs poorly with dark skin isn't racist, what nonsense. We need to come up with a good slur for submitter BeauHD, how about "woketard"?
Re: (Score:3, Insightful)
Why isn't it racist? Do you think that racism requires intent? Why?
Overtly racist people, active members of white supremacist groups, claim that they're not racists. I have to wonder if racism is *ever* intentional.
Wha...? (Score:5, Informative)
The algorithm is racist? Leaving aside that algorithms can't be any more racist than a piece of dog shit, I kind of doubt the people behind the open source project deliberately built it that way. Darker skin faces have inherently less contrast to work with, thus facial features are inevitably going to be harder to detect.
Nobody decided to make it that way, that's just how it is. Even commercially sold facial recognition software has this problem. Only a moron would attribute this to malice.
Re: (Score:3, Insightful)
Not as malice but as Gross Negligence. I.E. the difference between Murder and Manslaughter.
It is not the nature of black skin, but the total lack of training on black skin.
This is caused by morons that train the software on their own face and the face of the people employed by their company. The people working at their company are all white for some reason. What could that be, you ask?
Racism. Racism is why they had not a single freaking black person in their company, so when the racists started their
Re: (Score:2)
> Racism. Racism is why they had not a single freaking black person in their company, so when the racists started their company using racist hiring practices, that racism got incorporated into the algorithm.
TFW you can't distinguish prime-quality satire from woke rantings.
Re:Wha...? (Score:4, Insightful)
It takes a bigot to simply assume they (a) only trained on employees at their company, (b) their employees are "all white", and (c) the reason for that is racism.
And it takes a really moronic bigot to assert those things when the TFA says -- not even a third of the way down -- that Proctorio apparently uses nothing more than stock OpenCV models for face detection.
They do deserve to be skewered, but for using totally off-the-shelf models and claiming to have secret sauce rather than for active racism.
Re: (Score:2)
Re: Wha...? (Score:2)
So... the white people programmed an algorithm that was best at detecting Latinos? Really? Jesus you're fucking stupid.
Re: (Score:2)
Intent is not required for something to be racist.
Do you think a racist school segregation policy from the 1950s intended to do anything? It's a piece of paper.
The school segregation policy was intended (Score:5, Insightful)
"Racism" or "racist" does imply intent. It is a pejorative term and much scorn and punishment is due to those accused of it, so it should not be used lightly or inaccurately.
"Racially disadvantaging" or "racially inequitable" or similar would be more appropriate terms for non-sentient artifacts/systems used without malintent but having the inequity-causing effect.
Re: (Score:3)
Your attempt to shrink the definition of racism is an attempt to not deal with the enormous number of problems we have that do not have direct malice behind them.
Just because the software developers here didn't mean it doesn't mean that this software does not cause a great deal of harm. Kids are failing tests because the software is racist. Tacking on more and more words to try and soften the blow to you does way more harm.
Sure it can (Score:3)
TL;DR: Garbage in, garbage out.
Skin colour is not a 'Race" (Score:2, Informative)
Ethnicity is not a race.
Language is not a race.
Religion is not a race.
Nothing is a race when it comes to humans. There is just ONE human race.
When will the USA stop exporting this unscientific ridiculous divisive propaganda on the world?
Re: (Score:3, Informative)
Wrong. There is indeed a definition of "race", look it up. There are several human races. They each have distinctive characteristics caused by tens of thousands of years of isolation without interbreeding. There are medical conditions unique to each race, and plenty of peer-reviewed biology and medical texts affirm this.
Quit "virtue signalling" by spewing this wrong bullshit. The real world doesn't function according to your woke bullshit.
Re: Skin colour is not a 'Race" (Score:2)
If you think race comes from genes, you don't understand race.
Re: (Score:2)
Race is a defined term. You are the one who understands nothing
You think you know more than the scientists at the CDC who say race is a risk marker for covid19? Or the economists who say that race is a risk marker for poverty? No, you don't. You are just spewing some nonsense that sounds good to you. You are ignorant.
Not tens of thousands of years (Score:2)
e.g. the ability to digest milk that the Proud Boys are so proud of is likely due to milk (specifically preserved cheese) being the only available food for a few dozen generations and selection pressures kicking in. Do the same thing to China and watch them eat more cheese.
I guess m
Re: (Score:2)
Adult milk drinking/lactase persistence is a great example of the gene flow effects [wikipedia.org] that I was talking about a couple of comments up. It evolved at least five times, and each variant diffused into neighbouring populations, gradually getting less common as you get further from the origin. One variant flowed between northern Europe and Central Asia, and gets less common as you go south in Europe. Another variant flowed between the Middle East and eastern Africa. Another variant flowed between eastern Afri
Re: (Score:2)
So are those racially-unique medical conditions racist? Since they actively disadvantage one race more than another, they would seem to fit the definition being used here.
Re: (Score:3)
They each have distinctive characteristics cause by tens of thousands of years of isolation without interbreeding.
This is factually wrong. I have no idea how you got upvoted as "Informative". Example one: Africans carry surprising amount of Neanderthal DNA [sciencemag.org]. Earlier research missed this because they assumed that migration was out-of-Africa only, and applied the assumption of zero Neanderthal DNA to determine European Neanderthal DNA percentages. This led to a mystery: Why did Asians have more Neanderthal DNA than Europeans did? The mystery was resolved when they compared African genomes directly to Neanderthal geno
Factually wrong, part 2 (Score:3)
Example 3: A map of lactase persistence [wikipedia.org].
Example 4: The geography of sickle cell disease [annsaudimed.net]: "The sickle cell gene is now known to be widespread, reaching its highest incidence in equatorial Africa, but occurring also in parts of Sicily and Southern Italy, Northern Greece, Southern Turkey, the Middle East, Saudi Arabia, especially the Eastern Province, and much of Central India."
Example 5: Distribution of blood types [palomar.edu].
Notice how genetic variants find their way around. Notice how their prevalence gradually
Been there, done that (Score:3, Interesting)
I used to work at a place that made, among other things, porn filters. The first rev looked for "flesh tones" and yep, Black and Asian porn got through it.
I forget if they ever got to the point of being sophisticated enough to weed out gray scale porn, but they fixed the races. There was nothing special AFAIK. I think it was just a huge Bayesian, and yes I did get to look at some porn at work. No, it wasn't fun. After Slutty Jane pops up in the test results a few hundred times, it's just business.
Re: Been there, done that (Score:2)
How did it do with ascii pr0n?
Re: (Score:2)
Porn has a lot more features to match than just face. Flesh tone was a bad match for porn anyway, which is why most filters don't actually look for that.
Not racist. (Score:2)
No matter your race, if you have albinism then it's going to detect you. The problem with the algorithm is that it was not tested on a wide enough selection of pigment color, which is not race specific. Claiming it's racist is disingenuous because the software is literally incapable of detecting race. It would be best described to say the software is flawed in that it has difficulty detecting faces with dark features.
This isn't the first time this has happened in face recognition software and it's not go
Re: (Score:2)
Wouldn't even say the software is flawed. It works as well as it can. It's just a much harder problem to detect lower-contrast features (errors creep up massively).
I suspect the answer would lie in using a specific spectrum other than visible light to match against (i.e. infrared or some other band), but that'd need far more expensive hardware.
Stop playing the Racist card. (Score:2)
Stop using visual spectrum cameras, go to infrared which doesn't need high contrast surroundings (or something something).
Darker objects are harder to detect unless every camera is in a very well lit environment. If I get sunburn and someone else doesn't get sunburn in the same conditions, is the Sun racist?
Re: Stop playing the Racist card. (Score:2)
is the Sun racist
Perhaps. Western conceptions of astronomy are suspect. "Colonial science" they call it. https://m.thewire.in/article/t... [thewire.in]
Arithmetic is probably racist. https://www.theatlantic.com/ed... [theatlantic.com]
Newton's laws are racist. https://www.city-journal.org/t... [city-journal.org]
I suppose it's hard to argue that gravity is racist. I mean, it Literally Pulls The Black Man Down.
Re: (Score:2)
Stop using visual spectrum cameras, go to infrared which doesn't need high contrast surroundings (or something something).
FaceID seems to work quite well regardless of skin tone, so that might work. But there are few home computer environments equipped with the proper hardware for this, so the cost would be pretty high. Perhaps the schools could provide the equipment, but there are a lot of logistics issues to overcome before it is widely available.
Re: (Score:2)
FaceID uses special 3D scanning hardware rather than a simple camera.
Algorithms can't be racist, can they? (Score:5, Insightful)
Racism requires the belief that one race is better than others. Can an algorithm believe something?
Forgetting that, just compare the recognition failure rates:
Black faces: 57%
Middle Eastern: 41%
White: 40%
East Asian (oriental): 37%
Southeast Asian or Indian: 35%
Latinx: 33%
Either the programmer's name was some racist named Juan Rodriguez, or the algorithm just kinda sucks with really dark or really light skin and needs more work.
Re: (Score:2)
Which definition are you using? Please post a link.
Re: (Score:3)
This one. [merriam-webster.com]
The more important point is still how bad the algorithm is with all races with really light or darker faces.
Re: Algorithms can't be racist, can they? (Score:2)
Did you bother to read the whole thing?
Re: (Score:2)
Re: (Score:3)
No, racism simply requires that you treat different races differently.
Definition 2a: https://www.merriam-webster.co... [merriam-webster.com]
the systemic oppression of a racial group to the social, economic, and political advantage of another
Nothing about intent, only effect.
So does this cheating detector systematically oppress some racial groups? Absolutely - it makes it much harder for some non-white races to pass surveillance to take their exam.
Does doing so benefit another racial group? If the class grades on a curve, definitely (and they almost all do) - all the people who had no problems with facial recognition have th
You conveniently skipped Definition 1 (Score:2)
...from that exact same page [merriam-webster.com]:
That omission of #1 wasn't on purpose, was it?
Re: You conveniently skipped Definition 1 (Score:2)
Someone needs a lesson in deductive reasoning.
You stated that racism "requires" belief, then posted a link to a page with a definition that contradicts you.
Quick lesson: just because A implies X, that doesn't mean (A or B) implies X. Those are different things.
Re: (Score:2)
Re: Algorithms can't be racist, can they? (Score:2)
Allowing for the sake of argument the inclusion of the definition of "systemic racism" under the entry for "racism" (the disingenuous sockpuppetry of asking for a change in the dictionary and then using that definition to back your argument as if it were independent confirmation notwithstanding), an inanimate object cannot oppress you. The man wielding it certainly can, but the object itself has no moral worth.
A gun can threaten or defend.
A knife can cut off a foot for trying to escape or cut up a roast ser
Re: Algorithms can't be racist, can they? (Score:2)
Yes yes, we know the woke crazies got the people at MW to change the definition to cater to their personal brand of racism.
Re: (Score:3)
it makes it much harder for some non-white races to pass surveillance to take their exam.
Why are you calling out "non white" ? Based on the results in the article, white faces are middle of the road. It actually makes it harder for whites to pass the surveillance than asian or latin faces.
It actually makes it EASIER for some non-white races to pass surveillance to take their exam.
Why is this being framed as white vs black racism, why isn't it being framed as latin vs black racism?
Black faces: 57%
Middle Eastern: 41%
White: 40%
East Asian (oriental): 37%
Southeast Asian or Indian: 35%
Latinx: 33%
Re: (Score:3)
XKCD had a lot to say about the redefinition of words. https://xkcd.com/1860/ [xkcd.com]
I always think that when someone calls someone else racist because they don't like tapioca or something equally inane (which the majority of uses are these days).
Re: (Score:2)
Racism requires the belief that one race is better than others. Can an algorithm believe something?
Forgetting that, just compare the recognition failure rates:
Black faces: 57% Middle Eastern: 41% White: 40% East Asian (oriental): 37% Southeast Asian or Indian: 35% Latinx: 33%
Either the programmer's name was some racist named Juan Rodriguez, or the algorithm just kinda sucks with really dark or really light skin and needs more work.
Yep. Exactly. It's no more "racist" than the average IQ differences documented in The Bell Curve, which, believe it or not, was not authored by "racist" Mandarin Chinese or Ashkenazi Jews.
Facts are not "racist", because they are not beliefs.
It really doesn't (Score:5, Informative)
This is basically high tech Broken Windows policing. Do some googling and you'll find the history of that is completely tied up with harassing minority neighborhoods to keep them in their place. It's the same deal.
Re: (Score:2)
What does it say about the developers who publish the software if they've created something that is, by many accounts, incapable of serving its most basic function for wide ranges of people?
It probably says that they are fairly superior to the average developer since not only did they produce software that compiles, they managed to deploy it to more than a few systems.
Most developers barely manage to do the first thing.
Only the most elite, absolute top tier developers ever manage to create software that is fit for purpose and meets requirements. That is so rare as to be almost mythical.
Re: (Score:2)
Good (Score:4, Interesting)
Good. Now we can sue schools to stop them from using this shitty software.
Failing isn't racist, failing to catch it is (Score:4, Insightful)
Distracted: Abuse/Commercialisation of open source (Score:3, Interesting)
The original author made a controversial blog post, but their criticism was aimed at the obfuscation, lies, and deceit around the technology being passed off as proprietary, and at its approval for use by those institutions. Their post concludes with a call for audits, audits, audits...
The Motherboard website is spinning the original post, but they did do some useful work:
"On its website, Proctorio claims that it uses “proprietary facial detection” technology. It also says that it uses OpenCV products, although not which products or what for. When Motherboard asked the company whether it uses OpenCV’s models for facial recognition, Meredith Shadle, a spokesperson for Proctorio, did not answer directly. Instead, she sent a link to Proctorio’s licenses page, which includes a license for OpenCV."
"racist"? (Score:3)
Why the unnecessary hyperbole of calling the algorithm racist?
Racism would imply that it actually *does* detect a face of a specific race and then actively chooses to ignore it.
If it simply fails to detect a face, then how is it being racist? In order to be racist you have to actually identify that a subject is present, then identify the race of said subject, then make an active decision to treat that subject differently depending on the identified race.
This algorithm has not even identified the presence of a subject!
It's not racist, it's a flawed algorithm, or supplied with flawed training data. There is a very important distinction to make.
"The models Satheesan tested failed to detect faces in 41 percent of images containing Middle Eastern faces, 40 percent of those containing white faces, 37 percent containing East Asian faces, 35 percent containing Southeast Asian or Indian faces, and 33 percent containing Latinx faces."
Based on this information, if you're accusing the algorithm of racism, then you're accusing it of being a Latinx-supremacist algorithm, since it achieves inferior results for everyone else.
Students of color.. (Score:2)
On August 4th, 1997, Skynet became self-aware (Score:3)
And it really, really, really didn't like black people.
Racist (Score:3, Insightful)
They should try running the same algorithm against a set of Anglo-Saxon people wearing glasses. They'll find out that the algorithm is "racist" toward them too, thereby showing that the people who developed the algorithm, presumably nerds, are a self-hating group.
Re: (Score:3)
Eliminating distinctions is the goal of followers of the Frankfurt School of philosophy, renamed "Critical Theory", which claims that all distinctions are imposed by those in power to denigrate and control the masses and that no amount of evidence can prove a theory, since all evidence is collected based on the oppressive political goals of the researcher.
It makes it difficult when science seeks to measure or predict based on distinctions, including race, age, gender, health, size, body shape, or attractiveness.
Re: (Score:3)
Wonder if the program ran into a "race condition" (Score:4, Funny)
Thank you, thank you, I'll be here all evening!
The light is racist (Score:3)
Basically, if algorithms have some trouble identifying black people, it's because their pictures lack detail...
A picture is formed from the light reflected by the photographed subject. The darker the subject, the less light is reflected.
So basically, we have:
Face recognition is "racist" because it fails to identify black faces.
Algorithms fail to identify black faces because of poor-quality pictures (lack of detail).
Pictures are of poor quality because there is not enough light reflected from the face.
Hence, light is racist...
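The dynamic-range point the parent is making can be sketched in a few lines. The intensity values below are entirely made up; the only real fact used is that an 8-bit sensor quantizes to integers in [0, 255], so scaling all intensities down compresses the range of distinguishable levels:

```python
# Toy illustration: lower reflectance compresses the dynamic range
# a detector has to work with. All pixel values here are hypothetical.
def quantize(pixels):
    # Clamp and round to 8-bit integer intensity levels.
    return [min(255, max(0, round(p))) for p in pixels]

bright_face = [120, 180, 90, 200, 140]       # hypothetical local intensities
dark_face = [p * 0.25 for p in bright_face]  # same face at 25% reflectance

def dynamic_range(pixels):
    q = quantize(pixels)
    return max(q) - min(q)

# The darker rendering spans far fewer intensity levels, so the edges
# and gradients a detector relies on are harder to pick out.
print(dynamic_range(bright_face), dynamic_range(dark_face))
```

Whether that fully explains the measured gap is another question (training data matters too), but the signal-to-noise argument itself is real.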
Not just algorithms (Score:3)
It's not just the algorithms. It's the very hardware. The hardware is the happy enabler of the racist algorithm. If it weren't for the hardware, the algorithms couldn't practice their racism. The hardware is just as culpable, for it is not an anti-racism ally.
Algorithms are racist. (Score:3)
Re: How can an algorithm be racist? (Score:4, Insightful)
We're talking about people who crave racism, thus they see it everywhere. They'll expend thousands of words in some junk journal to argue that their inability to find a particular and obscure foreign food at the shops is systemic racism. Should shops respond to the paper, making an effort to stock the food, the cultists will then accuse them of cultural appropriation.