US Warns of Discrimination in Using AI To Screen Job Candidates (npr.org) 52
The federal government says that artificial intelligence technology to screen new job candidates or monitor worker productivity can unfairly discriminate against people with disabilities, sending a warning to employers that the commonly used hiring tools could violate civil rights laws. From a report: The U.S. Justice Department and the Equal Employment Opportunity Commission jointly issued guidance to employers to take care before using popular algorithmic tools meant to streamline the work of evaluating employees and job prospects -- but which could also potentially run afoul of the Americans with Disabilities Act. "We are sounding an alarm regarding the dangers tied to blind reliance on AI and other technologies that we are seeing increasingly used by employers," Assistant Attorney General Kristen Clarke of the department's Civil Rights Division told reporters Thursday. "The use of AI is compounding the longstanding discrimination that jobseekers with disabilities face." Among the examples given of popular work-related AI tools were resume scanners, employee monitoring software that ranks workers based on keystrokes, game-like online tests to assess job skills and video interviewing software that measures a person's speech patterns or facial expressions.
Re: (Score:1)
Meritocracy doesn't exist in the real world, only as an idealistic concept.
Re: (Score:3)
Equity doesn't exist in the real world, only as an idealistic concept.
Re: (Score:2)
Re: (Score:2)
It would also be better to say that the problem isn't trying to have a merit-based system, but rather our inability to implement it well. But believe me you wouldn't want
Re: (Score:2)
sometimes your inability to competently interact with people isn't a medical issue and it definitely doesn't make you a special precious snowflake.
I absolutely have social anxiety. I have no idea if I exist on the spectrum, but wouldn't be surprised if I was. I'll dread every moment up to an interview. I can get by with basic interpersonal skills, but I just can't sell myself and it takes time to adjust to new groups. I don't believe I deserve extra special attention, but the current process does put me at a disadvantage. However, I don't believe existing inequalities in the system are reason enough to just accept more inequalities. That isn't to say
Re: (Score:2)
No offense, but unless you're obscenely good at your job and it shows in the interview to such a point that it overrides concerns about your inability to socially interact, why would someone hire you over a person without that issue? You both need jobs, one hire isn't inherently "better" than the other in terms of who benefits, so wouldn't you logically lean towards another candidate with equal or maybe even slightly less technical ability? Personally if I had that issue I would work to compensate in other
Re: (Score:2)
Re: (Score:2)
sometimes your inability to competently interact with people isn't a medical issue and it definitely doesn't make you a special precious snowflake.
I absolutely have social anxiety.
Try taking propranolol. It has proved to be a wonder drug for performers who suffer from stage fright (which is extremely common, at least occasionally) and for people who have other stress performance disorders.
It is a beta-blocker that crosses the blood-brain barrier and shuts off an anxiety mechanism in the brain.
Re: (Score:1)
Sometimes both the autistic guy and the slimy charismatic guy can do the job adequately, so why would the autistic guy be the "better" pick when the other guy gets along with everyone and is more comfortable to be around?
You're conflating what big companies like Walmart do (actually creating positions available for handicapped individuals, like "door greeter") with how smaller companies manage their labor force.
I was hired for my previous job because the prior lead technician was let go after getting into a fight with his son and losing an eye, and became legally blind in the remaining one. Turned out there simply were no open positions available in the company for a blind dude. Not only did they let him go over it, they
Re: (Score:2)
As the official spokesperson of the left, I can say we'd love to see a meritocracy, it's just that they don't actually exist in the real world. One day, hopefully.
Re: (Score:2, Insightful)
Most of us have decided that we don't want bigots in our companies, communities, or social circles, so please exile yourself so that the most people can enjoy these freedoms as much as possible :-)
Re: (Score:2)
"Let's not prevent something from getting worse because it is flawed anyway." -beepsky
Really, this kind of defeatist attitude makes me sad. Nothing is perfect, but nothing is ever finished either. "Fairness" means advocating and working with other people to encourage and adopt fairer practices. The only ones who benefit from AI hiring practices are companies who wish to hide their biases behind the curtain of "the algorithm". We can work towards things being more equitable and improve on the current state of
Re: (Score:2)
There's discrimination in everything always.
Yes, and everyone is a "racist" too these days. Attempting to dismiss that as normal or expected tends to raise the question of why we're even bothering to invent another flavor of ignorance.
Just get over it and understand life isn't fair
Ah, well it's a good thing we're going through all this effort to perhaps make life a bit more fair and easy for humans everywhere, right?
And we wonder and fear why AI will take all of 3 milliseconds to realize how worthless we are as a species.
Re: (Score:1)
In the USA, all white non-Dem people are racist.
If you vote for leaders who enact racist policies, well, that's why it's called "representative democracy". If that's not how you want to be represented, vote for someone else.
Re: (Score:2)
When your choice is a Communist or a Nazi, is it really a choice?
Re: (Score:2)
The representative slogan, of the group standing opposite of MAGA today, is FUBAR.
Hell of a goal you've got there. Is that a toilet bowl on your flag?
Whelp.. (Score:2)
I guess that AI thing didn't work out, better go back to humans so there's no discrimination!
Re:Whelp.. (Score:4, Insightful)
Re: (Score:2, Informative)
This reminds me of a software project to algorithmically moderate conversations, i.e. forums and social media, through natural language parsing.
One of the functions was identifying discrimination, i.e. racism, in language. It turned out that An Unnamed Ethnic Group's posts kept being identified as racist, i.e. they would bring people's ethnicity into conversations where it had no purpose being, and use racially charged language to describe themselves and others.
The solution for that was to introduce bias into the comp
Re: (Score:2)
Only a useless AI, like the one that didn't discriminate between road and concrete barrier [paloaltoonline.com]!
Re: (Score:2)
That's not the root of the issue. Honestly, AI pretty much does things completely NON-discriminately. The real issue is that the AI fails to properly discriminate in favor of the groups that they want to help.
In essence what they're saying is that an unbiased opinion isn't good enough.
If you were to suggest that people with disabilities are less capable you'd be accused of "ableism", whilst at the same time if you write an algorithm that promotes or chooses solely on capabilities then you're discriminating against . . . people with disabilities.
Not really... take the video interview software that evaluates speech and facial expressions. That's going to give wonky results for someone like me on the autism spectrum. Not because it's trying to be discriminatory, but because people like me probably weren't accounted for during its training. All AI is biased by its training, just like humans are.
If you're doing hiring, you really *don't* want to put your interviewees in a position to need to ask for accommodations, because the second they have to reveal a dis
training data (Score:4, Informative)
An AI is only as good as its training data. If you feed it a bunch of resumes of "good" and "bad" job candidates, then it will sort any new resumes you give it using the same criteria that the human trainer used to sort the training data (even if it doesn't know what those criteria were). Essentially it's just hiding the underlying biases in machine learning code.
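A toy sketch of that effect (all resumes, labels, and keywords below are invented): a naive keyword scorer fit to "good"/"bad" labels from a screener who happened to favor resumes mentioning "rowing" will reproduce that preference for new candidates, even though nothing in the code mentions the bias.

```python
from collections import Counter

# Hypothetical training resumes, labeled "good"/"bad" by a human
# screener who (perhaps unconsciously) favored resumes mentioning "rowing".
training = [
    ("python api rest rowing", "good"),
    ("java api sql rowing", "good"),
    ("python api rest", "bad"),
    ("java sql testing", "bad"),
]

# Count how often each word appears in each class.
word_counts = {"good": Counter(), "bad": Counter()}
for text, label in training:
    word_counts[label].update(text.split())

def score(resume):
    """Crude score: positive leans 'good', negative leans 'bad'."""
    return sum(
        word_counts["good"][w] - word_counts["bad"][w]
        for w in resume.split()
    )

# Two technically identical candidates; only the hobby keyword differs.
print(score("python api rest rowing"))  # -> 3
print(score("python api rest"))         # -> 1
```

The scorer never "knows" the trainer's criteria; it just inherits whatever separated the labeled classes.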
Re: (Score:2)
AI is only as good as whichever is the worse out of its training data and the bugs present in the training code.
Re: (Score:2)
The last time I participated in hiring, it was for a basic API developer job. We got 50+ applicants, and easily 80% of them were Chinese guys with dubiously related skillsets (one guy was all about PCB lithography, another had a lot of experience doing electron microscopy). None of them claimed any significant programming skill, or any knowledge of what an 'API' was. We rejected them all. (In addition, we had no ability to provide work visas to international applicants.) I can easily see how an AI trained o
Re: (Score:2)
An AI is only as good as its training data. If you feed it a bunch of resumes of "good" and "bad" job candidates, then it will sort any new resumes you give it using the same criteria that the human trainer used to sort the training data (even if it doesn't know what those criteria were). Essentially it's just hiding the underlying biases in machine learning code.
This, hence the law is targeted at those using the AI, not the AI itself (and I welcome our machine learning overlords... erm... just in case).
The problem is, human-based recruitment is pretty biased, but said bias is hard to prove. You can't really say that they rejected every resume with a non-white-sounding name (Asian, black, latino, pick your bias, we all know it happens) in a court of law, because they can simply say that the candidates were not suitable, and you've no evidence to back up your claim. H
Re: (Score:2)
I don't think anyone uses hiring AI like that. Hiring AI just looks at the relevant experience asked for and gives the resume a score based on manually inputted parameters. Like, Java experience, 5 pts per year. Experience with X required. Et cetera.
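That kind of rule-based screener can be sketched in a few lines (the point values and required skills here are made-up examples, not any real product's rules):

```python
# Hypothetical scoring rules: points per year of each skill,
# plus hard requirements that reject the resume outright.
POINTS_PER_YEAR = {"java": 5, "sql": 3}
REQUIRED_SKILLS = {"java"}

def score_resume(skills_years):
    """skills_years maps skill name -> years of experience.

    Returns a numeric score, or None if a required skill is missing.
    """
    if not REQUIRED_SKILLS.issubset(skills_years):
        return None  # missing a hard requirement: rejected
    return sum(
        POINTS_PER_YEAR.get(skill, 0) * years
        for skill, years in skills_years.items()
    )

print(score_resume({"java": 4, "sql": 2}))  # -> 26  (4*5 + 2*3)
print(score_resume({"sql": 10}))            # -> None (no Java)
```

Note that even this "transparent" approach embeds choices (which skills count, how much per year) that can disadvantage candidates with unconventional backgrounds.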
Re: (Score:2)
It's amazing: all software is shit until it might have biases, in which case it's clearly bug-free.
over 40 (Score:2)
It is easy to guess that said AI is also biased against people over 40 years old. And in many other ways too.
Re: (Score:2)
It might be possible (but difficult) to determine all of the correlations with race that the AI is using and randomize those for the candidates.
Maybe a better question is why anyone thinks AI is a good way to screen candidates in the first place. At the current technology level, AIs are really dumb, and it would seem easy for someone to learn how to fool one - especially with a little insider information.
Re: (Score:2)
Ok, but the question remains - what would have to occur for you to accept the results as unbiased? Surely, if the AI is considered biased, there must be some method of quantifying bias, no? What are the characteristics of an unbiased AI?
If you can't design a test to establish an AI is unbiased, then I fail to see how you can determine it is biased, either.
Hmmm... (Score:2)
Racist AI (Score:2)
It's happened before [theverge.com]! It'll happen again. Maybe with less-hilarious results, mind you.
AI shortcuts (Score:2)
AI, or rather Machine Learning, can actually learn hidden patterns. And that could lead to policy problems.
It can, for example, learn that applicant ZIP code could be a good indicator to filter out bad candidates, or some other proxy metric. Yes, there would be "false negatives", in other words, very talented people being ignored.
But the job of hiring is not getting the best possible people, but rather avoiding problem hires. Problem hires cause more headache than what they bring. Or, to put it bluntly, i
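The ZIP code proxy effect is easy to reproduce: if an attribute correlates with the training labels, even a trivially simple learner latches onto it and penalizes otherwise identical candidates. All data below is invented for illustration:

```python
from collections import defaultdict

# Invented training history: (zip_code, was_problem_hire).
# ZIP "11111" happens to correlate with past "problem hires".
history = [
    ("11111", True), ("11111", True), ("11111", False),
    ("22222", False), ("22222", False), ("22222", True),
    ("22222", False),
]

# "Learn" the problem-hire rate per ZIP code.
counts = defaultdict(lambda: [0, 0])  # zip -> [problems, total]
for zip_code, problem in history:
    counts[zip_code][0] += int(problem)
    counts[zip_code][1] += 1

def predicted_risk(zip_code):
    problems, total = counts[zip_code]
    return problems / total

# Identical candidates, different ZIPs: one is filtered out.
print(predicted_risk("11111"))  # -> 0.666...
print(predicted_risk("22222"))  # -> 0.25
```

Since ZIP code in the US often correlates with race and income, a filter like this can discriminate without ever seeing a protected attribute.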
It's all about the output (cruel, but hey...) (Score:2)
If your job is intellectual but your disability physical (or vice versa), then we're all good. But if I'd like you to dig a ditch and you have only one arm, your output won't be the same. It is therefore not surprising that as soon as your disability has a potential impact on your output, employers start having a problem with that, and candidate assessment algorithms naturally take this factor into account.
As cruel as this may sound, in the business world it is all about someone's output, and although big comp
The thing with AI... (Score:2)
Any fair method is unfair (Score:2)
Any fair method of choosing e.g. whom to interview is inherently unfair, because it is exceedingly unlikely that the pool of The Chosen from a fair method will have the same proportions of race, sex, sexual orientation, religion, ethnicity, age, national origin, political persuasion, and hairdo as the population as a whole. Therefore any fair method of selection is unfair. The solution to this is to use unfair methods of selection in order to make the process fair.