US Warns of Discrimination in Using AI To Screen Job Candidates (npr.org) 52

The federal government says that artificial intelligence technology used to screen new job candidates or monitor worker productivity can unfairly discriminate against people with disabilities, sending a warning to employers that the commonly used hiring tools could violate civil rights laws. From a report: The U.S. Justice Department and the Equal Employment Opportunity Commission jointly issued guidance to employers to take care before using popular algorithmic tools meant to streamline the work of evaluating employees and job prospects -- but which could also potentially run afoul of the Americans with Disabilities Act. "We are sounding an alarm regarding the dangers tied to blind reliance on AI and other technologies that we are seeing increasingly used by employers," Assistant Attorney General Kristen Clarke of the department's Civil Rights Division told reporters Thursday. "The use of AI is compounding the longstanding discrimination that jobseekers with disabilities face." Among the examples given of popular work-related AI tools were resume scanners, employee monitoring software that ranks workers based on keystrokes, game-like online tests to assess job skills and video interviewing software that measures a person's speech patterns or facial expressions.
  • I guess that AI thing didn't work out, better go back to humans so there's no discrimination!

    • training data (Score:4, Informative)

      by Comboman ( 895500 ) on Friday May 13, 2022 @02:17PM (#62530506)

      An AI is only as good as its training data. If you feed it a bunch of resumes of "good" and "bad" job candidates, then it will sort any new resumes you give it using the same criteria that the human trainer used to sort the training data (even if it doesn't know what those criteria were). Essentially it's just hiding the underlying biases in machine learning code.
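
To illustrate that point, here is a minimal sketch using synthetic data and made-up resume features: a classifier trained on a human screener's accept/reject labels absorbs the screener's criteria without ever being told what they were. All feature names, weights, and numbers are invented for the example.

```python
# Minimal sketch: a model trained on human screening decisions reproduces
# the screener's (unstated) criteria. Data and feature names are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical resume features.
years_experience = rng.integers(0, 20, n)
skill_match = rng.random(n)                  # 0..1 fit to the job posting
employment_gap = rng.integers(0, 2, n)       # 1 = gap on the resume (e.g. medical)

X = np.column_stack([years_experience, skill_match, employment_gap])

# The (biased) human screener: mostly rewards skills, quietly penalizes gaps.
labels = ((0.2 * years_experience + 3.0 * skill_match
           - 2.5 * employment_gap) > 3.0).astype(int)

model = LogisticRegression().fit(X, labels)

# The learned weight on employment_gap comes out strongly negative: the model
# has picked up the screener's rule even though it was never written down.
print(dict(zip(["experience", "skill_match", "employment_gap"],
               model.coef_[0].round(2))))
```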

      • AI is only as good as the worse of its training data and the bugs present in the training code.

      • by imidan ( 559239 )

        The last time I participated in hiring, it was for a basic API developer job. We got 50+ applicants, and easily 80% of them were Chinese guys with dubiously related skillsets (one guy was all about PCB lithography, another had a lot of experience doing electron microscopy). None of them claimed any significant programming skill, or any knowledge of what an 'API' was. We rejected them all. (In addition, we had no ability to provide work visas to international applicants.) I can easily see how an AI trained o

      • by mjwx ( 966435 )

        An AI is only as good as its training data. If you feed it a bunch of resumes of "good" and "bad" job candidates, then it will sort any new resumes you give it using the same criteria that the human trainer used to sort the training data (even if it doesn't know what those criteria were). Essentially it's just hiding the underlying biases in machine learning code.

        This, hence the law is targeted at those using the AI, not the AI itself (and I welcome our machine learning overlords... erm... just in case).

        The problem is, human-based recruitment is pretty biased, but said bias is hard to prove. You can't really say that they rejected every resume with a non-white-sounding name (Asian, black, latino, pick your bias, we all know it happens) in a court of law, because they can simply say that the candidates were not suitable and you've no evidence to back up your claim. H

      • I don't think anyone uses hiring AI like that. Hiring AI just looks at the relevant experience asked for and gives the resume a score based on manually entered parameters. Like, Java experience, 5 pts per year. Experience with X required. Et cetera.
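
For what it's worth, here is a minimal sketch of the kind of manually parameterized scoring that comment describes: points per year of a listed skill, plus hard keyword requirements. The rules, point values, and field names are made up purely for illustration.

```python
# Minimal sketch of rule-based resume scoring with hand-entered parameters.
# All rules, weights, and field names here are illustrative, not a real tool.
from typing import Optional

RULES = {
    "points_per_year": {"java": 5, "python": 3},  # e.g. 5 pts per year of Java
    "required_keywords": ["api"],                 # hard requirement
}

def score_resume(resume: dict) -> Optional[float]:
    """Return a score, or None if a hard requirement is missing."""
    text = resume["text"].lower()
    if any(kw not in text for kw in RULES["required_keywords"]):
        return None  # auto-reject: required keyword absent
    score = 0.0
    for skill, pts in RULES["points_per_year"].items():
        score += pts * resume.get("years", {}).get(skill, 0)
    return score

print(score_resume({"text": "Built REST APIs in Java", "years": {"java": 4}}))  # 20.0
print(score_resume({"text": "PCB lithography specialist", "years": {}}))        # None
```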

    • It's amazing: all software is shit until it might have biases, in which case it's clearly bug free.

  • It is easy to guess that said AI is also biased against people over 40 years old. And in many other ways too.

  • It seems to me artificial intelligence is going to favor artificially intelligent people. You know the type, those that get hired but have no idea what they're doing.
  • It's happened before [theverge.com]! It'll happen again. Maybe with less-hilarious results, mind you.

  • AI, or rather Machine Learning, can actually learn hidden patterns. And that could lead to policy problems.

    It can, for example, learn that applicant ZIP code, or some other proxy metric, is a good indicator for filtering out bad candidates (see the sketch after this comment). Yes, there would be "false negatives", in other words, very talented people being ignored.

    But the job of hiring is not getting the best possible people, but rather avoiding problem hires. Problem hires cause more headaches than the value they bring. Or, to say bluntly, i
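
The sketch referenced above: a minimal example on synthetic data showing how a model can rediscover a ZIP-code-like proxy. The model is given only a ZIP feature and a test score, yet the ZIP feature ends up carrying a nonzero share of the predictive work because the historical labels were partly driven by it. Feature names and numbers are invented.

```python
# Minimal sketch (synthetic data): a model learns to lean on a ZIP-code-like
# proxy feature because the historical labels were partly driven by it.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 4000

zip_group = rng.integers(0, 10, n)       # stand-in for applicant ZIP code
test_score = rng.normal(50.0, 10.0, n)   # job-relevant skill signal

# Historical keep/reject labels that were partly driven by neighborhood,
# not just by skill -- the hidden pattern the model will rediscover.
labels = ((test_score + 8.0 * (zip_group < 3)
           + rng.normal(0.0, 5.0, n)) > 55.0).astype(int)

X = np.column_stack([zip_group, test_score])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)

# The ZIP proxy shows up with a nonzero feature importance: it is doing
# some of the predictive work, even though it was never labeled as sensitive.
print(dict(zip(["zip_group", "test_score"], tree.feature_importances_.round(2))))
```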

  • If your job is intellectual but your disability is physical (or vice versa), then we're all good. But if I'd like you to dig a ditch and you have only one arm, your output won't be the same. It is therefore not surprising that, as soon as your disability has a potential impact on your output, employers start having a problem with that, and candidate assessment algorithms naturally take this factor into account.

    As cruel as this may sound, in the business world it is all about someone's output, and although big comp

  • ..is that you get out what you put in. All of it, warts 'n' all. If you put a racist, misogynistic dataset in, whether you're aware of it or not, you get racist, misogynistic behaviour out of the AI. I think AI's a wonderful mirror to hold up to ourselves & see what we're really like.
  • Any fair method of choosing e.g. whom to interview is inherently unfair, because it is exceedingly unlikely that the pool of The Chosen from a fair method will have the same proportions of race, sex, sexual orientation, religion, ethnicity, age, national origin, political persuasion, and hairdo as the population as a whole. Therefore any fair method of selection is unfair. The solution to this is to use unfair methods of selection in order to make the process fair.
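
On the arithmetic behind that claim, a quick sketch: even a perfectly fair, uniformly random draw of a small interview pool only rarely reproduces the population's proportions exactly, and the odds drop further if several attributes must match at once. The numbers below are purely illustrative.

```python
# Minimal sketch: how often does a fair random draw exactly mirror the
# population proportion? Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
population_share = 0.40   # share of some group among applicants
pool_size = 20            # interview slots
trials = 100_000

draws = rng.binomial(pool_size, population_share, size=trials)
exact = np.mean(draws == round(population_share * pool_size))
print(f"P(pool of {pool_size} exactly mirrors a {population_share:.0%} share): {exact:.1%}")
# Comes out around 18%; requiring several attributes to match at once
# shrinks this rapidly.
```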
