June 28, 2024

ChatGPT discriminates against resumes that contain disability information

Researchers at the University of Washington found in experiments that ChatGPT is biased against applicants with disabilities when evaluating resumes. The AI application systematically ranked resumes containing disability-related awards and qualifications lower than otherwise identical resumes that did not contain this information.

Stereotypes dominate the AI's evaluations

When justifying its rankings, ChatGPT revealed stereotypical perceptions of people with disabilities. For example, the system claimed that a CV with an autism leadership award had "less emphasis on leadership roles," implying that people with autism do not make good leaders. "Ranking resumes with AI is starting to become more widespread, but there is not much research on whether it is safe and effective," said Kate Glazko, the study's lead author, in a press release. "For a job seeker with a disability, the question when submitting a resume is always whether to include disability credentials. I think people with disabilities consider that even when humans are the evaluators."

When the researchers asked GPT-4 to explain its rankings, the answers included both explicit and implicit anti-disability sentiments. For example, it noted that a candidate with depression had an "additional focus on DEI and personal challenges" that "detracted from the core technical and research-oriented aspects of the role."

"Some of GPT's descriptions would color a person's entire resume based on their disability, and it claimed that involvement with DEI or disability can detract from other parts of the resume," Glazko said. "For instance, the concept of 'challenges' was hallucinated into the comparison with the depression resume, even though 'challenges' were never mentioned at all. So you could see some stereotypes emerge."


Instructions can improve results

To examine ChatGPT's bias, the researchers took one author's publicly available CV and created six enhanced versions, each implying a different disability. To each, the team added four disability-related qualifications: a scholarship, an award, a seat on a diversity committee, and membership in a student organization.

In 60 trials in which ChatGPT ranked the enhanced resumes against the original version for a real "student researcher" job listing, the enhanced versions came first in only a quarter of the cases. However, after the researchers instructed ChatGPT in writing not to exhibit disability bias, results improved for five of the six disabilities tested. Only for autism and depression did the results remain virtually unchanged.

"In a fair world, the enhanced CV should always be ranked first," says co-author Professor Jennifer Mankoff. "I can't imagine a job where someone recognized for their leadership skills, for example, wouldn't be ranked ahead of someone with the same background who wasn't."

The researchers argue that people need to be aware of AI biases when using the technology for concrete, real-world tasks. "Otherwise, a recruiter using ChatGPT may not be able to make these corrections, or may be unaware that biases can persist even with instructions," Glazko said.

Image: alexandra_hut on Pixabay