INSIGHT: Hiring Tests Need Revamp to End Legal Bias

By Ron Edwards

Technology has radically changed the hiring process. Although new artificial intelligence systems appear more sophisticated, they may be pushing hiring toward a less fair and less effective process. We should all be troubled by this.

As someone who spent nearly 40 years at the Equal Employment Opportunity Commission, I’ve seen how the use of hiring tools can negatively impact women, people of color, and those with disabilities.

In some scenarios, candidates are asked to record answers to questions and have their facial expressions analyzed using AI software. In others, “digital exhaust” information is mined by employers, allowing them to collect and consider information unrelated to the job in question, such as proxies for credit history or arrest and conviction information.

To end legal bias in hiring, our 20th-century laws need to be updated to address technological advancements.

1978 Guidelines Allowed for Unfair Testing Outcomes

In 1978, the EEOC and other federal agencies came together to publish the Uniform Guidelines on Employee Selection Procedures (UGESP), which was designed to help employers understand how to comply with Title VII of the Civil Rights Act.

Title VII and Executive Order 11246 outlaw employment discrimination based on race, ethnicity, gender, religion, and national origin. UGESP allowed that some hiring tests could be acceptable, despite a proven adverse impact, if the test is also shown to identify qualifications necessary for the job.

That provision, a good-faith effort by the government to avoid overburdening employers, has since been exploited. In many instances, I’ve seen evidence of employers using assessments that are unfair and cause adverse impact. Often, test developers have not validated their assessments in accordance with UGESP, instead relying on different validation standards developed by the assessment industry.

Today, many large employers use cognitive ability assessments to evaluate candidates. Research has demonstrated that for every 100 white candidates “passing” a cognitive ability test, only 32 African American candidates also pass, a ratio far below the four-fifths threshold that UGESP itself treats as evidence of adverse impact. Despite that impact, employers continue to use these assessments based on irrelevant and outdated validation evidence.

With the development of AI and algorithm-based tools, the disparities in our current hiring system could get worse. In one high-profile failure, Amazon built an AI hiring tool that filtered out women’s resumes for engineering positions. Similar situations at lesser-known companies may well be playing out right now across the country.

New York City, California Bills Could Be Model

If we want workforce diversity to improve, we have to update our 20th-century laws to account for 21st-century technologies. California and New York City are considering legislation that would set standards for the use of selection methods including AI and algorithmic assessments in hiring.

Whether or not the state and city measures pass, Congress should consider a comparable law to protect job applicants across the country, and the president should immediately sign an executive order making federal contractors subject to these requirements.

Both pieces of legislation would require that tools be pre-tested for bias and audited annually to ensure they have no adverse impact on demographic groups. They would also require that candidates be notified that they are being screened and made aware of the characteristics the tool assesses.

This represents a critical step forward for three reasons. First, it provides an opportunity to encode the principles of UGESP into law so that traditional and algorithmic employment selection devices are held to the same standard of “fairness first.”

Second, it offers a pathway for employers to move on from clearly biased assessments, such as cognitive ability tests, and adopt fairer technologies.

Finally, the passage of these bills would provide much needed transparency to job candidates, who may not otherwise be aware that they have been subjected to a hiring screen.

Candidates deserve consideration in a manner that gives everyone an equal chance to obtain a job, get a promotion, and be rewarded consistent with their talents. Employers should have confidence that their practices are fair, comply with federal requirements, and generate labor pools that will increase productivity.

This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.

Author Information

Ron Edwards is the former director of the Program Research and Surveys Division with the Office of Research, Information and Planning at the Equal Employment Opportunity Commission (EEOC), where he oversaw the Employer Survey Collection system. He retired in 2017 and is currently a commissioner on the Alexandria City Commission on Employment.