
The Long History of Discrimination in Job Hiring Assessments


Applying for jobs can be a difficult and frustrating experience: You're putting forward your qualifications to be judged by a prospective employer. We all want to be treated fairly. We want our qualifications to speak for themselves. But for job seekers who have been historically excluded or discriminated against because of their race, gender identity or disability, there can be another question lurking in the background: Am I being judged, not for my ability to do the job, but for my identity?

Automated decision-making tools, including those using artificial intelligence and algorithms, have been widely adopted in hiring. Today, 7 out of 10 employers use them. We at the ACLU have previously written about AI and some of the newer ways that it's impacting hiring, including how it lacks transparency and can harbor serious flaws that lead to bias and discrimination. But these tools are just the latest frontier in a long history of employment tests that can discriminate against and harm job seekers. For example, one of the landmark civil rights cases, Griggs v. Duke Power Co. (1971), was about a company's use of bogus tests to block the promotion of Black workers.

When tests and tools that have a long history of problems are combined with new technologies such as AI, risks of harm only increase, exacerbating harmful barriers to employment based on race, gender, disability and other protected characteristics. While the harm of racial discrimination in employment tests has long been recognized and challenged, there has been less awareness about how these tests impact applicants who, in addition to facing racial discrimination, face discrimination based on their disabilities.

The use of personality assessments in hiring processes has become increasingly common. Yet these tests often ask general questions that may have little to do with the ability to do the job, and they capture traits closely associated with autism and with mental health conditions such as depression and anxiety. This creates a high risk that qualified workers with these disabilities will be disadvantaged compared to other workers and may be unfairly and illegally screened out.

To push back, we filed a complaint with the Federal Trade Commission against Aon, a major hiring technology vendor, alleging that Aon is deceptively marketing widely used online hiring tests as "bias-free" even though the tests discriminate against job seekers based on traits such as their race or disability. The ACLU and co-counsel have also filed charges with the Equal Employment Opportunity Commission, on behalf of a biracial (Black/white) autistic job applicant, against both Aon and an employer that required the applicant to take Aon assessments as part of its hiring process.

Two Aon products, a "personality" assessment test and its automated video interviewing tool, which integrate algorithmic or AI-related features, are marketed to employers across industries as cost-effective, efficient and less discriminatory than traditional methods of assessing workers and applicants. However, these products assess very general personality traits such as positivity, emotional awareness, liveliness, ambition and drive that are not clearly job related or necessary for a specific job and can unfairly screen out people based on disabilities. The automated features of these tools exacerbate these fundamental problems, particularly as Aon incorporated artificial intelligence elements in its video interviewing tool that are also likely to discriminate based on disability, race and other protected characteristics.

Cognitive ability assessments, another staple in hiring, must also be subject to scrutiny, as they have long been shown to disadvantage Black job candidates and other candidates of color and may also unfairly exclude individuals based on disability. These tests, touted as measuring aspects of memory, along with several other assessments Aon markets, show racial disparities in performance.

For autistic and other neurodivergent job applicants and applicants of color, cognitive ability assessments pose a significant barrier to employment. Not only do they fail to accommodate diverse needs, but they also perpetuate discrimination based on race, disability and other traits. Employers should not use assessments that carry a high risk of discrimination. By doing so, employers risk screening out people who could be successful employees, undermining diversity in the workplace, and they could face legal liability even where the assessments are designed and administered by third-party vendors.

Employers have a legal obligation to thoroughly vet any assessments they use for compliance with anti-discrimination laws, and if they decide to use an assessment, they must provide meaningful notice so that disabled workers can make an informed choice about whether to seek accommodations or alternative processes.

But vendors must also be accountable for the tools they market. Employers can hold vendors accountable by demanding that vendors truly design their products to be inclusive -- including by incorporating the perspectives and experiences of people with disabilities and other protected groups into their design process -- and conduct thorough auditing for discrimination based on race, disability and other protected characteristics. Employers can also demand transparency and decline to purchase products from vendors that fail to meet these standards. And vendors can and should also be held legally accountable for their discriminatory products and for deceptively marketing them. As the EEOC recently argued in a federal case about discrimination in an online hiring product, vendors can be held accountable under employment discrimination laws, and our FTC complaint should serve as notice to vendors that we will seek to hold them accountable under consumer protection laws as well.

As the hiring landscape continues to change and job applicants face new hiring tools, we must strive for a future where skills and potential, not bias, determine our opportunities. The ACLU stands ready to defend the rights of individuals wronged by discriminatory practices. Together, we can dismantle discriminatory barriers and build a more inclusive workforce for all.

Ricardo Mimbela is a Communications Strategist at the ACLU. Olga Akselrod is a Senior Staff Attorney in the Racial Justice Program at the American Civil Liberties Union, where she leads its work on algorithmic discrimination in employment and other economic opportunities and engages in advocacy for government actors to center civil rights in policies concerning artificial intelligence and other automated decision-making technologies. For more than 100 years, the ACLU has worked in courts, legislatures and communities to protect the constitutional rights of all people. With a nationwide network of offices and millions of members and supporters, the ACLU takes on the toughest civil liberties fights in pursuit of liberty and justice for all. To find out more about the ACLU and read features by other Creators Syndicate writers and cartoonists, visit the Creators website at www.creators.com.


Copyright 2024 Creators Syndicate Inc.