Why Do Automated Hiring Systems Use an Oil Pressure Gauge to Measure Speed?

Roland Behm
5 min read · Jun 7, 2022


Third-party providers of automated hiring systems and their employer customers have recklessly ignored the plain language of the ADA to illegally discriminate against persons with disabilities for more than 30 years.

In February 2019, HireVue, Inc. (HireVue) filed a patent application entitled “Detecting Disability and Ensuring Fairness in Automated Scoring of Video Interviews.” The patent application describes a process for detecting and addressing any “adverse impact” on persons with disabilities. Adverse impact measurement is also referred to as the 4/5ths or 80% rule.
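In numeric terms, the 4/5ths rule compares the selection rate of a protected group to that of the most-favored comparison group. A minimal sketch, using invented selection counts purely for illustration:

```python
# Hypothetical numbers; the 4/5ths (80%) rule compares the selection
# rate of one group to the selection rate of the comparison group.

def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's selection rate."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Example: 30 of 100 applicants with a given disability advance,
# versus 50 of 100 applicants without it.
ratio = adverse_impact_ratio(30, 100, 50, 100)
print(f"ratio = {ratio:.2f}")          # 0.60
print("adverse impact?", ratio < 0.8)  # True -- below the 80% threshold
```

A ratio below 0.80 is the conventional trigger for an adverse-impact finding; mitigation, as the patent describes it, means adjusting the model until the ratio clears that mark.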

The patent application reads:

With this knowledge, the digital interviewing platform may then perform adverse impact mitigation with reference to the disability by iteratively removing problematic features from predictive interview models until the adverse impact is sufficiently mitigated, e.g., allows disabled persons to score above the 80% (or 4/5ths) mark when compared with those that are not disabled in the same way.

While the patent application demonstrates HireVue’s awareness of the risks of disability-based employment discrimination arising from the use of automated hiring systems, it also demonstrates a failure to understand that “adverse impact” is not the relevant standard under the ADA. The relevant standard is, and has been for more than 30 years, whether the system screens out or tends to screen out an individual with a disability or a class of individuals with disabilities.

Adverse impact is the standard for measuring discrimination under Title VII of the Civil Rights Act based on age, gender, nationality, race, and other protected classes. The steps taken to avoid Title VII discrimination are distinct from the steps needed to address the problem of disability bias. That an automated hiring system purports to have no adverse impact does not mean the system could never screen out an individual with a disability. Each disability is unique. An individual may fare poorly on an assessment because of a disability, and be screened out as a result, regardless of how well other individuals with disabilities fare on the assessment.
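The gap between the two standards can be shown with a toy example. In the sketch below (all applicant IDs and scores are invented), the group-level selection rates satisfy the 4/5ths rule perfectly, yet one individual is still screened out because a disability depresses a single feature of their score:

```python
# Toy illustration: an aggregate 4/5ths ratio can look clean while one
# individual is still screened out -- the ADA's "screen out" concern.

PASS_MARK = 50

# Hypothetical composite scores (0-100); IDs and numbers are illustrative only.
applicants = [
    {"id": "A1", "disability": False, "score": 72},
    {"id": "A2", "disability": False, "score": 64},
    {"id": "A3", "disability": False, "score": 41},
    {"id": "B1", "disability": True,  "score": 68},
    {"id": "B2", "disability": True,  "score": 55},
    # B3's disability lowers one scored feature, dragging the composite down.
    {"id": "B3", "disability": True,  "score": 12},
]

def selection_rate(group):
    passed = [a for a in group if a["score"] >= PASS_MARK]
    return len(passed) / len(group)

with_d    = [a for a in applicants if a["disability"]]
without_d = [a for a in applicants if not a["disability"]]

ratio = selection_rate(with_d) / selection_rate(without_d)
print(f"4/5ths ratio: {ratio:.2f}")   # 1.00 -- no adverse impact in aggregate
screened_out = [a["id"] for a in applicants if a["score"] < PASS_MARK]
print("screened out:", screened_out)  # B3 is rejected despite the clean ratio
```

Under Title VII's group-rate lens the tool looks unobjectionable; under the ADA's individual "screen out" lens, B3's rejection is exactly the problem.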

As the Equal Employment Opportunity Commission (EEOC) states in its technical assistance document entitled, “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees:”

One of the “most common ways that an employer’s use of algorithmic decision-making tools could violate the ADA” is that “[t]he employer relies on an algorithmic decision-making tool that intentionally or unintentionally ‘screens out’ an individual with a disability, even though that individual is able to do the job with a reasonable accommodation. ‘Screen out’ occurs when a disability prevents a job applicant or employee from meeting — or lowers their performance on — a selection criterion, and the applicant or employee loses a job opportunity as a result.”

The plain language of the ADA does not require a statistical showing of disparate impact upon a group of individuals with disabilities. Rather, the phrases “screen out or tend to screen out” and “an individual with a disability or a class of individuals with disabilities” confirm that ADA discrimination claims may be supported by a broad range of evidence that the challenged practice in fact functions to screen out an individual or a class of individuals on the basis of their disability.

While evidence may include statistics, it may also include the experience of the plaintiff, expert and non-expert testimony about disabilities and their effects, and even a common sense causation analysis regarding the inevitable impact of particular policies upon persons with certain types of disabilities.

For example, persons with autism spectrum disorder (ASD), persons diagnosed with depression and schizophrenia, and persons with traumatic brain injury may have a flat affect. Persons who have a flat affect do not show the usual signs of emotion like smiling, frowning, or raising their voice. It does not take any great expertise to determine that a person with a flat affect will be screened out by automated hiring systems that use facial and spoken word analyses.

In layperson’s terms, HireVue’s patent application attempts to use an oil pressure gauge to measure speed. Why? HireVue has no “speedometer”; it has no tool to measure whether its systems screen out or tend to screen out individuals with disabilities.

HireVue has no way of demonstrating whether its systems are fit for purpose: making hiring decisions in compliance with the ADA. Employers who use HireVue systems know, or should know, of this failure, given their legal obligation to select systems that accurately reflect the skills, aptitude, or other factors the systems purport to measure, rather than reflecting an applicant’s impairment, as HireVue’s systems do.

An employer who chooses to use a hiring technology like HireVue’s must ensure that its use does not cause unlawful discrimination on the basis of disability. The ADA bars discrimination against people with many different types of disabilities, including diabetes, cerebral palsy, deafness, blindness, epilepsy, mobility disabilities, intellectual disabilities, autism, and mental health disabilities. A disability will affect each person differently.

As stated by the Department of Justice in Algorithms, Artificial Intelligence, and Disability Discrimination in Hiring:

Employers should examine hiring technologies before use, and regularly when in use, to assess whether they screen out individuals with disabilities who can perform the essential functions of the job with or without required reasonable accommodations.

For example, if a county government uses facial and voice analysis technologies to evaluate applicants’ skills and abilities, people with disabilities like autism or speech impairments may be screened out, even if they are qualified for the job.

If a test or technology eliminates someone because of disability when that person can actually do the job, an employer must instead use an accessible test that measures the applicant’s job skills, not their disability, or make other adjustments to the hiring process so that a qualified person is not eliminated because of a disability.
