June 15, 2023

By: Shelley M. Jackson and Elizabeth M. Roberson

Recently, the Equal Employment Opportunity Commission (“EEOC”) issued a new publication on the impact of software, algorithms, and artificial intelligence (“AI”) on employers’ decisions about whom to hire, fire, and promote (collectively, “Selection Decisions”). Employers are increasingly using tools that incorporate software, algorithms, and AI to assist with their Selection Decisions. The EEOC refers to these tools as “algorithmic decision-making tools.” Resume scanners, employee monitoring software, video interviewing software, and job-fit and personality tests all fall into this category. Employers using algorithmic decision-making tools must be aware of any “disparate impact” the tools may have on their Selection Decisions.

A.    Disparate Impact and the Four-Fifths Rule

Title VII of the Civil Rights Act of 1964 (“Title VII”) prohibits employers from using selection procedures that disproportionately exclude persons based on race, color, religion, sex, or national origin. The EEOC has taken the position that an employer’s use of algorithmic decision-making tools is a selection procedure. If a selection procedure disproportionately excludes persons in a protected group, that procedure is said to have a disparate impact. However, even if a selection procedure has a disparate impact, the employer may avoid liability if it can show that the procedure is “necessary to the safe and efficient performance of the job.”

The EEOC highlighted the Four-Fifths Rule as a rule of thumb for assessing whether an employer’s selection procedure causes a disparate impact. The Four-Fifths Rule compares the selection rate of one group of people to that of another group. If the ratio of the two selection rates is less than four-fifths (80%), that is generally an indication that the selection procedure may have a disparate impact.

B.    Example of AI Causing a Disparate Impact

Anywhere USA Hospital (“Hospital”) is coming out of a high-turnover year and is trying to hire dozens of nurses and additional office staff. To help prescreen the candidates, Hospital purchases a subscription to the software service ABC. ABC administers a personality test to candidates and uses a predictive AI model to determine which candidates Hospital is most likely to hire. Hospital receives applications from 60 White applicants and 40 Black applicants. Every applicant submits a resume and takes the personality test. Using its predictive AI model, ABC reviews the applicants’ answers to the personality test and determines that 48 of the White applicants and 16 of the Black applicants are more likely to be hired. Hospital removes the candidates that ABC did not identify as good matches and extends interviews to the 48 White and 16 Black applicants.

    (48 White Applicants Pass Test / 60 White Applicants) = 80% Selection Rate
    (16 Black Applicants Pass Test / 40 Black Applicants) = 40% Selection Rate
    (40% Black Selection Rate / 80% White Selection Rate) = 50% Ratio
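For readers who want to check the arithmetic, below is a minimal Python sketch applying the Four-Fifths Rule to Hospital’s numbers. It is ours, for illustration only; it is not part of the EEOC publication, and the variable names are hypothetical.

    # Illustrative only: applying the Four-Fifths Rule to the Hospital example.
    white_rate = 48 / 60  # White selection rate: 48 of 60 pass the test = 0.80
    black_rate = 16 / 40  # Black selection rate: 16 of 40 pass the test = 0.40

    # Divide the lower selection rate by the higher one.
    ratio = min(black_rate, white_rate) / max(black_rate, white_rate)

    print(f"Selection-rate ratio: {ratio:.0%}")  # prints 50%
    if ratio < 4 / 5:
        print("Ratio is below four-fifths; the tool may have a disparate impact.")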

Since the ratio of the Black selection rate to the White selection rate is less than 80%, the ABC software fails the Four-Fifths Rule. If it is determined that the ABC software caused a disparate impact based on race, and Hospital cannot show that the personality test was “necessary to the safe and efficient performance of the job,” then Hospital could be held liable under Title VII for using the program in a discriminatory manner.

C.    Conclusion

If employers choose to use software, algorithms, or AI in their Selection Decisions, they should ensure that the results generated by such tools are carefully screened, as employers can be held liable for unlawful discrimination arising from those results. An employer remains responsible for the disparate impact of the processes it implements to make Selection Decisions, even when a tool is designed or administered by an outside vendor. Thus, employers should consider auditing their algorithmic decision-making tools and requesting assurances from vendors that the tools have been properly validated to avoid discriminatory results.

Our attorneys will continue to monitor guidance from the EEOC related to the use of software, AI, and algorithms in hiring, promotion, and firing decisions. If you have questions about this publication or how it may impact your organization moving forward, please contact Shelley M. Jackson, Elizabeth M. Roberson, or another member of our Labor and Employment Law Practice.

The authors gratefully acknowledge the contributions of Griffin O’Gara, law student and Krieg DeVault summer associate, in assisting in the preparation of this client alert.


Disclaimer.  The contents of this article should not be construed as legal advice or a legal opinion on any specific facts or circumstances. The contents are intended for general informational purposes only, and you are urged to consult with counsel concerning your situation and specific legal questions you may have.
 
