Keeping Up with the EEOC: Artificial Intelligence Guidance and Enforcement Action – Gibson Dunn
May 23, 2022
On May 12, 2022, more than six months after the Equal Employment Opportunity Commission (EEOC) announced its Initiative on Artificial Intelligence and Algorithmic Fairness,[1] the agency issued its first guidance regarding employers' use of Artificial Intelligence (AI).[2]
The EEOC's guidance outlines best practices and key considerations that, in the EEOC's view, help ensure that employment tools do not disadvantage applicants or employees with disabilities in violation of the Americans with Disabilities Act (ADA). Notably, the guidance came just one week after the EEOC filed a complaint against a software company alleging intentional discrimination through applicant software under the Age Discrimination in Employment Act (ADEA), potentially signaling more AI and algorithmic-based enforcement actions to come.
The EEOC's AI Guidance
The EEOC's non-binding, technical guidance provides suggested guardrails for employers on the use of AI technologies in their hiring and workforce management systems.
Broad Scope. The EEOC's guidance encompasses a broad range of technology that incorporates algorithmic decision-making, including automatic resume-screening software, hiring software, chatbot software for hiring and workflow, video interviewing software, analytics software, employee monitoring software, and worker management software.[3] As an example of such software that has been frequently used by employers, the EEOC identifies testing software that provides algorithmically generated personality-based "job fit" or "cultural fit" scores for applicants or employees.
Responsibility for Vendor Technology. Even if an outside vendor designs or administers the AI technology, the EEOC's guidance suggests that employers will be held responsible under the ADA if the use of the tool results in discrimination against individuals with disabilities. Specifically, the guidance states that employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer's behalf.[4] The guidance further states that an employer may also be liable if a vendor administering the tool on the employer's behalf fails to provide a required accommodation.
Common Ways AI Might Violate the ADA. The EEOC's guidance outlines three ways in which an employer's tools may, in the EEOC's view, be found to violate the ADA, although the list is non-exhaustive and intended to be illustrative:
Tips for Avoiding Pitfalls. In addition to illustrating the agency's view of how employers may run afoul of the ADA through their use of AI and algorithmic decision-making technology, the EEOC's guidance provides several practical tips for how employers may reduce the risk of liability. For example:
Enforcement Action
As previewed above, on May 5, 2022, just one week before releasing its guidance, the EEOC filed a complaint in the Eastern District of New York alleging that iTutorGroup, Inc., a software company providing online English-language tutoring to adults and children in China, violated the ADEA.[11]
The complaint alleges that a class of plaintiffs was denied employment as tutors because of their age. Specifically, the EEOC asserts that the company's application software automatically denied hundreds of older, qualified applicants by soliciting applicant birthdates and automatically rejecting female applicants age 55 or older and male applicants age 60 or older. The complaint alleges that the charging party was rejected when she used her real birthdate because she was over the age of 55, but was offered an interview when she used a more recent date of birth with an otherwise identical application. The EEOC seeks a range of damages, including back wages, liquidated damages, a permanent injunction enjoining the challenged hiring practice, and the implementation of policies, practices, and programs providing equal employment opportunities for individuals 40 years of age and older. iTutorGroup has not yet filed a response to the complaint.
Takeaways
Given the EEOC's enforcement action and recent guidance, employers should evaluate their current and contemplated AI tools for potential risk. In addition to consulting with vendors who design or administer these tools to understand the traits being measured and the types of information gathered, employers might also consider reviewing their accommodations processes for both applicants and employees.
___________________________
[1] EEOC, EEOC Launches Initiative on Artificial Intelligence and Algorithmic Fairness (Oct. 28, 2021), available at https://www.eeoc.gov/newsroom/eeoc-launches-initiative-artificial-intelligence-and-algorithmic-fairness.
[2] EEOC, The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees (May 12, 2022), available at https://www.eeoc.gov/laws/guidance/americans-disabilities-act-and-use-software-algorithms-and-artificial-intelligence [hereinafter EEOC AI Guidance].
[3] Id.
[4] Id. at 3, 7.
[5] Id. at 11.
[6] Id. at 13.
[7] Id. at 14.
[8] For more information, please see Gibson Dunn's Client Alert, New York City Enacts Law Restricting Use of Artificial Intelligence in Employment Decisions.
[9] EEOC AI Guidance at 14.
[10] Id.
[11] EEOC v. iTutorGroup, Inc., No. 1:22-cv-02565 (E.D.N.Y. May 5, 2022).
The following Gibson Dunn attorneys assisted in preparing this client update: Harris Mufson, Danielle Moss, Megan Cooney, and Emily Maxim Lamm.
Gibson Dunn's lawyers are available to assist in addressing any questions you may have regarding these developments. To learn more about these issues, please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm's Labor and Employment practice group, or the following:
Harris M. Mufson New York (+1 212-351-3805, hmufson@gibsondunn.com)
Danielle J. Moss New York (+1 212-351-6338, dmoss@gibsondunn.com)
Megan Cooney Orange County (+1 949-451-4087, mcooney@gibsondunn.com)
Jason C. Schwartz Co-Chair, Labor & Employment Group, Washington, D.C. (+1 202-955-8242, jschwartz@gibsondunn.com)
Katherine V.A. Smith Co-Chair, Labor & Employment Group, Los Angeles (+1 213-229-7107, ksmith@gibsondunn.com)
© 2022 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.