Litigation

Automation Accountability: What the Latest Class Action Ruling Means for Corporate AI

A recent federal court decision allowing a collective action over AI-driven employment screening underscores the expanding liability landscape for companies that rely on automated decision-making systems.

A Wake-Up Call on AI-Based Decision-Making

Last week, a U.S. District Court allowed a collective action to proceed against Workday, Inc. over allegations that its artificial-intelligence résumé-screening tool discriminated against older applicants.

Workday operates a platform that allows businesses to post job notices and accept applications. In its marketing materials, Workday claimed that its technology "utilizes artificial intelligence to parse an employer’s job posting and an applicant’s application and/or resume; extract skills in the employer’s job posting, on the one hand, and skills from the application and/or resume on the other hand; and determine the extent to which the applicant’s skills match the role to which they applied."

The plaintiffs, all over the age of 40, sued under the federal Age Discrimination in Employment Act ("ADEA"), alleging that they received automated rejections for more than 100 positions for which they met the stated qualifications. The decision marks one of the first times a court has allowed plaintiffs to challenge an AI-based screening tool at scale. Notably, the case was brought not against the prospective employers, but against the vendor of the HR platform those employers used.

Why Liability Is Growing

  1. Accountability of AI Providers – Courts are increasingly willing to hold providers of AI technology liable alongside the employers who use their products, based on the representations the providers make about how their products function.
  2. Delegated Discrimination – Traditional anti-bias statutes (Title VII, ADEA, ADA, etc.) apply even if the discrimination is carried out by code rather than people. Automating a biased process does not immunize an employer or vendor.
  3. Negligent Deployment – Beyond intentional discrimination claims, plaintiffs are pleading negligence: failing to validate an AI tool, ignoring disparate-impact testing, or relying on vendors without contractual assurances.

Practical Takeaways for Companies

  1. Validate Before Deploying – Test automated screening tools for disparate impact before and during use, and document the results.
  2. Scrutinize Claims About Functionality – Marketing representations about what an AI product does can become evidence in litigation; ensure that your own and your vendors' claims are accurate.
  3. Secure Contractual Assurances – Obtain warranties, audit rights, and indemnification from AI vendors that address discrimination and compliance risks.
  4. Keep Humans in the Loop – Preserve meaningful human review of automated decisions rather than treating the tool as a black box.

Looking Ahead

The court’s willingness to certify a class signals that AI-related disputes will no longer be confined to isolated individual claims. As regulators and plaintiffs’ lawyers gain fluency in machine-learning concepts, companies that treat AI as a black-box shortcut will face mounting legal exposure. The safest path forward is to treat algorithmic decision-making with the same rigor as any other high-risk business process, and perhaps with greater transparency.

Who is Dev Legal?

Sabir Ibrahim

Managing Attorney

During his 18-year career as an attorney and technology entrepreneur, Sabir has advised clients ranging from pre-seed startups to Fortune 50 companies on a variety of issues within the intersection of law and technology. He is a former associate at the law firm of Greenberg Traurig, a former corporate counsel at Amazon, and a former senior counsel at Roku. He also founded and managed an IT managed services provider that served professional services firms in California, Oregon, and Texas.

Sabir is also co-founder of Chinstrap Community, a free resource center on commercial open source software (COSS) for entrepreneurs, investors, developers, attorneys, and others interested in open source software entrepreneurship.

Sabir received his BSE in Computer Science from the University of Michigan College of Engineering. He received his JD from the University of Michigan Law School, where he was an article editor of the Michigan Telecommunications & Technology Law Review.

Sabir is licensed to practice in California and before the United States Patent & Trademark Office (USPTO). He formerly held the Certified Information Privacy Professional (CIPP/US) certification.

What can Dev Legal do for you?

Areas Of Expertise

We aim to advise clients in a manner that minimizes noncompliance risks without compromising operational efficiency or business interests. The areas in which we assist clients, either alone or in collaboration with affiliates, include:

Technology License Agreements

Drafting, reviewing, and negotiating software licenses, SaaS agreements, and other technology contracts.

Open Source Software Matters

License compliance, contribution policies, and open source business strategy.

SaaS Agreements

Subscription agreements, terms of service, and service level agreements for cloud-based services.

Intellectual Property Counseling

Trademark, copyright, and patent strategy for technology companies.

Product Counseling

Legal review of product features, marketing materials, and compliance with regulations.

Terms of Service and Privacy Policies

Creating and updating legal documents for websites and applications.

Assessment of Contractual Requirements

Reviewing obligations and ensuring compliance with complex agreements.

Information Management Policies

Data governance, retention policies, and information security procedures.

Risk Mitigation Strategy

Identifying legal risks and developing strategies to minimize exposure.

Join Our Email Newsletter List And Receive Our Free Compliance Explainer

Our one-page Dev Legal Compliance Explainer is an easy-reference guide to understanding compliance concepts for you or your clients. Our email newsletter includes information about news and recent developments in the technology regulatory landscape and is sent approximately once a month.

Contact Us

Get In Touch

Phone

510.255.3766

Mail

PO Box 721
Union City, CA 94587