Compliance

NYC Local Law 144 for AI Hiring: What You Need to Know in 2026

ScreenDesk Team · 7 min read

NYC Local Law 144 took effect on January 1, 2023, with enforcement beginning July 5, 2023. It remains the most significant regulation governing AI in hiring in the United States. Nearly three years into enforcement, compliance remains inconsistent -- a 2025 survey by Littler Mendelson found that only 54% of companies using AI hiring tools for NYC candidates had completed the required bias audit.

The consequences of non-compliance are not theoretical. The NYC Department of Consumer and Worker Protection (DCWP) has levied fines ranging from $500 to $1,500 per violation, with each candidate screened by a non-compliant tool counting as a separate violation. For a company screening 200 NYC candidates per month, that exposure can exceed $300,000 per month.
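The exposure figure above follows directly from the per-violation fine range. As a rough sketch (the fine amounts come from the article; the candidate volume is an illustrative assumption):

```python
# Illustrative LL144 fine-exposure estimate. Each candidate screened by a
# non-compliant tool counts as a separate violation; fines run from $500
# (first violation) up to $1,500 (repeat violations).

def monthly_exposure(candidates_per_month: int, fine_per_violation: int) -> int:
    """Worst-case monthly exposure: one violation per screened candidate."""
    return candidates_per_month * fine_per_violation

# 200 NYC candidates per month at the $1,500 repeat-violation rate:
print(monthly_exposure(200, 1500))  # 300000
```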

If you use any automated tool in your hiring process and have candidates in New York City, this law applies to you. This guide covers what the law requires, who it covers, and exactly what you need to do to comply.

What the Law Actually Says

Local Law 144 regulates the use of Automated Employment Decision Tools (AEDTs). The law has three core requirements:

  1. Bias Audit. Any AEDT must undergo an independent bias audit no more than one year before its use. The audit must be conducted by an independent auditor -- not the vendor that built the tool.

  2. Candidate Notification. Employers must notify candidates that an AEDT will be used in their evaluation at least 10 business days before the tool is applied. Candidates must be informed of the job qualifications and characteristics the tool will assess.

  3. Published Results. The results of the most recent bias audit must be publicly available on the employer's website.

The law defines an AEDT as any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues a simplified output -- such as a score, classification, or recommendation -- used to substantially assist or replace discretionary decision-making in hiring or promotion.

That definition is deliberately broad. It covers resume screening algorithms, AI-powered assessments, chatbot screeners, conversational AI interviews, and any other tool that uses a computational model to evaluate candidates. If your tool takes candidate data as input and produces any form of score or ranking as output, it is almost certainly an AEDT under this law.

Who Does This Apply To?

The jurisdictional scope is broader than many companies realize.

It applies to you if:

  • You are an employer based in NYC using an AEDT
  • You are an employer based anywhere that uses an AEDT to evaluate candidates for positions located in NYC
  • You are an employment agency using an AEDT to evaluate candidates for NYC-based clients
  • You use an AEDT to evaluate candidates who reside in NYC, even if the position is remote

Common misconceptions:

  • "We're based in San Francisco, so NYC laws don't apply to us." Wrong. If you screen candidates who are in NYC for roles that could be performed in NYC, the law applies.
  • "Our vendor handles compliance." Wrong. The law places the obligation on the employer, not the vendor. Your vendor's compliance does not satisfy your obligation.
  • "We only use AI to rank candidates, humans make the final decision." Wrong. The law covers tools that "substantially assist" decisions, not just tools that make autonomous decisions.

The DCWP's enforcement guidance has made clear that the threshold for "substantially assist" is low. If the tool's output meaningfully influences which candidates advance or are rejected, it qualifies.

The Bias Audit Requirement

The bias audit is the most substantive compliance requirement and the one that creates the most confusion.

What the Audit Must Test

The audit must calculate selection rates and impact ratios for the AEDT across several demographic categories:

  • Sex categories (male, female, non-binary/other)
  • Race/ethnicity categories (Hispanic/Latino, White, Black/African American, Native Hawaiian/Pacific Islander, Asian, Native American/Alaska Native, two or more races)
  • Intersectional categories (the combination of sex and race/ethnicity)

The impact ratio compares the selection rate of each group to the selection rate of the most-selected group. For example, if 60% of male candidates are advanced by the tool and 48% of female candidates are advanced, the impact ratio for female candidates is 0.80 (48/60).

What "Passing" Looks Like

The law does not define a specific pass/fail threshold. It requires that the impact ratios be calculated and published. However, the widely accepted benchmark is the EEOC's four-fifths rule: an impact ratio below 0.80 (80%) is considered evidence of adverse impact and may trigger further scrutiny.

An impact ratio below 0.80 does not automatically mean the tool is illegal or that you cannot use it. But it means you should investigate why the disparity exists, whether it is job-related, and whether there are less discriminatory alternatives.
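The impact-ratio arithmetic is simple enough to sketch in a few lines of Python. The counts below are hypothetical, chosen to match the 60%/48% example above, and the 0.80 threshold reflects the EEOC four-fifths benchmark:

```python
# Sketch of an LL144-style impact-ratio calculation (hypothetical counts,
# not data from any real audit).
# selection rate = candidates advanced / candidates screened, per group
# impact ratio   = group's selection rate / highest group's selection rate

selected = {"male": 120, "female": 96}   # advanced by the tool
screened = {"male": 200, "female": 200}  # total evaluated

rates = {g: selected[g] / screened[g] for g in screened}
top = max(rates.values())
impact_ratios = {g: round(rates[g] / top, 2) for g in rates}

print(impact_ratios)  # {'male': 1.0, 'female': 0.8}

# Four-fifths rule: ratios below 0.80 are evidence of adverse impact
flagged = [g for g, r in impact_ratios.items() if r < 0.80]
print(flagged)  # []
```

A real audit repeats this per sex, per race/ethnicity, and per intersectional category, and must handle small-sample groups carefully.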

Who Can Conduct the Audit

The auditor must be independent. The law defines independence as not being involved in the development or distribution of the AEDT. Specifically:

  • The vendor that built the tool cannot audit it
  • A consultant hired by the vendor cannot audit it
  • An internal data science team at the employer cannot audit it (they are not independent)

The auditor should be a third-party firm with expertise in statistical analysis and employment law. Several firms now specialize in LL144 bias audits, including ORCAA, Holistic AI, and Credo AI.

Audit Cost and Timeline

Factor | Typical Range
Simple AEDT (single model, binary output) | $5,000 - $10,000
Complex AEDT (multiple models, multiple outputs) | $15,000 - $30,000
Timeline from engagement to completed audit | 4 - 8 weeks
Annual renewal cost | 60 - 80% of initial audit cost

Plan for the audit well in advance. The 4-8 week timeline assumes the auditor can access the necessary data promptly. If your vendor is slow to provide data or if there are data quality issues, the timeline can stretch to 12 weeks or more.

Candidate Notice Requirements

The notification requirement has specific parameters that many companies get wrong.

Timing

Candidates must be notified at least 10 business days before the AEDT is used in their evaluation. This is 10 business days, not calendar days -- two full working weeks, or roughly 14 calendar days.
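A minimal helper can compute the earliest permissible use date from a notice date. This sketch skips weekends only; a production workflow would also need to account for holidays if counting them as non-business days:

```python
# Earliest date an AEDT may be applied, given the date notice was sent.
# Simplified: counts Monday-Friday as business days, ignores holidays.

from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    d = start
    while days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            days -= 1
    return d

# Notice sent Monday 2026-03-02 -> tool usable no earlier than:
print(add_business_days(date(2026, 3, 2), 10))  # 2026-03-16
```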

Required Content

The notice must include:

  • That an AEDT will be used in the hiring process
  • The job qualifications and characteristics the AEDT will assess
  • Information about the data sources used by the AEDT
  • Instructions for how to request an alternative selection process or accommodation

How to Provide Notice

The law permits notice through:

  • The job posting itself
  • Email to the candidate after application
  • A notice on the employer's careers page

Best practice is to include the notice in at least two places: the job posting and a dedicated email sent upon application submission. This creates clear documentation that the candidate was notified within the required timeframe.

Alternative Process

The law requires that you allow candidates to request an alternative selection process. This does not mean you need to offer a completely different hiring process. The DCWP guidance clarifies that providing access to a human reviewer who can conduct a comparable evaluation satisfies this requirement.

Document every alternative process request and how it was handled. These records are essential in the event of an audit or complaint.

Common Mistakes Companies Make

Based on enforcement actions and industry surveys, these are the most frequent compliance failures.

1. Assuming the Law Does Not Apply

The most common mistake. Many companies outside NYC do not realize the law applies to them if they screen NYC-based candidates. Remote-first companies are particularly exposed -- if you hire remote workers and do not exclude NYC from your candidate pool, LL144 likely applies.

2. Relying on the Vendor's Audit

Your AI screening vendor may have conducted a bias audit on their tool. That is helpful context, but it does not satisfy your obligation under the law. The audit must be conducted on the employer's behalf, using the employer's data and candidate pool. A generic vendor audit based on their aggregate data across all clients is not sufficient.

3. Not Updating the Audit Annually

The law requires that the bias audit be conducted "no more than one year" before the AEDT is used. This means annual renewal is mandatory. Set a calendar reminder at least 8 weeks before your audit expires to begin the renewal process.

4. Providing Insufficient Candidate Notice

Common failures include: sending the notice after the AEDT has already been applied (violates the 10-business-day requirement), providing a generic notice that does not specify what the tool evaluates, and not offering an alternative process option.

5. Not Publishing Audit Results

The law requires that audit results be posted on the employer's website. Not buried in a PDF link on a subpage -- accessible and findable. Many companies complete the audit but fail to publish the results prominently enough.

6. Treating It as a One-Time Project

LL144 compliance is ongoing. Annual audits, updated candidate notices when tools change, published results refreshed each year, documentation of alternative process requests -- this requires a compliance process, not a one-time project.

A Compliance Checklist

Use this checklist to assess and maintain your LL144 compliance.

# | Action Item | Frequency | Owner
1 | Inventory all AEDTs used in hiring/promotion decisions | Quarterly | HR/Legal
2 | Engage independent auditor for bias audit | Annually | Legal/Procurement
3 | Provide candidate data to auditor (selection rates by sex and race/ethnicity) | Annually | HR/Data
4 | Review audit results and remediate any adverse impact ratios | Annually | HR/Legal
5 | Publish audit summary on company website | Annually (update upon new audit) | Marketing/Legal
6 | Update candidate notice language in job postings and application flow | As tools change | Recruiting/Legal
7 | Implement and document alternative process pathway | Ongoing | Recruiting
8 | Maintain records of all candidate notices and alternative process requests | Ongoing | Recruiting

What's Coming Next

LL144 was the first. It will not be the last. The regulatory trajectory is clear: AI in hiring is moving from unregulated to heavily regulated. Here is what is on the horizon.

EU AI Act (August 2026)

The EU AI Act classifies AI systems used in employment decisions as "high-risk." Requirements include conformity assessments, risk management systems, data governance obligations, human oversight requirements, and transparency obligations. Unlike LL144, the EU AI Act applies to any company that deploys AI systems affecting people in the EU -- regardless of where the company is based. The penalties are significantly steeper: up to 35 million euros or 7% of global annual revenue, whichever is higher.

Illinois Artificial Intelligence Video Interview Act (AIVIA)

Illinois already requires consent before using AI to analyze video interviews. Proposed amendments would expand this to all AI-based hiring tools, require bias testing, and mandate data destruction within specific timelines. Illinois has historically been aggressive on employment technology regulation -- its video interview AI restrictions, the first of their kind, took effect in 2020.

Colorado AI Act (February 2026)

Colorado's law requires developers and deployers of "high-risk AI systems" -- including employment tools -- to use reasonable care to avoid algorithmic discrimination. Deployers must conduct impact assessments, provide candidate notice, and maintain records. The Colorado Attorney General has enforcement authority with the ability to pursue deceptive trade practices claims.

Other States

California, New York State (distinct from NYC), Maryland, New Jersey, and Washington have all introduced AI hiring legislation in some form. The direction is consistent: bias audits, candidate transparency, human oversight, and accountability.

Federal Activity

The EEOC has issued guidance confirming that existing employment discrimination law (Title VII, ADA) applies to AI-based hiring decisions. While no federal AI hiring law is imminent, the EEOC has signaled increased enforcement focus on algorithmic discrimination.

Building a Future-Proof Compliance Posture

Rather than treating each regulation as a separate compliance project, build a framework that addresses the common requirements across all of them:

  1. Maintain an inventory of every AI tool used in your hiring process, including embedded features in your ATS.
  2. Conduct annual bias audits on all tools, not just the ones currently required by law. If the EU AI Act or Colorado law applies to you next year, you will already be compliant.
  3. Default to transparency. Tell every candidate that AI is used in their evaluation, what it evaluates, and how to opt out. This costs nothing and satisfies notice requirements across all current and proposed regulations.
  4. Choose tools with built-in audit trails. Every AI-generated score should link to the evidence that produced it. This is essential for both bias audits and individual candidate inquiries.
  5. Document everything. Audit results, candidate notices, alternative process requests, remediation actions. The cost of documentation is trivial compared to the cost of proving compliance after the fact.

Conclusion

NYC Local Law 144 is not a niche regulation for companies headquartered in Manhattan. It applies to any organization using automated tools to screen candidates in New York City -- which, given the concentration of talent in NYC and the prevalence of remote hiring, covers a very large number of employers.

Compliance is not optional, and the enforcement trend is toward more scrutiny, not less. The companies that treat LL144 as a floor -- building their AI hiring compliance above what is currently required -- will find themselves ahead of the curve as the EU AI Act, Colorado AI Act, and other regulations take effect.

Start with the checklist above. Get your audit done. Update your candidate notices. Publish your results. Then build the ongoing process to keep it current. The work is not complicated. It just needs to be done.

Ready to transform your screening process?

Join the waitlist for early access to AI-powered candidate screening.
