
NYC Local Law 144 (AEDT): Plain-English Summary + Checklist

Last Updated: January 8, 2026



AI is changing how companies hire, but it also brings risks of bias and unfair practices. New York City has stepped in with a landmark rule: NYC Local Law 144, also known as the Automated Employment Decision Tools (AEDT) law. If you are a startup or recruitment agency using AI for hiring, this law directly affects you.

In this article, we will break down what the law means, why it matters and the exact steps you need to take to stay compliant.

What Is NYC Local Law 144?

NYC Local Law 144 is the first U.S. law requiring bias audits of certain AI-driven hiring tools, making New York City a testing ground for AI accountability in recruitment. It requires companies to operate with far more transparency. Employers and agencies must:

  • Audit their AI hiring tools for bias.

  • Notify candidates whenever an Automated Employment Decision Tool (AEDT) is used.

  • Publish audit results publicly for applicants to review.

This law isn’t just paperwork. It is about protecting fairness. Its intent is clear: reduce hidden AI bias in hiring and make sure technology enhances opportunity instead of reinforcing inequality.

Why Was This Law Created?

AI hiring tools can scan resumes, rank candidates or even analyze video interviews. While efficient, they can unintentionally filter out candidates based on gender, race or other protected characteristics. For example, a resume screener trained on historical data might favor certain schools or job titles, leading to biased shortlists. Video analysis tools may misinterpret accents or penalize non-native English speakers.

Lawmakers in NYC saw that without oversight, these technologies could amplify inequality rather than reduce it. Local Law 144 was created to introduce accountability, transparency and fairness, ensuring AI enhances the hiring process instead of silently discriminating.

Example: If an algorithm learns from past hiring data that skews male, it might unfairly rank male applicants higher than women. NYC lawmakers recognized this risk and passed Local Law 144 to increase transparency and accountability.

Who Must Comply with Local Law 144?

You must comply if:

  • You are an employer or employment agency using an AEDT for hiring or promotion decisions “in the city” (e.g., the job is located in NYC at least part-time, or a fully remote job associated with a NYC office).

  • You use AI tools to substantially assist in hiring or promotion decisions.

This includes:

  • Resume screening software.

  • Chatbots that pre-qualify candidates.

  • Video interview platforms using facial or speech analysis.

  • Ranking algorithms that score candidates.

If you use these tools, you fall under the scope of the NYC AI hiring law.

Key Requirements of Local Law 144

Here are the three main obligations under the law:

1. Bias Audits

  • You must conduct an independent bias audit of your AEDT every year.

  • The audit must check whether the tool discriminates by race, ethnicity or sex.

  • The auditor must be qualified and independent (not the tool provider or the employer).
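The audit itself centers on a simple metric. Under the DCWP rules implementing the law, audits of selection-based AEDTs report an "impact ratio" for each demographic category: that category's selection rate divided by the selection rate of the most-selected category. A minimal sketch of the calculation, with purely illustrative numbers (real audits must use your tool's actual historical data, broken down by the required race/ethnicity and sex categories and their intersections):

```python
# Sketch of the "impact ratio" metric reported in Local Law 144 bias audits.
# Impact ratio = a category's selection rate / the highest category's rate.
# The sample figures below are hypothetical, for illustration only.

def impact_ratios(selected, assessed):
    """selected/assessed: dicts mapping category -> candidate counts."""
    rates = {cat: selected[cat] / assessed[cat] for cat in assessed}
    best = max(rates.values())  # rate of the most-selected category
    return {cat: rate / best for cat, rate in rates.items()}

# Hypothetical audit data: candidates selected vs. assessed, by sex.
selected = {"male": 120, "female": 90}
assessed = {"male": 300, "female": 300}

for cat, ratio in impact_ratios(selected, assessed).items():
    print(f"{cat}: impact ratio {ratio:.2f}")
```

An independent auditor will apply the full DCWP methodology, including intersectional categories and handling of small sample sizes; this sketch only shows the core arithmetic behind the published numbers.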

2. Candidate Notice

  • You must give candidates at least 10 business days’ notice before using an AEDT.

  • Notices must explain:

    • That an AI tool is being used.

    • What the tool does (e.g., resume scoring, video analysis).

    • Which job qualifications and characteristics the tool evaluates.

  • Candidates must also be told they can request an alternative process.
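The 10-business-day clock is easy to miscount by hand. Here is a minimal Python sketch of the deadline calculation, assuming "business days" means weekdays (whether NYC public holidays also pause the clock is a question for counsel, not something this code decides):

```python
# Sketch: earliest date an AEDT may be used after notice is sent,
# counting 10 business days as weekdays (Mon-Fri). Treatment of public
# holidays is an assumption to confirm with counsel.
from datetime import date, timedelta

def earliest_aedt_use(notice_sent: date, business_days: int = 10) -> date:
    d = notice_sent
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return d

# Notice emailed on Monday, 2026-01-05 -> earliest use two weeks later.
print(earliest_aedt_use(date(2026, 1, 5)))
```

In practice, build a buffer into your hiring timeline rather than cutting it to the exact day.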

3. Public Disclosure

  • Results of the most recent bias audit must be posted on your website.

  • You must also disclose the AEDT’s data sources, data retention policy and types of data collected.

What Happens If You Don’t Comply?

Violating Local Law 144 can be costly:

  • Fines range from $500 to $1,500 per violation, per day.

  • Lawsuits from candidates are possible if they feel discriminated against.

  • Startups risk losing investor confidence and slowing down growth.

  • Recruitment agencies may lose client trust and contracts.

  • With other states considering similar rules, ignoring compliance now could leave you unprepared later.

Clients and candidates are already cautious about AI bias in hiring, so proving compliance builds confidence and positions your business as credible and trustworthy.

Plain-English Compliance Checklist

Here’s a simple checklist you can use to make sure you’re on track:

Step 1: Identify Your AI Hiring Tools

  • Do you use resume screeners, chatbots or AI interview software?

  • Make a list of all tools used in hiring or promotion decisions.

Step 2: Schedule a Bias Audit

  • Hire an independent auditor.

  • Ensure the audit checks for race, ethnicity and sex bias.

  • Repeat the audit annually.

Step 3: Update Your Candidate Notices

  • Draft clear candidate notifications.

  • Send them at least 10 business days before using an AEDT.

  • Decide how you will respond when applicants request an alternative process.

Step 4: Publish Audit Results

  • Add a page on your website disclosing:

    • Latest audit findings.

    • Types of data collected by the AEDT.

    • Data sources and retention policy.

Step 5: Train Your Hiring Team

  • Make sure recruiters and HR staff understand the law.

  • Document compliance efforts.

Step 6: Review Tools Regularly

  • Don’t rely blindly on vendors.

  • Ensure the tool provider supports compliance with Local Law 144.

Why Local Law 144 Sets the Tone for Future AI Regulations

NYC Local Law 144 is more than a city ordinance. It is a signal of what’s to come. Across the U.S. and globally, governments are watching how AI impacts hiring and whether it introduces bias. New York City became the first to take a bold step, but it won’t be the last. 

For startups and recruitment agencies, this means compliance is not just a box to check in New York. It is practice for the future of work everywhere. Companies that adapt now will find it easier to scale across multiple regions without disruption. By embedding fairness, transparency and accountability into your hiring systems, you build resilience against evolving regulations.

AI is here to stay and so is regulation. Treat Local Law 144 compliance as an investment, not an obstacle. It safeguards your reputation, protects your candidates and positions your business ahead of the curve in an industry that’s rapidly changing.

Common Questions Startups & Recruiters Ask

1. Do small businesses need to comply?

Yes. If you use an AI tool in hiring for jobs in NYC, even for a handful of roles, you must comply. The law has no small-business exemption.

2. Does this apply outside NYC?

Currently, the law is specific to NYC, but other states and cities are considering similar rules. Complying now puts you ahead of future regulations.

3. What if I only use AI as a minor part of hiring?

If the AI tool substantially assists in decision-making, the law applies. Even if humans make the final call, AI filtering or scoring still counts.

4. Can tool vendors handle compliance for me?

No. Vendors may provide audits or documentation, but the legal responsibility is yours as the employer or agency.

5. Do I need to post audit results publicly?

Yes. Your most recent bias audit results must be posted on your website in a place accessible to candidates. Transparency is a key part of compliance.

Why This Matters for Startups & Recruitment Agencies

For startups, AI tools save time and make it possible to scale hiring with limited resources. But without compliance, they create significant legal and reputational risk. Recruitment agencies face even greater stakes; clients rely on them to deliver fair, unbiased hiring processes and expect strong legal safeguards. 

Demonstrating Local Law 144 compliance is not just about avoiding penalties. It is a way to differentiate your brand. It shows that you value fairness, transparency and long-term trust. In a competitive market, compliance becomes a selling point that helps win clients, attract better talent and keep your business future-proof.

How TeamFill Helps You Stay Compliant

At TeamFill, we understand how complex compliance can feel. That’s why we:

  • Review and document your hiring tools.

  • Connect you with independent auditors.

  • Help draft candidate notices and disclosures.

  • Train your hiring managers for safe AI use.

But we don’t stop there. We also monitor regulatory updates so you’re never caught off guard as new rules emerge. Our team works with both startups and recruitment agencies, tailoring compliance strategies to your specific workflows and technology stack. With us, you can streamline hiring, reduce risk and build trust with candidates knowing you are always in line with NYC Local Law 144.

Final Thoughts

The NYC AI hiring law isn’t a passing trend. It is the first step in how AI regulations will shape recruitment across the U.S. Local Law 144 sends a clear signal: companies can no longer rely on AI tools without accountability. For startups, it is about protecting growth while building trust with candidates. 

For recruitment agencies, compliance is now part of your value proposition. Acting early shows you’re serious about fairness and transparency. Don’t wait until fines or reputational damage force your hand. Start building a compliant process today.

Not sure what you need? Paulius will help you find the best solution for your team.

Talk to a TeamFill expert

