Introduction
AI is transforming how companies hire, helping teams screen resumes faster, prioritize top candidates, and reduce manual work. But as more HR tech relies on automation, one issue keeps rising to the top:
Are these tools fair? Are they introducing bias? Are they even legal?
That’s where bias audits come in, and if you’re using AI in recruiting, they’re quickly becoming a legal requirement, not just a nice-to-have.
At Brainner, we believe responsible AI hiring starts with transparency and accountability. That’s why we’ve partnered with Holistic AI, a leading AI audit provider, to independently test and validate our system — so you can use Brainner confidently, knowing it meets the highest compliance and ethical standards.
Bias Regulations Are Already Here — And More Are Coming
If your company hires in the U.S., you’ve probably heard about new AI hiring laws being passed across states and cities. Here are some of the most notable:
New York City – Local Law 144
As of July 2023, any company using automated employment decision tools (AEDTs) — including AI resume screening — must:
- Conduct an annual bias audit by an independent third party
- Publish the results publicly
- Notify candidates that AI is being used in their evaluation
California – AB 331 (In Progress)
California is working on similar legislation, which would:
- Require impact assessments for automated decision tools
- Include transparency around data and logic used
- Provide candidates with the right to contest decisions made with AI
Other states (like Illinois and Maryland) already have laws in place related to video screening and algorithmic assessments — and more are coming soon.
🚫 Brainner’s No-Bias Approach to AI Resume Screening
From day one, we designed Brainner to support recruiters, not replace them, and to give you full visibility and control over every screening decision.
Here’s how we minimize bias by design:
- No black-box scores: Every candidate is evaluated based on job-specific criteria you define, not a generic score.
- Human in the loop: Final decisions are always made by humans.
- Customizable weighting: You decide which requirements matter most — and how much.
- Transparent logic: Recruiters can always see why a candidate was ranked the way they were.
But we didn’t stop there.
✅ Independent AI Bias Audit — Powered by Holistic AI
To go beyond promises, we partnered with Holistic AI, a globally recognized provider of algorithmic audits, to test Brainner’s system for bias and fairness.
The result?
Brainner has successfully passed a formal AI bias audit, meeting the standards set by laws like NYC Local Law 144 — and helping our clients stay compliant while using AI responsibly.
Our bias audit includes:
- Statistical testing across protected characteristics (gender, ethnicity, etc.)
- Evaluation of selection rates and disparate impact
- Ongoing compliance readiness for future regulatory updates
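To make the "selection rates and disparate impact" step concrete, here is a minimal, illustrative sketch of the kind of impact-ratio calculation NYC Local Law 144 requires, where a category's selection rate is divided by the rate of the most-selected category. The group names and counts are hypothetical example data, and this is a simplified sketch, not Brainner's or Holistic AI's actual audit methodology:

```python
# Illustrative disparate-impact check. Groups and counts are
# hypothetical; a real Local Law 144 audit uses historical data
# broken out by sex, race/ethnicity, and intersectional categories.

def selection_rates(selected, total):
    """Selection rate per group: candidates advanced / candidates evaluated."""
    return {g: selected[g] / total[g] for g in total}

def impact_ratios(rates):
    """Impact ratio per Local Law 144: each group's selection rate
    divided by the selection rate of the most-selected group."""
    best = max(rates.values())
    return {g: rates[g] / best for g in rates}

selected = {"group_a": 48, "group_b": 30}   # hypothetical counts
total    = {"group_a": 100, "group_b": 80}

rates = selection_rates(selected, total)
ratios = impact_ratios(rates)

for g, r in sorted(ratios.items()):
    # The EEOC's informal "four-fifths rule" flags ratios below 0.8
    # as a possible indicator of adverse impact worth reviewing.
    status = "review" if r < 0.8 else "ok"
    print(f"{g}: rate {rates[g]:.3f}, impact ratio {r:.3f} ({status})")
```

In this hypothetical data, group_b's impact ratio falls just below 0.8, which is the kind of statistical signal an independent auditor would investigate further.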
We’re proud to be one of the first AI screening tools with this level of transparency and auditability.
Why It Matters for You
If you’re using AI to help screen resumes, you need to be able to:
- Prove your process is fair
- Respond to legal scrutiny
- Protect your employer brand and candidate trust
With Brainner, you can do that — with confidence.
👣 What’s Next: Compliance as a Standard
We believe AI bias audits will soon be standard practice across all recruiting tools. The companies that lead on this now will be ahead of the curve — in compliance, in fairness, and in hiring outcomes.
Brainner is proud to lead the way, and we’ll continue investing in transparency, ethics, and partnerships (like Holistic AI) to ensure our users are always protected.
💬 Want to Learn More?
If you’re a TA leader, legal/compliance stakeholder, or HR tech buyer evaluating tools for bias, we’d love to show you how Brainner works — and how we built fairness into every layer of the product.
Save up to 40 hours per month
HR professionals using Brainner to screen candidates are saving up to five days per month on manual resume reviews.