In an era where artificial intelligence (AI) is transforming every industry, hiring is no exception. From resume screening to video interview analysis, AI tools and automated decision systems are now central to how many companies evaluate job applicants.
But what happens when those systems discriminate? Concerns are growing that these technologies may unintentionally favor or exclude certain protected classes, leading to potential employment discrimination under California law.
If you’re a California job seeker, understanding your rights under recent state regulations — and knowing how to identify unlawful discrimination — is essential.
At AMG Law, our employment attorneys are committed to protecting the rights of workers and applicants facing bias in hiring, whether caused by a human decision or an algorithm.
How AI Tools Are Used in Hiring
Modern California employers use a variety of automated systems to streamline the hiring process. These include:
- Resume-screening algorithms that rank candidates based on keywords or experience.
- Computer-based and predictive assessments that evaluate a candidate’s skills or personality traits.
- Online interviews analyzed through machine learning models that interpret facial expressions, tone of voice, and word choice.
- Automated decision systems that help hiring managers make final decisions about applicants.
In theory, these AI systems are designed to improve efficiency and remove human bias. In practice, however, they often replicate the same discriminatory patterns that anti-discrimination laws were created to prevent.
When Technology Reinforces Bias
While artificial intelligence can analyze vast amounts of data, it only performs as fairly as the information it’s trained on. If historical data reflects unequal treatment of protected groups, AI tools may perpetuate those biases.
For instance, if a company’s past hiring data underrepresents women, older applicants, or individuals from certain national origins, the system’s data processing techniques may “learn” that those applicants are less desirable.
Common examples of discriminatory outcomes include:
- Automatically rejecting applicants with foreign-sounding names (national origin bias).
- Lowering rankings for graduates of certain schools or regions.
- Using facial expressions or tone analysis that unfairly penalizes candidates with disabilities or speech differences.
- Directing job advertisements away from protected classes through targeted algorithms.
This kind of discrimination is prohibited under the Fair Employment and Housing Act (FEHA), which applies equally to human decision making and automated systems.
California’s New Focus on AI and Discrimination
California has taken proactive steps to regulate the use of AI in employment. In 2024, the California Civil Rights Council (CCRC) proposed new regulations under the Fair Employment and Housing Act to address bias in automated decision-making tools.
These regulations define AI-driven systems broadly, covering any computational process or data processing technique — including machine learning and predictive assessments — that facilitates human decision making related to employment decisions.
Under these new rules, California employers must:
- Conduct anti-bias testing and regular audits of AI tools.
- Maintain transparency about how automated decision systems are used.
- Ensure AI tools do not discriminate based on protected characteristics such as race, gender, age, disability, or religion.
- Allow applicants to request an alternative human oversight review when automated systems are used.
The CCRC’s rules align with growing federal attention from the Equal Employment Opportunity Commission (EEOC), which has issued similar guidance nationwide.
The “Robo Bosses Act” and California’s Legislative Push
The California Legislature is also considering the “Robo Bosses Act,” a bill that would regulate the use of AI systems in both hiring and ongoing workplace management.
This proposed law would require employers to:
- Disclose when automated decision systems are being used.
- Limit data processing to information relevant to employment decisions.
- Perform annual anti-bias testing to prevent discrimination.
- Demonstrate that AI tools truly facilitate human decision making rather than replace it.
If passed, it would add a new layer of accountability for employers using AI to screen applicants, analyze employee performance, or decide who receives an employment benefit.
What Counts as Unlawful Discrimination
California’s Fair Employment and Housing Act (FEHA) prohibits employment discrimination based on protected characteristics such as race, gender, disability, religion, sexual orientation, or national origin.
These protections apply equally whether the bias originates from a human supervisor or an algorithm. In other words, a company cannot escape liability by claiming, “The computer made the decision.”
If AI tools or automated decision systems reject job seekers or employees due to biased selection criteria, that may constitute unlawful discrimination under California law.
How Employers Are Expected to Respond
The California Civil Rights Council expects California employers to take reasonable steps to avoid unlawful discrimination when using AI.
That includes:
- Testing AI systems for discriminatory outcomes before and during deployment.
- Retaining documentation on data processing techniques and selection criteria.
- Conducting anti-bias testing and ongoing review of employee or applicant data.
- Ensuring that human oversight remains central to all employment decisions.
Failure to do so can expose a covered entity (such as a business or staffing agency) to discrimination claims and significant penalties.
Job Seekers’ Rights Under California Law
If you’re a job seeker in California, you have the right to be evaluated fairly — regardless of whether your application is reviewed by a person or a machine.
Under California law, you can:
- Request information about whether AI tools were used in your hiring process.
- File a complaint with the California Civil Rights Council if you suspect algorithmic bias.
- Pursue discrimination claims under the Fair Employment and Housing Act (FEHA).
- Seek damages, reinstatement, or other remedies if you were denied employment because of unlawful discrimination.
Even when AI systems are involved, employers must follow the same legal standards as with any other employment decision.
What Job Seekers Should Watch For
If you suspect that an employer’s AI tools played a role in unfairly rejecting your application, keep an eye out for signs such as:
- Unexplained rejections despite meeting job qualifications.
- Job ads that never appear for certain demographics.
- Online interviews evaluated by software that focuses heavily on facial expressions or tone.
- Lack of transparency about how your application was processed.
Document everything — screenshots, job postings, and correspondence — as this may later serve as evidence in a potential discrimination claim.
How a California Employment Attorney Can Help
Navigating AI bias in hiring requires legal knowledge that spans both technology and employment law. A skilled employment lawyer can help you determine whether you have grounds for a discrimination claim and guide you through your options under California law.
At AMG Law, our attorneys can:
- Investigate how automated decision systems were used in your case.
- Request disclosure of data processing techniques and internal AI audits.
- Identify violations of the Fair Employment and Housing Act (FEHA).
- File a claim with the California Civil Rights Council or pursue litigation against California employers.
We represent job seekers who’ve faced bias from AI tools, hiring managers, and automated hiring practices that fail to respect fairness and equality.
The Future of AI and Fair Employment
California is leading the nation in holding employers accountable for their use of AI in hiring decisions. As the technology advances, new regulations and enforcement efforts will continue to shape how automated systems are governed.
While AI tools can improve efficiency, they must be designed and monitored to prevent discrimination, maintain transparency, and ensure they don’t replace human decision making altogether.
The bottom line: Job seekers have rights — even in an age of robo recruiters and automated decision systems.
Protecting California Job Seekers in the AI Era
The rise of AI in hiring brings opportunities and risks. While automated decision-making tools can streamline processes, they can also create unseen barriers for qualified applicants.
If you believe you’ve experienced discrimination from an employer’s use of artificial intelligence, it’s critical to act quickly. You may have rights under the California Civil Rights Council’s regulations and the Fair Employment and Housing Act (FEHA).
Contact AMG Law at 323-746-1853 today for a free consultation.
Our experienced employment attorneys will review your situation, explain your rights, and help you pursue justice if AI systems or automated decision tools were used unfairly during your hiring process.