California’s new AI regulations went into effect on October 1, 2025. They change how companies are permitted to use automated decision-making tools in hiring and employment decisions, and they require employers to carefully consider how these tools are currently being used and to take positive steps to reduce the risk of bias.
In this post, we’ll outline these new regulations and a compliance roadmap for employers, along with some suggested practices.
Regulations Prohibit Employers from Using Automated Decision Systems (“ADS”) or Selection Criteria that Discriminate Against Applicants or Employees as of October 1, 2025
These new regulations amend the California Fair Employment and Housing Act (FEHA) and apply to employers that ‘regularly employ five or more employees’ and that use (or that hire third parties who use) automated systems to make or aid hiring and employment decisions.
Specifically, the new regulations:
- Prohibit discriminatory impacts that stem from the use of automated decision systems (ADS), as they relate to protected characteristics under FEHA. The focus on impact is a key change because it could mean that even unintentional discriminatory actions expose your company to legal risk.
- Outline documentation and notice requirements for employers that use ADS in hiring and employment processes, including documentation showing that the tools have been tested for bias against protected groups.
As a reminder, the protected characteristics in California are:
- Race, color
- Ancestry, national origin
- Religion, creed
- Age (40 and over)
- Disability, mental and physical
- Sex, gender (including pregnancy, childbirth, breastfeeding or related medical conditions)
- Sexual orientation
- Gender identity, gender expression
- Medical condition
- Genetic information
- Marital status
- Military or veteran status
- Reproductive health decision-making.
What Are Automated Decision Systems?
The regulations define Automated Decision Systems as “a computational process that makes a decision or facilitates human decision making regarding an employment benefit, as defined in section 11008(i) of these regulations.”
ADS may be derived from and/or use:
- Artificial intelligence
- Machine-learning
- Algorithms
- Statistics
- Other data processing techniques.
(1) Automated-Decision Systems perform tasks such as:
(A) Using computer-based assessments or tests, such as questions, puzzles, games, or other challenges to:
(i) Make predictive assessments about an applicant or employee;
(ii) Measure an applicant’s or employee’s skills, dexterity, reaction-time, and/or other abilities or characteristics;
(iii) Measure an applicant’s or employee’s personality trait, aptitude, attitude, and/or cultural fit; and/or
(iv) Screen, evaluate, categorize, and/or recommend applicants or employees.
(B) Directing job advertisements or other recruiting materials to targeted groups;
(C) Screening resumes for particular terms or patterns;
(D) Analyzing facial expression, word choice, and/or voice in online interviews; or
(E) Analyzing employee or applicant data acquired from third parties.
Notably, the regulations exclude from the ADS definition the following:
Word processing software, spreadsheet software, map navigation systems, web hosting, domain registration, networking, caching, website loading, data storage, firewalls, anti-virus, anti-malware, spam- and robocall-filtering, spell-checking, calculators, database, or similar technologies, provided that these technologies do not make a decision regarding an employment benefit.
Action Items for Companies Using ADS Tools or Third Parties in Hiring Processes
The first and most important practice is to ensure that a designated employee is responsible for managing any automated system in place. This is a prudent practice generally, but it is now essential for hiring and employment decisions.
In addition, there are five key compliance steps companies should implement immediately if ADS are being used in their hiring processes:
Inventory and Audit All Automated Decision-Making Systems.
You first need to know which tools are deployed in your hiring and employment decisions. Note that these regulations do not apply only to resume screening and other steps leading up to a hire; they also apply to promotions, performance evaluations, compensation, benefits, and similar employment decisions.
For each ADS, document how and why it is used, the specific employment decision it impacts, the level of human oversight in the process, and the data it uses to make decisions (e.g. scoring, labelling, predicting outcomes). Companies may also wish to prioritize and focus on each tool in order of risk, based on factors such as how often the tools are used and the quality of the data used to train the AI.
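The regulations do not require any particular format for this documentation, but capturing it in a structured, consistent way makes later audits easier. As a minimal sketch only, the Python snippet below shows one possible shape for an inventory record; the class, field names, and example tool are hypothetical illustrations, not anything prescribed by the regulations.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ADSInventoryRecord:
    """Illustrative inventory entry for one automated decision system (hypothetical schema)."""
    tool_name: str                      # vendor or internal tool name
    vendor: str                         # third party supplying the tool, if any
    purpose: str                        # how and why the tool is used
    employment_decision: str            # decision it impacts (hiring, promotion, compensation, ...)
    human_oversight: str                # level of human review in the process
    data_inputs: List[str] = field(default_factory=list)   # data it uses (scores, labels, predictions)
    usage_frequency: str = "unknown"    # how often the tool is used (a rough risk factor)
    training_data_notes: str = "unknown"  # notes on the quality of the data used to train the AI

# Example entry for a hypothetical resume-screening tool
resume_screener = ADSInventoryRecord(
    tool_name="ResumeRank",             # hypothetical name
    vendor="Example Vendor, Inc.",
    purpose="Ranks applicants by match to the job description",
    employment_decision="Initial applicant screening",
    human_oversight="Recruiter reviews all candidates scored above threshold",
    data_inputs=["resume text", "match score"],
    usage_frequency="every open requisition",
)
```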
Implement and Document Anti-Bias Testing.
This step is important because evidence of anti-bias testing or similar efforts to avoid unlawful discrimination can help mitigate risk. The law outlines that “the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results” will be relevant to any defense advanced in the event of a legal claim.
The scope of testing isn’t outlined in the regulations, but companies may choose to proactively run separate adverse impact analyses for each ADS used and consider whether outcomes show any potential biases against protected classes. For each test, the methodology should be documented carefully.
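Because the regulations do not prescribe a testing methodology, the sketch below is only one illustration of how an adverse impact check might be run: it compares each group’s selection rate against the highest group’s rate, in the spirit of the familiar four-fifths guideline. The function, the data format, and the 0.8 flag mentioned in the comments are our assumptions, not requirements of the new rules.

```python
from collections import defaultdict

def adverse_impact_ratios(outcomes):
    """Compute selection rates per group and each group's ratio to the
    highest group's rate (a common 'four-fifths rule' style screen)."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in outcomes:
        total[group] += 1
        if was_selected:
            selected[group] += 1

    rates = {g: selected[g] / total[g] for g in total}
    top_rate = max(rates.values())
    # A ratio below 0.8 is a conventional flag for potential adverse impact,
    # not a legal threshold defined by the new regulations.
    return {g: rate / top_rate for g, rate in rates.items()}

# Hypothetical ADS screening outcomes: (group label, advanced to next stage?)
outcomes = [("Group A", True), ("Group A", True), ("Group A", False),
            ("Group B", True), ("Group B", False), ("Group B", False)]
print(adverse_impact_ratios(outcomes))  # e.g. {'Group A': 1.0, 'Group B': 0.5}
```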
Update Recordkeeping and Data Retention Processes and Policies.
The regulations require employers to keep relevant records for four years. We suggest employers update their internal processes to ensure that records are stored for any ADS, including inputs, outputs, selection metrics, training data, override logs (from the employee managing the process), audit results, and any actions taken to remediate identified biases.
Given that using a third party to make these decisions is not a defense, it’s also important to collect and store information about any third parties’ usage of ADS.
Outline Accommodation Request Processes.
Companies should have a process in place for individuals to request reasonable accommodations when an ADS is used. This should be clearly communicated to those impacted by employment decisions, as well as your internal team that uses the ADS.
Additionally, it’s important to ensure that your ADS do not elicit information about protected characteristics during assessments, because doing so could constitute an unlawful inquiry.
Train Your Team and/or Update Communications.
All relevant team members who are involved in employment decisions, as well as any relevant third parties, must know and understand these new rules and how the rules impact their work processes.
This may require additional training or internal communications, alongside possible changes to internal policies and external notices to applicants or employees whenever ADS tools are used.
Additional Resources:
We have previously covered the use of AI in hiring processes in a number of earlier blog posts. Consider reading those for more information.
Updates to the California Consumer Privacy Act Regarding ADS Use
Companies covered by the California Consumer Privacy Act (CCPA) are also subject to new regulations relating to automated decision-making in hiring. These regulations affect any company that uses automated decision-making systems (called automated decision-making technologies, or ADMT, in the regulations) to process personal information where computation replaces or substantially replaces human decision-making.
While these changes are more detailed than we are covering here, it is worth noting that businesses impacted by these new regulations must:
- Conduct a risk assessment before using the tool, noting whether the risks to consumer privacy outweigh the benefits of the processing.
- Provide notice to consumers about the use of the technologies before use.
- Provide an opt-out option for consumers.
- Allow consumers to request information about the use of these technologies.
- Allow consumers to appeal the results.
These changes take effect January 1, 2026, though the provisions relating to automated decision-making technologies are not effective until April 1, 2027.
A Note About SB 7 – California’s No Robot Bosses Act
Finally, SB 7 aimed to prohibit employers from relying solely on AI to make decisions regarding employee discipline and/or termination, while also imposing stringent notice requirements for AI use in hiring and employment-related decisions.
SB 7 was passed by both houses but vetoed by Governor Newsom on October 13, 2025. He noted that he shares “the author’s concern that in certain cases unregulated use of ADS by employers can be harmful to workers. However, rather than addressing the specific ways employers misuse this technology, the bill imposes unfocused notification requirements on any business using even the most innocuous tools. This proposed solution fails to directly address incidents of misuse.”
If your company currently uses automated decision-making tools and hasn’t recently updated its internal processes and policies to reflect the changing legal landscape, you may be facing increased risk. Reach out if you need assistance; our employment counsel are available to help.
Disclaimer
The materials available at this website are for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this website or any of the e-mail links contained within the site do not create an attorney-client relationship between CGL and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of the firm or any individual attorney.