Inside California's Automated Decision Making Regulation Proposal
The California Privacy Protection Agency (CPPA) released draft rules on automated decision making technology (ADMT) this week, setting the stage for the state of California to be a guiding beacon on how to regulate this technology as we move deeper into the age of AI.
The proposed regulations are not yet binding in any way, as the agency published the draft before its December 8 meeting to spur board discussion and public participation as the proposal continues to evolve. In any case, this initial version sets the bar high for how companies need to handle automated decision making, with almost no American precedent.
While the White House published an extensive Executive Order on the development and use of AI in October, there is currently no real path toward significant federal enforcement. Although data privacy and protection are featured heavily in that document, California is in a much better position to regulate data privacy matters given the existence of both the CPPA and the CCPA.
Regardless of how different the final version of this regulation turns out to be, the CPPA has put a definitive exclamation point on a busy year that saw the state pass nine privacy bills, including major amendments to the CCPA, the revolutionary Delete Act, and other legislation on data brokers, digital health data, and reproductive health data.
Another bill passed was AB-302, which mandates that the Department of Technology inventory all high-risk automated decision making systems in use within the state. Governor Newsom signed that bill in October, while the privacy community waited patiently for this CPPA draft on ADMT.
The exact situations the regulation will cover are also yet to be finalized, but the proposal suggests it covers any “decision that produces legal or similarly significant effects concerning a consumer”, including but not limited to employment decisions, profiling for targeted advertising, and profiling consumers in publicly available places (e.g., facial recognition systems).
ADMT used for security, safety, or fraud detection, or to fulfill requested goods or services, is exempt.
Here are the specific requirements detailed in the draft regulation:
Pre-Use Notice Requirement
“The Pre-use Notice shall inform consumers about the business’s use of automated decisionmaking [sic] technology and consumers’ rights to opt-out of, and to access information about, the business’s use of automated decisionmaking [sic] technology.”
In detailing how this pre-use notice must work, businesses must:
- Make it readily available and accessible
- Show the notice to users before processing their personal data using the ADMT
- Notify users of their opt-out and access rights
- Write the notice in plain language, explaining the purpose of the technology, the logic the ADMT uses to produce outputs, and how the business plans to use that output
Businesses must also provide additional resources where people can learn more about automated decision making technology, with the regulation suggesting a hyperlink be included in the pre-use notice.
Frankly, these requirements will likely result in quite a lengthy notice, so the CPPA must take steps to ensure any instituted notices do not quickly fall prey to cookie-style fatigue, with consumers simply accepting the notice because they are sick of seeing it everywhere.
Opt-Out Requirements
Naturally, any business using ADMT that does not fall under an exemption must provide people with the ability to opt out of the ADMT.
If a consumer does opt out, the business must stop processing the consumer’s personal data through the ADMT within 15 business days, and notify any third party it has sold or shared the data with to honor the opt-out.
Once a consumer opts out, the business cannot ask them for consent to use the ADMT for the next 365 days.
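To make these timing rules concrete, here is a minimal sketch of how a business might compute the two deadlines. The function names are hypothetical, and the business-day calculation assumes a Monday–Friday week and ignores holidays, which the final regulation may treat differently:

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance `start` by `days` business days (Mon-Fri; holidays ignored)."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current

def opt_out_deadlines(request_date: date) -> dict:
    """Illustrative deadlines under the draft: stop processing within
    15 business days, and do not re-request consent for 365 calendar days."""
    return {
        "stop_processing_by": add_business_days(request_date, 15),
        "consent_blackout_ends": request_date + timedelta(days=365),
    }
```

For an opt-out received on a Friday, for example, the processing cutoff lands three calendar weeks later, since weekends do not count toward the 15 business days.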
Opt-out methods must:
- Be accessible through an interactive form as well as at least one other method, such as mailing a physical form
- Be honored without verification when the ADMT uses profiling for behavioral advertising
- Provide a way for consumers to confirm the business processed their request
- Be specific to ADMT requests, meaning they cannot rely on cookie banners or cookie controls
Right to Access
Similar to the data subject “Right to Know,” any access request a consumer sends must be addressed within 45 days.
Although consumers can opt-out of certain ADMT without verifying their identities, they must verify their identities when using the right to access.
Using “reasonable security measures when transmitting the requested information to the consumer,” businesses must provide people who have verified their identity with plain language explanations covering:
- The explicit purpose of the ADMT
- The output of the ADMT with respect to the consumer
- The decision made regarding the consumer
- Any other factors the business used to make the decision
- Human involvement in the business’s use of ADMT
- If the business’s use of ADMT has undergone evaluation for validity, reliability, and fairness, and any evaluation’s conclusion
- “A simple and easy-to-use method by which the consumer can obtain the range of possible outputs, which may include aggregate output statistics (for example, the five most common outputs of the automated decisionmaking [sic] technology, on average, across all consumers during the preceding calendar year)”
- Clear instructions for how the consumer can exercise their CCPA rights
- How the consumer can submit an official complaint over the business’s use of ADMT
As you can see, this information overlaps significantly with the pre-use notice requirements. Whereas pre-use notices cover the bare minimum, the information provided to consumers exercising their right to access will likely be extensive.
The amount of information listed in this draft far exceeds what Data Subject Access Requests under the CCPA require, which could end up triggering pushback from the business community.
Overall, the proposal goes deep and sets a high bar for how businesses must handle automated decision making technology. Much of the emphasis is on informing consumers about the use of this technology and how they can protect themselves from it, but considering that many people still are unaware of the data rights enshrined in 2018’s CCPA, there is no guarantee the average consumer will derive any benefit from this.
That is, of course, a cynical takeaway, but public adoption and embrace of data protection regulation have progressed slowly despite near-universal public support for guardrails against data misuse.
The CPPA is trying to stay ahead of the AI issue and this proposal is a solid place to start, but no matter how its final version looks, the United States needs an urgent awareness campaign highlighting these rights and that yes, regulation is addressing long-held consumer grievances.