New State Privacy Laws Focus on AI | Orrick, Herrington & Sutcliffe LLP

[co-author: Christina Lee]

In recent years, the use of artificial intelligence (AI) has proliferated across industries. With the widespread adoption of this technology, many business decisions are now made in consultation with, or even in reliance on, AI and what is now referred to as profiling or automated decision-making. Alongside the rise of AI, there has been a corresponding increase in state privacy laws across the country.

AI depends on huge datasets that may include personal information, including sensitive personal information. Consequently, privacy laws have become a primary means of addressing the risks inherent in using AI to make decisions that have legal and social consequences, such as loan approvals or credit decisions. These laws are intended to ensure that AI uses personal information responsibly and to give consumers control over their personal information when that information is used for automated decision-making. But they also create new obligations that organizations must assess and potentially meet. Companies employing AI must therefore be aware of the specific requirements of the laws to which they are subject. With a handful of new state privacy laws coming into force in 2023, we have outlined key takeaways for companies currently using or considering deploying AI in their business.

Automated decision-making and profiling are in the crosshairs of state privacy laws

While the California Consumer Privacy Act (CCPA) is silent on automated decision-making, the California Privacy Rights Act (CPRA) (which amends the CCPA), the Colorado Privacy Act (CPA), the Virginia Consumer Data Protection Act (VCDPA), and the Connecticut Data Privacy Act (CTDPA) all grant consumers the right to opt out of the processing of their personal information for profiling purposes and create requirements that impact automated decision-making.

Although the definitions of automated decision-making and profiling differ slightly across state privacy laws, profiling generally refers to an organization attempting to evaluate personal aspects of a data subject through the processing of personal information. Along the same lines, automated decision-making refers to an organization (i) acting on profiling to make a decision by automated means, without human intervention or with only limited human intervention, or (ii) establishing an automated system that makes a decision based directly on information provided by a data subject (such as an age restriction that prevents anyone under a certain age from participating in a program or applying for a position).

CCPA/CPRA

While the CCPA is silent on automated decision-making, the CPRA, which comes into effect on January 1, 2023 and modifies and expands the concepts of the CCPA, addresses automated decision-making directly. The CPRA adds a new definition of “profiling” and gives consumers opt-out rights with respect to companies’ use of “automated decision-making technology,” which includes profiling consumers based on their “work performance, economic situation, health, preferences, interests, reliability, behavior, location or movements.”

The CPRA does not limit this language to profiling and leaves the right to be further defined by the California Privacy Protection Agency (CPPA). The CPPA is responsible for adopting regulations “governing access and opt-out rights with respect to companies’ use of automated decision-making technology,” including the provision of meaningful information about the logic involved in the decision-making process and the likely outcome for the consumer. Importantly, the CPPA’s mandate to issue these rules is broad and is not currently limited to “solely” automated decisions or decisions with legal effect. To date, the CPPA has not published any regulations regarding automated decision-making.

For more information on the CPRA, see Orrick’s CPRA On The Way tool.

CPA, VCDPA and CTDPA

The VCDPA, which also comes into effect on January 1, 2023, and the CPA, which comes into force on July 1, 2023, will allow individuals to opt out of “profiling in furtherance of decisions that produce legal or similarly significant effects” concerning the consumer, which is generally defined as the provision or denial of financial and lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment opportunities, health care services, or access to basic necessities. The CTDPA provides an opt-out similar to Colorado’s and Virginia’s, but only with respect to “solely automated decisions.”

By way of comparison, the VCDPA’s definition of “profiling” aligns with the CPRA’s, and the VCDPA includes an opt-out provision identical to the CPA’s, allowing consumers to opt out of the processing of their personal information for profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.

Risk assessments

The VCDPA, CPA, CPRA and CTDPA all require data controllers to conduct a data protection impact assessment (DPIA) for processing activities that present a “heightened risk of harm to a consumer.” Heightened risk typically includes:

  1. processing personal information for targeted advertising purposes;
  2. selling personal information;
  3. processing sensitive data; or
  4. processing personal information for profiling purposes where the profiling presents a foreseeable risk of:
    1. unfair or deceptive treatment of, or unlawful disparate impact on, consumers;
    2. financial, physical or reputational harm to the consumer;
    3. intrusion upon the solitude or seclusion, or the private affairs, of consumers; or
    4. other substantial harm to the consumer.

These DPIAs must identify and weigh the benefits of the processing to the controller, the consumer, other stakeholders and the general public against the risks the processing poses to consumers, as mitigated by the safeguards used to reduce those risks. They are not intended to be made public or provided to consumers. Instead, DPIAs must be made available to the state attorney general upon request, pursuant to a civil investigative demand. If companies identify a heightened risk from any processing of personal information performed by AI, they will now be required to conduct DPIAs.

Not all state privacy laws target automated decision-making or profiling

The Nevada Privacy Law (NPL) is silent on the subject of automated decision-making and profiling, and the Utah Consumer Privacy Act (UCPA), which goes into effect December 31, 2023, does not give consumers the right to opt out of profiling and does not require companies to affirmatively assess data processing that presents a “heightened risk of harm,” such as the use of sensitive data or profiling.

What’s next?

If your business uses AI with underlying data that includes personal information, you should carefully evaluate how you collect and use personal information and sensitive information and ensure that you comply with the varying requirements of state privacy laws. For more information on requirements under specific state privacy laws, see Orrick’s US State Consumer Privacy Guide.

For more information on best practices for building your AI compliance program, check out Orrick’s Insight on AI Tips: 10 Steps to Future-Proof Your AI Regulatory Strategy.

Stay tuned for updates as state privacy laws and related regulations regarding AI systems are rolled out over the coming months.
