CPRA Pre-Rulemaking

Calif. Privacy Agency Mulls Equity, Dark Pattern Protections

California's privacy agency should try to counter inequity and manipulation, said academics at California Privacy Protection Agency (CPPA) informational sessions this week (see 2203290062). The CPPA board Wednesday held the second day of a virtual hearing to gather background information for an upcoming rulemaking to implement the California Privacy Rights Act (CPRA), the successor to the California Consumer Privacy Act (CCPA). Across the country, the Maryland House Economic Affairs Committee weighed a bill Wednesday to set up a privacy study group to make recommendations for comprehensive legislation next year.

The CPPA can promote justice and equity with its CPRA rulemaking, addressing “deeper societal problems,” said Safiya Noble, director-Center for Critical Internet Inquiry at the University of California-Los Angeles. “You have a chance to do something groundbreaking.” Tech companies’ algorithms promote profitability and can disproportionately harm historically marginalized communities, said Noble. “Since tech is not neutral,” she said, the CPPA’s rulemaking needn’t be, either.

“Don’t lose sight” of policy goals, said Chris Hoofnagle, UC-Berkeley faculty director-Center for Long-Term Cybersecurity. California privacy regulators are charged with protecting the value of privacy rights, but companies define that “through the lens of risk and use controls to manage that risk,” he said. “Companies are likely to treat the controls they implement as the terminal goal, rather than the policy aims” of California’s privacy law. Beyond simply informing people about the number of security breaches, the CPPA should seek to promote trust in digital systems, he added.

CPRA is an opportunity to rethink how consent works in user interfaces, said Jennifer King, privacy and data policy fellow at Stanford University’s Institute for Human-Centered Artificial Intelligence. King participated in a Tuesday presentation on dark patterns, user interfaces that trick users into making choices they wouldn’t otherwise have made. Under CCPA, businesses often use toggle switches on their websites to implement do-not-sell requests, but it can be “extremely unclear” to users whether clicking the switch will opt them out, she said.

Dark patterns are increasingly prevalent, and it’s important to remember that not all of them are fraudulent, said University of Chicago law professor Lior Strahilevitz. To combat insidious practices, regulators could take a symmetrical approach, requiring that opting out be no more difficult than opting in, he said.

The European Data Protection Board has sought to give guidance on when to perform data protection impact assessments under the EU’s General Data Protection Regulation, said Gwendal LeGrand, the European Data Protection Supervisor’s head-activity for enforcement support and coordination, in a Wednesday presentation. Publishing software that walks organizations through impact assessments can make the process more user-friendly, said LeGrand, highlighting an app developed by France’s privacy commission.

Make CPRA rules that stop insurance companies and automakers from using precise geolocation without consumer permission, said Consumer Watchdog’s Justin Kloczko during a public comment round. The group released a report Wednesday on privacy problems raised by connected cars. Kloczko told the CPPA board “we are looking forward to regulations that end this era of data monopoly.”

The CPPA should be “cautious about overly expansive actions that would penalize the use of neutral and beneficial technologies in a way that undermines their many daily uses that have benefited consumers,” NetChoice Policy Counsel Jennifer Huddleston said in public comments after Wednesday’s presentations. Electronic Privacy Information Center counsel Ben Winters said a broad definition of automated decision-making might “be met with concerns from industry ... but that should be a burden that they can fulfill, and the risk of under-inclusive definitions is a greater one.”

Maryland’s SB-11 was originally a comprehensive measure but was converted into a study bill before it passed the Senate (see 2203180022). “Maryland should engage with a strong model,” said sponsor Sen. Susan Lee (D) at the livestreamed hearing. Four states now have comprehensive privacy laws. “We should use the opportunity of the work group to zero in [on] the specific protections in the respective states so we have the best bill.” Maryland Assistant Attorney General Hanna Abrams said a study group will help because Marylanders broadly support privacy legislation, but working out the details is complex.