This week the Washington state legislature reintroduced the Washington Privacy Act (WPA), which failed to pass last year. A lot of the same controversial points remain, so can it pass this year? Will 2020 be the year of proliferating state privacy legislation, especially if no federal statute materializes?
What are the contours of the WPA, and how does it compare to the existing alphabet soup of privacy regulation, the GDPR and the CCPA? The WPA closely tracks the principles of the GDPR, most importantly with data subject rights to access, correct, export, delete, and opt out of personal data processing. It also contains common-sense privacy principles like transparency, data minimization, use specification, and security, and it provides special protections for sensitive data, including data about children, defined as those under the age of 13. It borrows directly from the GDPR an expansive definition of personal data, a high bar for consent as a “clear affirmative” action that must be “freely given, specific, informed, and unambiguous”, and a controller/processor framework for governing the sharing of data.

Like the CCPA, the WPA also provides for a consumer’s right to opt out of the “sale” of their personal data to third parties. Unlike the CCPA, however, “sale” is defined tersely as “the exchange of personal data for monetary or other valuable consideration”. In fact, the bill takes more care to define what does not constitute a sale, e.g. sharing data with a processor. How sale is defined may turn out to be irrelevant: we’ve seen most companies adapt quickly and claim that they are not selling data, particularly in the advertising space the CCPA was meant to address, leaving the field largely unchanged structurally. It’ll be interesting to see whether CCPA 2.0 tackles this issue.
So far, so good. However, the two sticking points in the bill that remain from last year are that it does not include a private right of action and that it does include a section, albeit a short one, on facial recognition. Let’s take a look at each in turn.
The WPA’s lack of a private right of action is a significant weakness of the bill from a consumer rights perspective. The CCPA at least provides the right to sue in the event of a data breach, while the WPA does not even mention data breaches. While a violation of the WPA could constitute an “unfair and deceptive” practice and therefore give rise to a private right of action under Washington state consumer protection law, the bill states that only the state Attorney General can bring an action against companies for violating the law. Fines are limited to $7,500 per violation, but it is unclear what counts as a single violation (is it measured per individual or per type of violation?). Some argue that a private right of action would be too burdensome for smaller companies. To avoid getting overwhelmed by customer lawsuits, companies can consider simply complying with the law. That takes resources, but we haven’t yet seen companies go bankrupt trying to comply with the CCPA, or for that matter the GDPR, which does give data subjects the right to an “effective judicial remedy”. Finally, the WPA’s enforcement is limited to civil penalties, and there is no proposal to establish a Washington Data Protection Agency (unlike the agency proposed in CCPA 2.0).
The section on facial recognition is the only part of the bill that calls out a specific data processing technology in an otherwise technology-neutral bill. It has a few important stipulations but leaves some things unclear. First, it requires processors to make their facial recognition services available for meaningful testing for “inaccuracy and unfair performance differences” by controllers and third parties. Does this mean that controllers who sell facial recognition products directly to consumers don’t have to abide by this provision? The processor is obligated to fix an issue only if “material” differences are found, the results are disclosed “directly” to the processor (as opposed to being published publicly, for example), and the processor deems the findings “valid”. Then, and only then, does the processor need to implement a “plan” to fix the problem. I think we can do better. Second, controllers must provide notice and obtain consent before a consumer’s image is “enrolled” in a facial recognition service in spaces open to the public. This requirement seems strong enough: while “safety and security purposes” can override the requirement for consent, they have to be based on a “specific incident” involving that consumer. So presumably a grocery store could not institute blanket facial recognition tracking by arguing that the store is located in a high-crime area or experiences high asset loss. Third, controllers who use facial recognition services must follow common responsible-AI principles: testing the technology before deployment, ensuring human review, particularly for decisions that have “significant effects” on consumers, and training those operating the technology to interpret findings appropriately. Finally, controllers must not disclose facial recognition findings to law enforcement, an important provision to prevent debacles like the one involving Amazon’s Ring.
In conclusion, privacy advocates are still not satisfied with the WPA’s lack of a private right of action, while civil liberties advocates believe the facial recognition protections are not strong enough, preferring instead a moratorium on the technology until we better understand its impacts. If it passes, the WPA would take effect on July 31, 2021. But Washington is not alone in taking action while Congress deliberates: Massachusetts, Minnesota, Pennsylvania, New Jersey, and New York all have bills on the docket for 2020, and other states, like Connecticut, Hawaii, and Louisiana, are studying the issue. While many state privacy bills, most notably Washington’s and New York’s, failed in 2019, perhaps 2020 will be a more auspicious year for privacy.