Washington State Privacy Act

The state of Washington is looking to take the lead in privacy regulation by re-introducing a bill, S-4873.3, that would set a new standard for privacy in the US, called the Washington State Privacy Act (WPA). At first read, it seems to be a marriage of the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). One common theme across all of these regulations is consumer rights.


Here is a detailed overview of WPA from the team at the Future of Privacy Forum:


The Act would provide comprehensive data protections to Washington State residents and would apply to entities that (1) conduct business in Washington or (2) produce products or services targeted to Washington residents.

For the Act to apply to the second category of entities, they must control or process data of at least 100,000 consumers, or derive 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers (with “consumers” defined as natural persons who are Washington residents, acting in an individual or household context).

  • The Act would not apply to state and local governments or municipal corporations.
  • The Act would regulate companies that process “personal data,” defined broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person” (not including de-identified data or publicly available information, i.e., “information that is lawfully made available from federal, state, or local government records”), with specific provisions for pseudonymous data (see below, Core consumer rights).
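The applicability thresholds above can be sketched as a simple check. This is purely an illustration of the summary's criteria; the `Business` fields and the `wpa_applies` function are hypothetical and not part of any official guidance:

```python
# Hypothetical sketch of the WPA applicability thresholds described above.
from dataclasses import dataclass

@dataclass
class Business:
    operates_in_washington: bool          # conducts business in Washington
    targets_wa_residents: bool            # products/services targeted to WA residents
    wa_consumers_processed: int           # WA consumers whose data is controlled or processed
    revenue_share_from_data_sales: float  # fraction of gross revenue from selling personal data

def wpa_applies(b: Business) -> bool:
    """Return True if the Act would apply, per the thresholds summarized above."""
    if b.operates_in_washington:
        return True
    if b.targets_wa_residents:
        # Second category: volume threshold, or revenue share plus a lower volume threshold.
        meets_volume = b.wa_consumers_processed >= 100_000
        meets_revenue = (b.revenue_share_from_data_sales >= 0.50
                         and b.wa_consumers_processed >= 25_000)
        return meets_volume or meets_revenue
    return False
```

Note that under this summary, the size thresholds attach only to out-of-state entities targeting Washington residents.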

The Act would require companies to comply with basic individual rights to request access to their data, correct or amend that data, delete their data, and access it in portable format (“portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data… without hindrance, where the processing is carried out by automated means”). These rights would not be permitted to be waived in contracts or terms of service, and would be subject to certain limitations (for example, retaining data for anti-fraud or security purposes). 
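As a rough illustration of the portability language quoted above, a data-access response might serialize a consumer's record into a machine-readable format such as JSON so it can be transmitted and processed by automated means. The record fields and function name here are invented for the example:

```python
# Illustrative only: exporting a consumer's data in a "portable and readily
# usable" machine-readable format, per the access/portability right above.
import json

def export_consumer_data(record: dict) -> str:
    """Serialize a consumer's personal data to JSON so the consumer can
    transmit it elsewhere without hindrance."""
    return json.dumps(record, indent=2, sort_keys=True, default=str)

profile = {
    "name": "Jane Example",
    "email": "jane@example.com",
    "preferences": {"marketing_emails": False},
}
print(export_consumer_data(profile))
```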

Along with these core rights, the Act would also grant consumers the right to explicitly opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal, or similarly significant, effects. Such effects include the denial of financial and lending services, housing, insurance, education enrollment, employment opportunities, health care services, and more. Unlike the CCPA, the Act would not prescribe specific opt out methods (like a “Do Not Sell My Information” button on websites), but instead require that opt-out methods be “clear and conspicuous.” It would also commission a government study on the development of technology, such as a browser setting, browser extension, or global device setting, for consumers to express their intent to opt out. 

For all of these individual rights, companies are required to take action free of charge, up to twice per year, within 45-90 days (except in cases where requests cannot be authenticated or are “manifestly unfounded or excessive”). Importantly, the law would also require that companies establish a “conspicuously available” and “easy to use” internal appeals process for refusals to take action. With the consumer’s consent, the company must submit the appeal to the Washington Attorney General, along with whether any action has been taken and a written explanation of the outcome. The Attorney General must make such information publicly available on its website. When consumers make correction, deletion, or opt-out requests, the Act would oblige controllers to take “reasonable steps” to notify third parties to whom they have disclosed the personal data within the preceding year.

Finally, the Act would prohibit companies from discriminating against consumers for exercising these individual rights. Such discrimination could include the denial of goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods and services.


Under the Act, companies processing “pseudonymous data” would not be required to comply with the bulk of the core individual rights (access, correction, deletion, and portability) when they are “not in a position” to identify the consumer, subject to reasonable oversight. Notably, the Act defines pseudonymous data consistently with the GDPR’s definition of pseudonymization, as “personal data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to [protect against identification].” This is also consistent with the Future of Privacy Forum’s Guide to Practical Data De-Identification. Pseudonymous data is often harder to authenticate or link to individuals, and can carry lessened privacy risks. For example, unique pseudonyms are frequently used in scientific research (e.g., in a HIPAA Limited Dataset, John Doe = 5L7T LX619Z). 
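A minimal sketch of this style of pseudonymization, assuming a keyed hash where the key plays the role of the "additional information" that must be kept separately; the names and key handling are illustrative only, not a production design:

```python
# Minimal pseudonymization sketch in the GDPR sense borrowed by the Act:
# the key is the "additional information" that must be stored separately
# from the pseudonymous dataset, under technical and organizational controls.
import hashlib
import hmac

SEPARATELY_STORED_KEY = b"keep-this-in-a-different-system"  # hypothetical

def pseudonymize(identifier: str, key: bytes = SEPARATELY_STORED_KEY) -> str:
    """Derive a stable pseudonym; without the key, the pseudonym cannot be
    attributed back to a specific natural person."""
    digest = hmac.new(key, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:12].upper()

# The same input always yields the same pseudonym, so records can still be
# linked within a research dataset without revealing who "John Doe" is.
assert pseudonymize("John Doe") == pseudonymize("John Doe")
```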

In addition, companies may refuse to comply with requests to access, correct, delete, or port data if the company: (A) is not reasonably capable of associating the request with the personal data, or it would be unreasonably burdensome to associate the request with the personal data; (B) does not use the personal data to recognize or respond to the data subject, or associate the personal data with other data about the same specific consumer; and (C) does not sell personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor (service provider). 

Importantly, other requirements of the overall bill, including Data Protection Assessments (below), and the right to Opt Out of data processing for targeted advertising, sale, and profiling (above) would still be operational for pseudonymous data.

Finally, the Act would not apply to de-identified data, defined as “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable natural person, or a device linked to such person,” subject to taking reasonable measures to protect against re-identification, including contractual and public commitments. This definition aligns with the FTC’s longstanding approach to de-identification. 


In a structure that parallels the GDPR, the Act distinguishes between data “controllers” and data “processors,” establishing different obligations for each. Almost all of the provisions of the Act involve obligations that adhere to a controller, defined as “natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data.”

Data processors, on the other hand, defined as “a natural or legal person who processes personal data on behalf of a controller,” must adhere (as service providers) to controllers’ instructions and help them meet their obligations. Notwithstanding controller instructions, processors must maintain security procedures that take into account the context in which personal data is processed; ensure that individual processors understand their duty of confidentiality; and may only engage a subcontractor once the controller has had the chance to object. At the request of the controller, processors must delete or return personal data. Processors must also aid in the creation of data protection assessments.


The Act would require companies to provide a Privacy Policy to consumers that is “reasonably accessible, clear, and meaningful,” including making the following disclosures:


  • (i) the categories of personal data processed by the controller; 
  • (ii) the purposes for which the categories of personal data are processed; 
  • (iii) how and where consumers may exercise their rights; 
  • (iv) the categories of personal data that the controller shares with third parties; and 
  • (v) the categories of third parties with whom the controller shares personal data. 

Additionally, if a controller sells personal data to third parties or processes data for certain purposes (e.g., targeted advertising), it would be required to clearly and conspicuously disclose such processing, as well as how consumers may exercise their right to opt out of it.


Companies would be required under the Act to conduct confidential Data Protection Assessments for all processing activities involving personal data, and again any time there are processing changes that materially increase risks to consumers. In contrast, the GDPR requires Data Protection Impact Assessments only when profiling leads to automated decision-making having a legal or significant effect upon an individual (such as credit approval), when profiling is used for evaluation or scoring based on aspects concerning an individual’s economic situation, health, personal preferences or interests, reliability or behavior, location or movements, or when it is conducted at large-scale on datasets containing sensitive personal data.

Under the WPA, in weighing benefits against the risks, controllers must take into account factors such as reasonable consumer expectations, whether data is deidentified, the context of the processing, and the relationship between the controller and the consumer. If the potential risks of privacy harm to consumers are substantial and outweigh other interests, then the controller would only be able to engage in processing with the affirmative consent of the consumer (unless another exemption applies, such as anti-fraud measures and research). 


Companies must obtain affirmative, opt-in consent to process any “sensitive” personal data, defined as personal data revealing:

  • racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sexual orientation, or citizenship or immigration status; 
  • genetic or biometric data for the purpose of uniquely identifying a natural person; 
  • personal data from a known child; or 
  • specific geolocation data (defined as “information that directly identifies the specific location of a natural person with the precision and accuracy below 1750 ft.”)

Although the Act requires consent to process data from a “known child,” an undefined term, it notably also exempts data covered by the Family Educational Rights and Privacy Act (FERPA) and entities that are compliant with the Children’s Online Privacy Protection Act (COPPA). The Act defines a child as a natural person under age thirteen, meaning it does not follow the approach of CCPA and other bills around the country that extend child privacy protections to teenagers. 


In addition to consumer controls and individual rights, the Act would create additional obligations on companies that align with the GDPR:

  • Data Minimization & Purpose Specification – Controllers’ collection of personal data must be “adequate, relevant, and limited” to what is necessary in relation to the specified and express purposes for which it is processed.
  • Reasonable Security – Appropriate to the volume and nature of the personal data at issue, controllers must establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data. 
  • Use Limitations – The Act would also create a duty to avoid secondary uses of data, absent consent, unless that processing is necessary or compatible with the specified or express purposes for which the data was initially gathered.

The obligations imposed by the Act would not restrict processing personal data for a number of specified purposes. Those exemptions include cooperating with law enforcement agencies, performing contracts, providing requested products or services to consumers, processing personal data for research, consumer protection purposes, and more. If processing falls within an enumerated exception, that processing must be “necessary, reasonable, and proportionate” in relation to a specified purpose. Controllers and processors are also not restricted from collecting, using, or retaining data for specific purposes such as conducting internal product research, improving product and service functionality, or performing internal operations reasonably aligned with consumer expectations. 


The Act would not grant consumers a private right of action. Instead, it would give the Attorney General exclusive authority to enforce the Act. The Act would cap civil penalties for controllers and processors in violation of the Act at $7,500 per violation. A “Consumer Privacy Account,” in the state treasury, would contain funds received from the imposition of civil penalties. Those funds would be used for the sole purpose of funding the office of privacy and data protection. The Attorney General would also be tasked with compiling a report evaluating the effectiveness of enforcement actions, along with any recommendations for changes.


In addition to its baseline requirements, the Act contains provisions specifically regulating commercial uses of facial recognition. The Act would require affirmative, opt-in consent as a default requirement and place heightened obligations on both controllers and processors of commercial facial recognition services, particularly with respect to accuracy and auditing, with a focus on preventing unfair performance impacts. A limited exception is provided for uses such as tracking the unique number of users in a space, when data is not maintained for more than 48 hours and users are not explicitly identified.

Source: https://fpf.org/2020/01/13/its-raining-privacy-bills-an-overview-of-the-washington-state-privacy-act-and-other-introduced-bills/

As these regulations are finalized, companies will need support to execute on the consumer rights outlined in the WPA, and that is where RIVN can help.

Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below:

3 Initial Insights on CCPA

The latest high-profile consumer privacy regulation, the California Consumer Privacy Act (CCPA), went into effect on January 1, 2020. As a result, social media and various publications have been buzzing about the new regulation, including users’ experiences with it.

As background, the CCPA aims to give consumers more control over their data, including how companies manage and sell it. Consumers can request access to or deletion of their data from companies, and can tell companies not to sell their data.

Under the new law, a company needs to meet CCPA regulations if it: (1) generates $25 million in revenue, (2) has more than 50,000 consumer records in its database, or (3) derives more than 50% of its revenue from selling consumers’ personal info.

After reading these insights, I believe everyone can agree the CCPA has empowered people to take control of how companies capture, store, and manage their data.


So, here are 3 initial insights after one full week of CCPA:


CCPA Is Huge On Social Media

The CCPA is having a larger social impact than anticipated. While many companies seem to be prepared for the CCPA, a lot of companies are either not prepared or are taking a stance of non-compliance. The most surprising impact of the CCPA has been the groundswell of regular people fully documenting their experiences with various brands with regard to the CCPA.

California citizens are creating repositories on their own to make it easy for others to submit data access and deletion requests, such as this one here.

Also, individuals are documenting how huge companies such as Facebook or OpenTable are, for now, simply denying consumer requests for access to or deletion of their data, as seen below.

Here is an example of OpenTable denying a do-not-sell request from one of the co-authors of the CCPA, Mary Stone Ross (@MarySRoss18 on Twitter):

Here is an example of a Twitter user @ampersand_ie reporting back on Facebook denying deletion requests under CCPA:

CCPA is very different from the General Data Protection Regulation (GDPR)

In contrast to the GDPR, the CCPA has been very visible across the web. The GDPR was highly visible within the privacy community and in Europe, and it focused heavily on consent. The CCPA, while it does have a consent component, focuses on consumer data access and deletion rights, along with the sale of consumer data.

Many consumers have seen the impact of the CCPA directly in communications addressed to them, most visibly in their inboxes. You may have noticed that several of the companies you subscribe to have recently sent email updates about their privacy policy changes.

These are directly associated with the anticipation of the CCPA. While enforcement of the CCPA does not begin until July 1, 2020, responsible companies are preparing now. This preparation will continue to rise, along with the use of a “Do Not Sell” button, which should become a staple on most sites.

Here is an example from Pottery Barn Kids:

The cost of CCPA will be great and teams will need to work closely together

The total cost of any regulation for companies is difficult to estimate. But a recent article from Bloomberg estimated that the CCPA alone will cost companies $55 billion.

At these cost levels, companies’ internal teams will need to work in harmony. With executive oversight, the teams most affected by the CCPA have been marketing, IT, legal, and finance.

  • Marketing Teams – These teams have been responsible for creating messaging to assure consumers about compliance changes, seen in banner ads and emails.
  • IT Teams – These teams have been required to audit technology stacks and implement new compliance solutions where required.
  • Legal Teams – Most legal teams have been required to get up to speed on marketing and analytics processes to ensure corporate compliance.
  • Finance Teams – Financial organizations have been required to take new liabilities into account and allocate resources to ensure corporate compliance.

In summary, the last week has been very interesting. Over the next several weeks and months, companies will need to be vigilant and flexible to ensure they are not only meeting the new regulation but also meeting social expectations. Very exciting times!

To learn how RIVN can help please visit www.rivn.com

Consumer Online Privacy Rights Act (COPRA)

Consumers across the globe have heard new acronyms regarding privacy, from the General Data Protection Regulation (GDPR) to the California Consumer Privacy Act (CCPA). All of these regulations have a simple goal: create trust between brands and consumers. However, most of the regulations have been developed in silos, at the state or regional level only.

The latest US legislation is known as the Consumer Online Privacy Rights Act (COPRA). COPRA is designed to “provide consumers with foundational data privacy rights, create strong oversight mechanisms, and establish meaningful enforcement,” laudable goals, and ones on which privacy advocates, consumers, and industry are increasingly finding common ground as states around the country craft disparate rules on privacy protection.

The quote above comes from a recent IAPP article entitled “US Senators Unveil New Federal Privacy Legislation,” which can be found here. The article also highlights the penalty level included in COPRA, which is between $100 and $1,000 per infraction per day.

One novel twist that COPRA brings to the table is that the bill tackles algorithmic decision-making: those who engage in the practice to facilitate advertising or to make eligibility determinations for housing, education, employment, or credit must conduct an annual impact assessment covering accuracy, fairness, bias, and discrimination. Challenges related to “deep fakes” are also addressed.

Below is an excerpt from the article that highlights the six pillars of COPRA:

  1. Consent: The bill requires individual consent for data processing, including express affirmative consent for processing sensitive data, which is very broadly defined but excludes “publicly-available information.” Much like the California Consumer Privacy Act, COPRA provides individuals the right to opt out of the transfer of their covered data for “valuable consideration” and would grant the FTC rulemaking in that area.
  2. Access: The act requires covered entities to provide individuals with their own covered data upon request, in a portable format, as well as the name of any third party to which it has been transferred for valuable consideration.
  3. Correction and deletion: Individuals are granted the right to correct and delete their own covered data.
  4. Transparency: Covered entities must publish a privacy policy that includes information commonly seen in such policies today. This includes contact information for the entity, the categories of data processed, and the categories of third parties and service providers to which information is transferred. Somewhat more novel requirements include retention timelines, and perhaps more contentious, the identity of each third party to which covered data is transferred. The policy must be made available in all languages in which the covered entity does business.
  5. Data minimization: Covered entities may only process covered data for specific purposes, subject to necessity and proportionality standards.
  6. Data security: Covered entities must provide reasonable security, assess vulnerabilities, implement corrective action when risks are identified and dispose of data that is no longer needed.

As noted in the correction and deletion pillar above, consumer deletion requests will continue to be a key part of almost all new privacy regulations. For digital marketers, finance and IT professionals, and legal teams, the greatest hurdle may be consent management and the “Right to Erasure/Deletion” itself.

Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below:

The Power of Privacy

If you search the term “privacy” on Google, you will see about 19 billion results in half a second. The reality is that people across the globe are interested in learning more about privacy and how they can protect themselves. More importantly, the power of consumer privacy is on the rise. Recent news stories have highlighted the power of privacy with regard to big technology companies and political campaigns.

Pew Research recently reported that “roughly six-in-ten U.S. adults say they do not think it is possible to go through daily life without having data collected about them by companies or the government.”

The team at Forbes recently published a great article, found here. The article gives great background on how privacy has become a mainstream topic, along with the business verticals and technologies that are most impacted.


Below is an excerpt from the article:

Andrew Hawn, my former colleague and now founder of MetaForesight, is a technology, media, and content expert. Andrew has been collaborating with my analytics startup, Metametrix, and we recently spoke about privacy and its far-reaching implications.

“We’re seeing a social shift in the long term effects of privacy…. As billions more in venture investing targets our personal data for resale in a multitude of ways, people are starting to more deeply question their growing lack of data privacy and control.”

Andrew went on to say:

“The truth is that there is only so much regular citizens can do without laws and policies that empower citizens to retake some personal data power. The EU’s GDPR was a blunt first instrument, and now California’s CCPA is trying to take a slightly smarter approach starting in 2020.”

“Just trying to turn things off by playing whack-a-mole won’t work; we need new innovations focused on protections that are more conversation driven and transparent.”

What these comments give insight into is that new technologies such as RIVN need to be in place so companies have the ability to respect user privacy preferences. All companies will need to adopt scalable technologies that lead to compliance.

For digital marketers, finance and IT professionals, and legal teams, the greatest hurdle may be consent management and the “Right to Erasure/Deletion” itself. Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below:


Customer Consent…A Slippery Slope

This year, companies large and small dealt with managing the ever-changing world of consumer consent. Facebook was fined $5 billion, Equifax was fined $700 million, and British Airways was fined $230 million, all for mismanagement of consumer data. These fines have led to the realization that consent and data management can be a slippery slope.

As January 1, 2020 approaches, companies are preparing themselves for the newest privacy regulation to go into effect, the California Consumer Privacy Act (CCPA). A huge part of the CCPA is offering consumers the freedom to set preferences about which marketing companies may advertise to them, and the ability to control what data can be sold to other advertisers.

A fantastic recent article from AdExchanger, found here, asks tough questions and highlights concepts that all companies need to be prepared for as the CCPA goes into effect. A common theme is how companies will be able to effectively communicate consumer preferences to their marketing vendors.


As the world watches, the California attorney general has made updates to how the state views the CCPA. Below is an excerpt from the article:

“New in the draft regulations is a requirement that businesses that collect a California consumer’s personal information online must treat signals from “user-enabled privacy controls,” indicating consumers don’t want their personal information to be sold, as opt-out requests, otherwise known as “Do Not Sell” requests. These signals could come from a browser plug-in, privacy setting or any other user-enabled mechanism” 

The article explains the attorney general's comments further:

“This is a completely new requirement absent from the statute itself. The California attorney general’s office explained in its Initial Statement of Reasons that this new addition was “intended to support innovation for privacy services that facilitate the exercise of consumer rights in furtherance of the purposes of the CCPA.” It said this was “necessary because, without it, businesses are likely to reject or ignore consumer tools.”

What this comment gives insight into is that the state expects new technologies such as RIVN to be in place so companies have the ability to respect user privacy preferences. All companies will need to adopt scalable technologies that lead to compliance.
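As a sketch of what honoring such a user-enabled signal might look like, the example below treats a browser header as a "Do Not Sell" opt-out. The `Sec-GPC` header name follows the Global Privacy Control proposal mentioned in the privacy community; the handler and in-memory store are purely illustrative:

```python
# Hedged sketch: treating a user-enabled browser signal as a "Do Not Sell"
# opt-out request, as the draft regulations describe. The storage and
# function names are hypothetical, not any vendor's actual API.
opt_outs: set[str] = set()  # consumer IDs that have opted out of data sales

def handle_request(consumer_id: str, headers: dict[str, str]) -> None:
    """Record an opt-out when the browser sends a privacy-control signal."""
    if headers.get("Sec-GPC") == "1":
        opt_outs.add(consumer_id)

def may_sell_data(consumer_id: str) -> bool:
    """Downstream ad/data pipelines should check this before any sale."""
    return consumer_id not in opt_outs

handle_request("consumer-42", {"Sec-GPC": "1"})
```

The key design point is that the signal is honored automatically, without requiring the consumer to hunt for an opt-out form on each site.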

For digital marketers, finance and IT professionals, and legal teams, the greatest hurdle may be consent management and the “Right to Erasure/Deletion” itself. Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below: