Data Privacy Day – 3 Data Privacy Facts

Data Privacy Day is here! Why is this day so exciting? Companies are finally realizing that privacy can deliver several benefits, including the following:

  • Building brand trust
  • Driving a positive ROI
  • Staying ahead of the global privacy trend

Because of these key factors, companies large and small are making privacy a major focus in 2020. Companies are implementing new policies, procedures, and technologies to ensure compliance and to stay competitive in this global market.

Here are three reasons why everyone should celebrate Data Privacy Day:

Privacy Drives Transparency and Increases Brand Loyalty:

Consumers are now realizing the true power of having their data collected by companies across the globe. Wise consumers are deciding to interact with brands that offer transparency when it comes to customer data, and that offer exceptional experiences by leveraging that same information.

A recent Salesforce report found here suggests that 76% of consumers expect companies to understand their needs and provide a custom experience. That only happens when consumers are willing to share their data with these companies. In fact, according to the same report, 92% of consumers are more likely to trust these companies when they are transparent about the purpose of capturing data.

In short, the companies that are transparent about capturing data will reap the benefit of brand trust if they use that data responsibly and offer a personalized experience.

Data Privacy Drives ROI:

Some companies fall into the trap of thinking privacy is only a cost center. These same companies see new regulations as just another hoop to jump through instead of an opportunity to improve systems, increase customer satisfaction, and in turn increase ROI.

Cisco recently published a new data privacy report focusing on hard ROI numbers, which can be found here. What is really interesting about this report is that, for the first time, it puts a hard ROI number on privacy accountability. The report found that, on average, companies realize $2.70 in benefit for every $1 spent on privacy, and that number goes up for larger organizations. The benefits show up in top-line revenue, increased brand trust, fewer data breaches, and shorter sales cycles, to name a few areas.
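To make that arithmetic concrete, here is a quick back-of-the-envelope sketch in Python. The 2.7x multiplier is the report's average figure; the $1M spend is a made-up illustration, not a number from the report:

```python
# Average figure from the Cisco report: $2.70 of benefit per $1.00 spent.
AVG_BENEFIT_PER_DOLLAR = 2.70

def privacy_roi(spend: float, multiplier: float = AVG_BENEFIT_PER_DOLLAR) -> dict:
    """Estimate the benefit and net return for a given privacy spend."""
    benefit = spend * multiplier
    return {"spend": spend, "benefit": benefit, "net_return": benefit - spend}

# Hypothetical $1M annual privacy program:
result = privacy_roi(1_000_000)
print(f"benefit: ${result['benefit']:,.0f}, net return: ${result['net_return']:,.0f}")
```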

Again, based on thousands of responses, this report highlights the real ROI companies see when they make privacy a priority in their organization.

Data Privacy is Growing Across the Globe:

For years the web has been a bit of a wild west, with companies collecting consumer data and then buying and selling that data everywhere. Now, legal bodies across the globe are trying to develop laws that encourage business while protecting consumer rights.

As you can see in the map below from the World Federation of Advertisers, new regulations are being rolled out in every part of the world. That means companies cannot simply hide and operate with a business-as-usual mentality. They are all now required to ensure their policy standards meet these global regulations.

The key takeaway here is that no matter what industry you are in or where you are located, strong privacy accountability is the wave of the future, and the time to get it right is now.


In summary, celebrating privacy is important because it will shape how business gets done in the future and is a reflection of our society. We all need to embrace privacy and trust, and next year we will have an even bigger celebration!


Salesforce State of the Consumer Report:

Cisco Privacy Report:

World Federation of Advertisers:


Consumer Rights under the California Consumer Privacy Act (CCPA)

It can be argued that the General Data Protection Regulation (GDPR) put the idea of consumer privacy rights on the map. The California Consumer Privacy Act (CCPA), which passed in 2018, went into effect on January 1, 2020. It is one of the latest laws in a string of new privacy regulations sweeping the globe. To learn more about CCPA, please see a recent article from RIVN that offers some great details here.

For this article, we focus on the rights that are protected under CCPA. The really interesting part is that new privacy acts popping up in Nebraska, Florida, and Washington are all following this same playbook. Therefore, it is critically important for companies to establish a process to ensure their consumers have the ability to exercise these rights and, more importantly, to be able to react to those requests when necessary.


Here are the rights that will be enforced under CCPA:

  • The right to delete personal information held by businesses and by extension, a business’s service provider;
  • The right to know what personal information is collected, used, shared or sold, both as to the categories and specific pieces of personal information;
  • The right to opt-out of the sale of personal information. Consumers are able to direct a business that sells personal information to stop selling that information. Children under the age of 16 must provide opt-in consent, with a parent or guardian consenting for children under 13.
  • The right to non-discrimination in terms of price or service when a consumer exercises a privacy right under CCPA.

As noted in the first bullet point above, consumer deletion requests will continue to be a key part of almost all new privacy regulations. For digital marketing, finance, IT, and legal teams, the greatest hurdle may be consent management and the “Right to Erasure/Deletion” itself.
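As a purely illustrative sketch, a deletion request might be tracked with a simple record and status flow. The field names and statuses here are hypothetical, not a prescribed CCPA workflow; the 45-day window reflects CCPA's general response deadline (extendable once):

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum

class Status(Enum):
    RECEIVED = "received"
    VERIFIED = "verified"    # requester's identity confirmed
    COMPLETED = "completed"  # data deleted and service providers notified
    DENIED = "denied"        # e.g., the request could not be authenticated

@dataclass
class DeletionRequest:
    consumer_id: str
    received: date
    status: Status = Status.RECEIVED
    # CCPA generally gives a business 45 days to respond (extendable once).
    response_window: timedelta = timedelta(days=45)

    def due_date(self) -> date:
        """Date by which the business must act on the request."""
        return self.received + self.response_window

# Hypothetical request received mid-January 2020:
req = DeletionRequest("consumer-123", date(2020, 1, 15))
print(req.due_date())  # 2020-02-29 (45 days later; 2020 is a leap year)
```

Tracking requests this way makes the looming deadline, and the need to notify downstream service providers, explicit rather than ad hoc.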

Contemporary organizations are turning to a module-based solution such as RIVN, an easy-to-use, SaaS-based, single-function service that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below:

CCPA Recognition of PII and IP Address

The California Consumer Privacy Act (CCPA), which passed in 2018, went into effect on January 1, 2020. The law is one of the latest in a string of new privacy regulations sweeping the globe. To learn more about CCPA, please see a recent article from RIVN that offers some great details here.

For this article, we wanted to focus on the evolution of what is considered personally identifiable information (PII), highlighting that under CCPA even data such as an IP address is considered PII.

Understanding CCPA including PII and IP address


Below is a quick list of data points considered PII under CCPA:

  • IP address
  • Email address
  • Online handle
  • Biometric information
  • Geolocation data
  • Browsing and search history

Most people do not realize that many organizations use IP addresses for marketing and geo-targeting. An IP address is a unique string of numbers separated by periods that identifies each computer using the Internet Protocol to communicate over a network, which basically means any device that accesses the internet (for example, 192.0.2.44 is a valid IPv4 address).

So while companies are starting to feel comfortable allowing customers to access or request their data based on an identifier such as an email address, companies that fall under CCPA and similar laws also need to be ready to handle data that was traditionally considered anonymous, such as IP addresses.
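As a quick sketch using Python's standard library (illustrative only, not a compliance tool), an IP address is structured, machine-readable data that can be parsed and matched against networks, which is part of why regulators now treat it as an identifier:

```python
import ipaddress

# Parse an address from the documentation range reserved by RFC 5737,
# so this example never points at a real host.
addr = ipaddress.ip_address("192.0.2.44")
print(addr.version)  # 4

# Addresses can be matched against networks, e.g. for geo-targeting or
# for locating a requester's records when honoring a deletion request.
net = ipaddress.ip_network("192.0.2.0/24")
print(addr in net)   # True
```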

It will be critical for companies to define exactly what data they are collecting on their visitors. More importantly, they must have a process in place to manage that data, including deletion. That is where RIVN comes into play.

Contemporary organizations are turning to a module-based solution such as RIVN, an easy-to-use, SaaS-based, single-function service that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below:

Washington State Privacy Act

The state of Washington is looking to take the lead in privacy regulation by re-introducing a bill, S-4873.3, that would set a new standard for privacy in the US, called the Washington State Privacy Act (WPA). At first read, it seems to be a marriage of the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). One common theme across all of these regulations is consumer rights.


Here is a detailed overview of WPA from the team at the Future of Privacy Forum:


The Act would provide comprehensive data protections to Washington State residents and would apply to entities that (1) conduct business in Washington or (2) produce products or services targeted to Washington residents.

  • For the Act to apply to the second category of entities, they must control or process data of at least 100,000 consumers; or derive 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers (with “consumers” defined as natural persons who are Washington residents, acting in an individual or household context).
  • The Act would not apply to state and local governments or municipal corporations.
  • The Act would regulate companies that process “personal data,” defined broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person” (not including de-identified data or publicly available information, i.e., “information that is lawfully made available from federal, state, or local government records”), with specific provisions for pseudonymous data (see below, Core consumer rights).
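The applicability test above can be sketched as a small function. This is a simplified illustration of the thresholds as summarized in this overview, not legal advice; as phrased here, the numeric thresholds attach only to the second category of entities:

```python
def wpa_applies(conducts_business_in_wa: bool,
                targets_wa_residents: bool,
                consumers_processed: int,
                revenue_share_from_data_sales: float) -> bool:
    """Rough sketch of WPA applicability as summarized above."""
    if conducts_business_in_wa:
        return True
    if not targets_wa_residents:
        return False
    # Second category: large-volume threshold, OR data-sale revenue
    # combined with a smaller volume threshold.
    return (consumers_processed >= 100_000 or
            (revenue_share_from_data_sales >= 0.50 and
             consumers_processed >= 25_000))

print(wpa_applies(False, True, 30_000, 0.60))  # True: revenue + 25k consumers
print(wpa_applies(False, True, 30_000, 0.10))  # False: under both thresholds
```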

The Act would require companies to comply with basic individual rights to request access to their data, correct or amend that data, delete their data, and access it in portable format (“portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data… without hindrance, where the processing is carried out by automated means”). These rights would not be permitted to be waived in contracts or terms of service, and would be subject to certain limitations (for example, retaining data for anti-fraud or security purposes). 

Along with these core rights, the Act would also grant consumers the right to explicitly opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal, or similarly significant, effects. Such effects include the denial of financial and lending services, housing, insurance, education enrollment, employment opportunities, health care services, and more. Unlike the CCPA, the Act would not prescribe specific opt out methods (like a “Do Not Sell My Information” button on websites), but instead require that opt-out methods be “clear and conspicuous.” It would also commission a government study on the development of technology, such as a browser setting, browser extension, or global device setting, for consumers to express their intent to opt out. 

For all of these individual rights, companies are required to take action free of charge, up to twice per year, within 45-90 days (except in cases where requests cannot be authenticated or are “manifestly unfounded or excessive”). Importantly, the law would also require that companies establish a “conspicuously available” and “easy to use” internal appeals process for refusals to take action. With the consumer’s consent, the company must submit the appeal to the Washington Attorney General, along with whether any action has been taken and a written explanation of the outcome. The Attorney General must make such information publicly available on its website. When consumers make correction, deletion, or opt out requests, the Act would oblige controllers to take “reasonable steps” to notify third parties to whom they have disclosed the personal data within the preceding year.

Finally, the Act would prohibit companies from discriminating against consumers for exercising these individual rights. Such discrimination could include the denial of goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods and services.


Under the Act, companies processing “pseudonymous data” would not be required to comply with the bulk of the core individual rights (access, correction, deletion, and portability) when they are “not in a position” to identify the consumer, subject to reasonable oversight. Notably, the Act defines pseudonymous data consistently with the GDPR’s definition of pseudonymization, as “personal data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to [protect against identification].” This is also consistent with the Future of Privacy Forum’s Guide to Practical Data De-Identification. Pseudonymous data is often harder to authenticate or link to individuals, and can carry lessened privacy risks. For example, unique pseudonyms are frequently used in scientific research (e.g., in a HIPAA Limited Dataset, John Doe = 5L7T LX619Z). 

In addition, companies may refuse to comply with requests to access, correct, delete, or port data if the company: (A) is not reasonably capable of associating the request with the personal data, or it would be unreasonably burdensome to associate the request with the personal data; (B) does not use the personal data to recognize or respond to the data subject, or associate the personal data with other data about the same specific consumer; and (C) does not sell personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor (service provider). 

Importantly, other requirements of the overall bill, including Data Protection Assessments (below), and the right to Opt Out of data processing for targeted advertising, sale, and profiling (above) would still be operational for pseudonymous data.

Finally, the Act would not apply to de-identified data, defined as “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable natural person, or a device linked to such person,” subject to taking reasonable measures to protect against re-identification, including contractual and public commitments. This definition aligns with the FTC’s longstanding approach to de-identification. 


In a structure that parallels the GDPR, the Act distinguishes between data “controllers” and data “processors,” establishing different obligations for each. Almost all of the provisions of the Act involve obligations that adhere to a controller, defined as “natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data.”

Data processors, on the other hand (“natural or legal person who processes personal data on behalf of a controller”), must adhere (as service providers) to controllers’ instructions and help them meet their obligations. Notwithstanding controller instructions, processors must maintain security procedures that take into account the context in which personal data is processed; ensure that individual processors understand their duty of confidentiality; and may only engage a subcontractor once the controller has had the chance to object. At the request of the controller, processors must delete or return personal data. Processors must also aid in the creation of data protection assessments.


The Act would require companies to provide a Privacy Policy to consumers that is “reasonably accessible, clear, and meaningful,” including making the following disclosures:


  • (i) the categories of personal data processed by the controller; 
  • (ii) the purposes for which the categories of personal data are processed; 
  • (iii) how and where consumers may exercise their rights; 
  • (iv) the categories of personal data that the controller shares with third parties; and 
  • (v) the categories of third parties with whom the controller shares personal data. 

Additionally, if a controller sells personal data to third parties or processes data for certain purposes (i.e. targeted advertising), they would be required to clearly and conspicuously disclose such processing, as well as how consumers may exercise their right to opt out of such processing. 


Companies would be required under the Act to conduct confidential Data Protection Assessments for all processing activities involving personal data, and again any time there are processing changes that materially increase risks to consumers. In contrast, the GDPR requires Data Protection Impact Assessments only when profiling leads to automated decision-making having a legal or significant effect upon an individual (such as credit approval), when profiling is used for evaluation or scoring based on aspects concerning an individual’s economic situation, health, personal preferences or interests, reliability or behavior, location or movements, or when it is conducted at large-scale on datasets containing sensitive personal data.

Under the WPA, in weighing benefits against the risks, controllers must take into account factors such as reasonable consumer expectations, whether data is deidentified, the context of the processing, and the relationship between the controller and the consumer. If the potential risks of privacy harm to consumers are substantial and outweigh other interests, then the controller would only be able to engage in processing with the affirmative consent of the consumer (unless another exemption applies, such as anti-fraud measures and research). 


Companies must obtain affirmative, opt-in consent to process any “sensitive” personal data, defined as personal data revealing:

  • racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sexual orientation, or citizenship or immigration status; 
  • genetic or biometric data for the purpose of uniquely identifying a natural person; 
  • personal data from a known child; or 
  • specific geolocation data (defined as “information that directly identifies the specific location of a natural person with the precision and accuracy below 1750 ft.”)

Although the Act requires consent to process data from a “known child,” an undefined term, it notably also exempts data covered by the Family Educational Rights and Privacy Act (FERPA) and entities that are compliant with the Children’s Online Privacy Protection Act (COPPA). The Act defines a child as a natural person under age thirteen, meaning it does not follow the approach of CCPA and other bills around the country that extend child privacy protections to teenagers. 


In addition to consumer controls and individual rights, the Act would create additional obligations on companies that align with the GDPR:

  • Data Minimization & Purpose Specification – Controller’s collection of personal data must be “adequate, relevant, and limited” to what is necessary in relation to the specified and express purposes for which they are processed.
  • Reasonable Security – Appropriate to the volume and nature of the personal data at issue, controllers must establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data. 
  • Use Limitations – The Act would also create a duty to avoid secondary uses of data, absent consent, unless that processing is necessary or compatible with the specified or express purposes for which the data was initially gathered.

The obligations imposed by the Act would not restrict processing personal data for a number of specified purposes. Those exemptions include cooperating with law enforcement agencies, performing contracts, providing requested products or services to consumers, processing personal data for research, consumer protection purposes, and more. If processing falls within an enumerated exception, that processing must be “necessary, reasonable, and proportionate” in relation to a specified purpose. Controllers and processors are also not restricted from collecting, using, or retaining data for specific purposes such as conducting internal product research, improving product and service functionality, or performing internal operations reasonably aligned with consumer expectations. 


The Act would not grant consumers a private right of action. Instead, it would give the Attorney General exclusive authority to enforce the Act. The Act would cap civil penalties for controllers and processors in violation of the Act at $7,500 per violation. A “Consumer Privacy Account,” in the state treasury, would contain funds received from the imposition of civil penalties. Those funds would be used for the sole purpose of the office of privacy and data protection. The Attorney General would also be tasked with compiling a report evaluating the effectiveness of enforcement actions, and any recommendations for changes. 


In addition to its baseline requirements, the Act contains provisions specifically regulating commercial uses of facial recognition. The Act would require affirmative, opt-in consent as a default requirement, and place heightened obligations on both controllers and processors of commercial facial recognition services, particularly with respect to accuracy and auditing, with a focus on preventing unfair performance impacts. A limited exception is provided for uses such as tracking the unique number of users in a space, when data is not maintained for more than 48 hours and users are not explicitly identified.


As these regulations are finalized, companies will need support to execute on the consumer rights outlined in the WPA, and that is where RIVN can help.

Contemporary organizations are turning to a module-based solution such as RIVN, an easy-to-use, SaaS-based, single-function service that allows brands worldwide to meet business needs and be ready for what is next.

To learn more about the regulations mentioned above, please see the links below:

3 Initial Insights on CCPA

The latest high profile consumer privacy regulation called the California Consumer Privacy Act (CCPA) went into effect on January 1, 2020. As a result, social media and various publications have been buzzing about the new regulation including users’ experiences.

As background, CCPA aims to give consumers more control over their data, including how companies manage and sell it. That includes allowing consumers to request access to or deletion of their data, and to direct companies not to sell their data.

Under the new law, a company must comply with CCPA if it meets any of the following criteria: (1) generates more than $25 million in annual revenue, (2) holds more than 50,000 consumer records, or (3) derives more than 50% of its revenue from selling consumers’ personal information.
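These criteria are disjunctive, meeting any one of them triggers compliance, which can be sketched as a simple check (illustrative only, not legal advice):

```python
def ccpa_applies(annual_revenue: float,
                 consumer_records: int,
                 revenue_share_from_selling_data: float) -> bool:
    """A business must comply if ANY one of the three criteria is met."""
    return (annual_revenue > 25_000_000 or
            consumer_records > 50_000 or
            revenue_share_from_selling_data > 0.50)

# Even a small business can fall under CCPA on record count alone:
print(ccpa_applies(5_000_000, 80_000, 0.0))  # True: record-count threshold
print(ccpa_applies(5_000_000, 10_000, 0.0))  # False: no threshold met
```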

After reading the insights below, I believe everyone will agree the CCPA has empowered people to take control of how companies capture, store, and manage their data.


So, here are 3 initial insights after one full week of CCPA:


CCPA Is Huge On Social Media

CCPA is having a larger social impact than anticipated. While some companies appear prepared for CCPA, many others are either not prepared or are taking a stance of non-compliance. The most surprising impact of CCPA has been the groundswell of regular people fully documenting their experiences with various brands in regards to CCPA.

California citizens are creating repositories on their own to make it easy for others to submit data access and deletion requests, such as this one here.

Also, individuals are documenting how huge companies such as Facebook or OpenTable are simply denying consumer requests for access or deletion of their data, for now, as seen below.

Here is an example of OpenTable denying a do-not-sell request from one of the co-authors of CCPA, Mary Stone Ross (@MarySRoss18 on Twitter):

Here is an example of a Twitter user @ampersand_ie reporting back on Facebook denying deletion requests under CCPA:

CCPA is very different from the General Data Protection Regulation (GDPR)

In contrast to GDPR, CCPA has been very visible across the web. GDPR was highly visible within the privacy community and in Europe, but it heavily focused on consent. While CCPA does have a consent component, it is highly focused on consumer data access and deletion rights, along with the sale of consumer data.

Many consumers have seen the impact of CCPA directly in communications aimed at them, most specifically in their inboxes. You may have noticed several of the companies you subscribe to recently sending email updates about changes to their privacy policies.

These are directly associated with the anticipation of CCPA. While enforcement of CCPA does not begin until July 1, 2020, responsible companies are preparing now. This will continue to rise, along with the use of a “Do Not Sell” button, which should become a staple on most sites.

Here is an example from Pottery Barn Kids:

The cost of CCPA will be great and teams will need to work closely together

The total cost of any regulation is difficult to estimate. However, a recent article from Bloomberg estimated that CCPA alone will cost companies $55 billion.

At these levels, companies’ internal teams will need to work in harmony. With executive oversight, the teams most affected by CCPA have been marketing, IT, legal, and finance.

  • Marketing Teams – These teams have been responsible for creating messaging to assure consumers about compliance changes, seen in banner ads and emails.
  • IT Teams – These teams have been required to audit technology stacks and implement new compliance solutions where required.
  • Legal Teams – Most legal teams have been required to get up to speed on marketing and analytics processes to ensure corporate compliance.
  • Finance Teams – Financial organizations have been required to take new liabilities into account and allocate resources to ensure corporate compliance.

In summary, the last week has been very interesting. Over the next several weeks and months, companies will need to be vigilant and flexible to ensure they are not only meeting the new regulation but also meeting social expectations. Very exciting times!

To learn how RIVN can help please visit

Australia Set to Police Google and Facebook

Australia is trying to learn from other countries across the globe, including those in Europe and the US, to develop a framework that will help protect its citizens’ privacy while monitoring big tech companies. A recent Reuters article found here highlights Australia’s strategy, which includes tasking the Australian Competition and Consumer Commission (ACCC), “…the antitrust watchdog, to scrutinize how the companies used algorithms to match advertisements with viewers, giving them a stronghold on the main income generator of media operators.”

This commission would be focused on Facebook and Google in particular.

Here is an excerpt from the article, reporting that the government pledged to “lift the veil” on the closely guarded algorithms the firms use to collect and monetize users’ data, and accepted the ACCC’s “overriding conclusion that there is a need for reform”.

The article goes on to discuss five existing investigations into the big tech companies, with more to come.

To learn more about the regulations mentioned above, please see the links below: