Data Privacy Day – 3 Data Privacy Facts

Data Privacy Day is here! Why is this day so exciting? Companies are finally realizing that privacy can deliver several benefits, including the following:

  • Building brand trust
  • Driving a positive ROI
  • Staying ahead of the global privacy trend

Because of these key factors, companies large and small are making privacy a major focus in 2020. Companies are implementing new policies, procedures, and technologies to ensure compliance and to stay competitive in this global market.

Here are three reasons why everyone should celebrate Data Privacy Day:

Privacy Drives Transparency and Increases Brand Loyalty:

Consumers are now realizing the true power of having their data collected by companies across the globe. Wise consumers are choosing to interact with brands that offer transparency when it comes to customer data, and with the ones that deliver exceptional experiences by leveraging that same information.

A recent Salesforce report (see Resources below) suggests that 76% of consumers expect companies to understand their needs and provide a custom experience. That only happens when consumers are willing to share their data with these companies. In fact, according to the same report, 92% of consumers are more likely to trust companies that are transparent about the purpose of capturing their data.

In short, companies that are transparent about capturing data will reap the benefit of brand trust if they use that data responsibly and offer a personalized experience.

Data Privacy Drives ROI:

Some companies fall into the trap of thinking privacy is only a cost center. These same companies see new regulations as just another hoop to jump through instead of an opportunity to improve systems, increase customer satisfaction, and in turn increase ROI.

Cisco recently published a data privacy report focused on hard ROI numbers (see Resources below). What is really interesting about this report is that, for the first time, it puts a hard ROI number on privacy accountability. The report found that, on average, companies realize $2.70 in benefits for every $1 spent on privacy, and that number goes up for larger organizations. The benefits show up in top-line revenue, increased brand trust, fewer data breaches, and shorter sales cycles, to name a few areas.

Again, this report, based on thousands of responses, highlights the real ROI companies see when they make privacy a priority in their organization.

Data Privacy is Growing Across the Globe:

For years the web has been a bit of a wild west, with companies collecting data on consumers and then buying and selling that data everywhere. Now, legislative bodies across the globe are trying to develop laws that encourage business while protecting consumer rights.

As you can see in the map below from the World Federation of Advertisers, new regulations are being rolled out in every part of the world. That means companies cannot simply hide and operate with a business-as-usual mentality. They are all now required to ensure their policies meet these global regulations.

The key takeaway here is that no matter what industry you are in or where you are located, strong privacy accountability is the wave of the future, and the time to get it right is now.

Source: https://wfanet.org/knowledge/item/2020/01/14/WFA-Global-Privacy-Map

In summary, celebrating privacy is important because it will shape how business gets done in the future and is a reflection of our society. If we all embrace privacy and trust, next year we will have an even bigger celebration!

Resources:

Salesforce State of the Consumer Report: https://www.salesforce.com/blog/2019/04/customer-loyalty-data-privacy-trust.html

Cisco Privacy Report: https://www.cisco.com/c/dam/en/us/products/collateral/security/2020-data-privacy-cybersecurity-series-jan-2020.pdf

World Federation of Advertisers:

https://wfanet.org/knowledge/item/2020/01/14/WFA-Global-Privacy-Map

RIVN:

www.rivn.com

Consumer Rights under the California Consumer Privacy Act (CCPA)

It can be argued that the General Data Protection Regulation (GDPR) put the idea of consumer privacy rights on the map. The California Consumer Privacy Act (CCPA), which passed in 2018, went into effect on January 1, 2020. It is one of the latest laws in a string of new privacy regulations sweeping the globe. To learn more about the CCPA, please see a recent article from RIVN that offers some great details here.

For this article, we focus on the rights that are protected under the CCPA. The really interesting part about these rights is that new privacy bills popping up in Nebraska, Florida, and Washington are all following this same playbook. It is therefore critically important for companies to establish a process that ensures consumers can exercise these rights and, more importantly, that the company can respond to them when necessary.

CCPA

Here are the rights that will be enforced under the CCPA:

  • The right to delete personal information held by businesses and by extension, a business’s service provider;
  • The right to know what personal information is collected, used, shared or sold, both as to the categories and specific pieces of personal information;
  • The right to opt-out of the sale of personal information. Consumers are able to direct a business that sells personal information to stop selling that information. Children under the age of 16 must provide opt-in consent, with a parent or guardian consenting for children under 13.
  • The right to non-discrimination in terms of price or service when a consumer exercises a privacy right under CCPA.

As noted in the first bullet point above, consumer deletion requests will continue to be a key part of almost all new privacy regulations. For digital marketers, finance and IT professionals, and legal teams, the greatest hurdle may be consent management and the “Right to Erasure/Deletion” itself.
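As an illustration of what operationalizing these rights can look like, here is a minimal sketch of a consumer-request record with a response-deadline calculation. The class and field names are hypothetical; the 45-day window (extendable once by another 45 days) reflects the CCPA's response-timing requirement.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum

class RequestType(Enum):
    DELETE = "delete"    # right to delete
    KNOW = "know"        # right to know what is collected, shared, or sold
    OPT_OUT = "opt_out"  # right to opt out of the sale of personal data

@dataclass
class ConsumerRequest:
    """Hypothetical intake record for a CCPA consumer request."""
    consumer_id: str
    request_type: RequestType
    received: date

    def response_deadline(self, extended: bool = False) -> date:
        # CCPA: respond within 45 days, extendable once by another 45.
        return self.received + timedelta(days=90 if extended else 45)

req = ConsumerRequest("c-1001", RequestType.DELETE, date(2020, 1, 1))
print(req.response_deadline())               # 2020-02-15
print(req.response_deadline(extended=True))  # 2020-03-31
```

In practice a record like this would also carry verification status, since requests that cannot be authenticated fall outside the response obligation.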

Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.


Insights on Google Sandbox from RIVN

Google, a subsidiary of Alphabet, is one of the greatest company success stories of all time. While the company has diversified into various sectors, including self-driving cars, maps, and televisions, its advertising business has always been the financial lifeblood of the company.

Google recently announced the concept of a Privacy Sandbox, which it would manage. Ad Exchanger recently covered this announcement, and the full article can be found here (https://adexchanger.com/privacy/whats-in-googles-privacy-sandbox-nothing-for-now/). The major headline is that Google is planning to phase out third-party cookies by 2022, which puts billions of marketing dollars potentially at risk. At a high level, the Privacy Sandbox would be a collection of web-browser-based APIs, versus the cookie code snippets we have today.

So, here are a few insights:

Google is the major player in digital spend

Google is a major player in the digital spend market. In fact, based on a recent report, the company controls almost 40% of digital ad spending by itself, as noted below. With such a strong hand in the marketplace already, it begs the question: could Google be creating a monopoly?

Source: https://marketingland.com/almost-70-of-digital-ad-spending-going-to-google-facebook-amazon-says-analyst-firm-262565

Can privacy lead to a monopoly?

Google is standing firm that it is not trying to restrict competition but rather to create a level playing field that allows developers across the globe to participate in the creation of the sandbox and support the creation of the browser APIs, which can be found on the Privacy Sandbox GitHub.

So, it does seem that Google is making an effort to ensure that fingerprinting is removed across the board and privacy access is equal. But can the fox be trusted to guard the hen house? Only time will tell.

Is Measurement Doomed?

One key area outside of digital advertising that everyone is also concerned about is how conversions and other traditional tracking will be measured. Right now, Google is focused on testing and says it has already completed a few tests with clients calling a Google API to receive a specific value to determine a conversion event, rather than relying on cookies. It will be interesting to see how historical reporting could be affected by this completely new methodology.

In summary, it is still very early in this process, but a universal standard for tracking and regulatory requirements will help move this initiative, or any other, forward. We need our legal bodies to create a framework that allows for commerce but still protects user privacy. Everyone can agree, though, that privacy regulations and standards will shape how businesses operate in this new decade.

CCPA Recognition of PII and IP Address

The California Consumer Privacy Act (CCPA), which passed in 2018, went into effect on January 1, 2020. The law is one of the latest in a string of new privacy regulations sweeping the globe. To learn more about the CCPA, please see a recent article from RIVN that offers some great details here.

For this article, we focus on the evolution of what is considered Personally Identifiable Information (PII), and in particular on the fact that under the CCPA even data such as an IP address is considered PII.

Understanding CCPA, including PII and IP address

Below is a quick list of data points considered PII under CCPA:

  • IP address
  • Email address
  • Online handle
  • Biometric information
  • Geolocation data
  • Browsing and search history

Most people do not realize that many organizations use IP addresses for marketing and geo-targeting. An IP address is a unique string of numbers separated by periods that identifies each computer using the Internet Protocol to communicate over a network, which basically means anyone that accesses the internet. Here is an example of an IP address: 216.3.128.12.
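Since IP addresses are now squarely in scope for privacy requests, it helps to treat them as structured data rather than opaque strings. A small sketch using Python's standard ipaddress module, parsing the example address from the text above:

```python
import ipaddress

# Parse the example address from the text above.
addr = ipaddress.ip_address("216.3.128.12")

print(addr.version)     # 4 -- an IPv4 address
print(addr.is_private)  # False -- publicly routable, so plausibly linkable to a person
print(int(addr))        # the underlying 32-bit integer form, handy for range lookups
```

Private-range addresses (e.g., 192.168.x.x) identify a machine only inside a local network, while public addresses like this one can be tied to an internet subscriber, which is why regulators treat them as personal information.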

So, while companies are starting to feel comfortable allowing customers to access or request their data based on an identifier such as an email address, companies that fall under the CCPA and similar laws also need to be ready to handle data that was traditionally considered anonymous, such as IP addresses.

It will be critical for companies to define exactly what data they are collecting on their visitors. More importantly, they must have a process in place to manage that data, including deletion. That is where RIVN comes into play.

Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.


Washington State Privacy Act

The state of Washington is looking to take the lead in privacy regulation by re-introducing a bill, S-4873.3, that would set a new standard for privacy in the US, called the Washington State Privacy Act (WPA). At first read, it seems to be a marriage of the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). What does seem to be a common theme across all of these regulations is consumer rights.

Washington State Privacy Act (WPA)

Here is a detailed overview of WPA from the team at the Future of Privacy Forum:

1. JURISDICTIONAL AND MATERIAL SCOPE

The Act would provide comprehensive data protections to Washington State residents and would apply to entities that 

1) conduct business in Washington or 

2) produce products or services targeted to Washington residents. 

  • For the Act to apply to the second category of entities, they must control or process data of at least 100,000 consumers; or derive 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers (with “consumers” defined as natural persons who are Washington residents, acting in an individual or household context).
  • The Act would not apply to state and local governments or municipal corporations.
  • The Act would regulate companies that process “personal data,” defined broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person” (not including de-identified data or publicly available information, i.e., “information that is lawfully made available from federal, state, or local government records”), with specific provisions for pseudonymous data (see below, Core consumer rights).

2. INDIVIDUAL RIGHTS TO ACCESS, CORRECT, DELETE, PORT, AND OPT-OUT OF DATA PROCESSING

The Act would require companies to comply with basic individual rights to request access to their data, correct or amend that data, delete their data, and access it in portable format (“portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data… without hindrance, where the processing is carried out by automated means”). These rights would not be permitted to be waived in contracts or terms of service, and would be subject to certain limitations (for example, retaining data for anti-fraud or security purposes). 

Along with these core rights, the Act would also grant consumers the right to explicitly opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal, or similarly significant, effects. Such effects include the denial of financial and lending services, housing, insurance, education enrollment, employment opportunities, health care services, and more. Unlike the CCPA, the Act would not prescribe specific opt out methods (like a “Do Not Sell My Information” button on websites), but instead require that opt-out methods be “clear and conspicuous.” It would also commission a government study on the development of technology, such as a browser setting, browser extension, or global device setting, for consumers to express their intent to opt out. 

For all of these individual rights, companies are required to take action free of charge, up to twice per year, within 45-90 days (except in cases where requests cannot be authenticated or are “manifestly unfounded or excessive”). Importantly, the law would also require that companies establish a “conspicuously available” and “easy to use” internal appeals process for refusals to take action. With the consumer’s consent, the company must submit the appeal to the Washington Attorney General, along with whether any action has been taken and a written explanation of the outcome. The Attorney General must make such information publicly available on its website. When consumers make correction, deletion, or opt-out requests, the Act would oblige controllers to take “reasonable steps” to notify third parties to whom they have disclosed the personal data within the preceding year.

Finally, the Act would prohibit companies from discriminating against consumers for exercising these individual rights. Such discrimination could include the denial of goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods and services.

3. OBLIGATIONS FOR DE-IDENTIFIED AND PSEUDONYMOUS DATA

Under the Act, companies processing “pseudonymous data” would not be required to comply with the bulk of the core individual rights (access, correction, deletion, and portability) when they are “not in a position” to identify the consumer, subject to reasonable oversight. Notably, the Act defines pseudonymous data consistently with the GDPR’s definition of pseudonymization, as “personal data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to [protect against identification].” This is also consistent with the Future of Privacy Forum’s Guide to Practical Data De-Identification. Pseudonymous data is often harder to authenticate or link to individuals, and can carry lessened privacy risks. For example, unique pseudonyms are frequently used in scientific research (e.g., in a HIPAA Limited Dataset, John Doe = 5L7T LX619Z). 

In addition, companies may refuse to comply with requests to access, correct, delete, or port data if the company: (A) is not reasonably capable of associating the request with the personal data, or it would be unreasonably burdensome to associate the request with the personal data; (B) does not use the personal data to recognize or respond to the data subject, or associate the personal data with other data about the same specific consumer; and (C) does not sell personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor (service provider). 

Importantly, other requirements of the overall bill, including Data Protection Assessments (below), and the right to Opt Out of data processing for targeted advertising, sale, and profiling (above) would still be operational for pseudonymous data.

Finally, the Act would not apply to de-identified data, defined as “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable natural person, or a device linked to such person,” subject to taking reasonable measures to protect against re-identification, including contractual and public commitments. This definition aligns with the FTC’s longstanding approach to de-identification. 

4. OBLIGATIONS OF PROCESSORS (SERVICE PROVIDERS)

In a structure that parallels the GDPR, the Act distinguishes between data “controllers” and data “processors,” establishing different obligations for each. Almost all of the provisions of the Act involve obligations that adhere to a controller, defined as “natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data.”

Data processors, on the other hand, “natural or legal person who processes personal data on behalf of a controller,” must adhere (as service providers) to controllers’ instructions and help them meet their obligations. Notwithstanding controller instructions, processors must maintain security procedures that take into account the context in which personal data is processed; ensure that individual processors understand their duty of confidentiality, and may only engage a subcontractor once the controller has had the chance to object. At the request of the controller, processors must delete or return personal data. Processors must also aid in the creation of data protection assessments.

5. TRANSPARENCY (PRIVACY POLICIES)

The Act would require companies to provide a Privacy Policy to consumers that is “reasonably accessible, clear, and meaningful,” including making the following disclosures:

  • (i) the categories of personal data processed by the controller; 
  • (ii) the purposes for which the categories of personal data are processed; 
  • (iii) how and where consumers may exercise their rights; 
  • (iv) the categories of personal data that the controller shares with third parties; and 
  • (v) the categories of third parties with whom the controller shares personal data. 

Additionally, if a controller sells personal data to third parties or processes data for certain purposes (i.e. targeted advertising), they would be required to clearly and conspicuously disclose such processing, as well as how consumers may exercise their right to opt out of such processing. 

6. DATA PROTECTION ASSESSMENTS

Companies would be required under the Act to conduct confidential Data Protection Assessments for all processing activities involving personal data, and again any time there are processing changes that materially increase risks to consumers. In contrast, the GDPR requires Data Protection Impact Assessments only when profiling leads to automated decision-making having a legal or significant effect upon an individual (such as credit approval), when profiling is used for evaluation or scoring based on aspects concerning an individual’s economic situation, health, personal preferences or interests, reliability or behavior, location or movements, or when it is conducted at large-scale on datasets containing sensitive personal data.

Under the WPA, in weighing benefits against the risks, controllers must take into account factors such as reasonable consumer expectations, whether data is deidentified, the context of the processing, and the relationship between the controller and the consumer. If the potential risks of privacy harm to consumers are substantial and outweigh other interests, then the controller would only be able to engage in processing with the affirmative consent of the consumer (unless another exemption applies, such as anti-fraud measures and research). 

7. SENSITIVE DATA 

Companies must obtain affirmative, opt-in consent to process any “sensitive” personal data, defined as personal data revealing:

  • racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sexual orientation, or citizenship or immigration status; 
  • genetic or biometric data for the purpose of uniquely identifying a natural person; 
  • personal data from a known child; or 
  • specific geolocation data (defined as “information that directly identifies the specific location of a natural person with the precision and accuracy below 1750 ft.”)

Although the Act requires consent to process data from a “known child,” an undefined term, it notably also exempts data covered by the Family Educational Rights and Privacy Act (FERPA) and entities that are compliant with the Children’s Online Privacy Protection Act (COPPA). The Act defines a child as a natural person under age thirteen, meaning it does not follow the approach of CCPA and other bills around the country that extend child privacy protections to teenagers. 
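The 1750 ft precision threshold in the sensitive-data definition is concrete enough to reason about numerically. As a rough, illustrative sketch (not legal guidance): 1750 ft is about 533 m, and one degree of latitude is about 111 km, so rounding coordinates to two decimal places leaves roughly 555 m of latitude uncertainty around the true point, coarser than the threshold. The function name and coordinates below are hypothetical.

```python
# Illustrative only: coarsen coordinates so stored precision stays above
# the WPA's 1750 ft (~533 m) threshold for "specific geolocation data".
# Rounding to 2 decimal places keeps at least ~0.005 degrees (~555 m) of
# latitude uncertainty; longitude cells shrink toward the poles, so a
# real system would need a latitude-aware check.
def coarsen(lat: float, lon: float, places: int = 2) -> tuple:
    return (round(lat, places), round(lon, places))

print(coarsen(47.60621, -122.33207))  # (47.61, -122.33)
```

Whether such coarsening actually takes data out of the “specific geolocation” category would be a legal question, but it shows how a numeric threshold in a statute translates directly into an engineering requirement.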

8. COLLECTION, PROCESSING, AND USE LIMITATIONS

In addition to consumer controls and individual rights, the Act would create additional obligations on companies that align with the GDPR:

  • Data Minimization & Purpose Specification – Controller’s collection of personal data must be “adequate, relevant, and limited” to what is necessary in relation to the specified and express purposes for which they are processed.
  • Reasonable Security – Appropriate to the volume and nature of the personal data at issue, controllers must establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data. 
  • Use Limitations – The Act would also create a duty to avoid secondary uses of data, absent consent, unless that processing is necessary or compatible with the specified or express purposes for which the data was initially gathered.

The obligations imposed by the Act would not restrict processing personal data for a number of specified purposes. Those exemptions include cooperating with law enforcement agencies, performing contracts, providing requested products or services to consumers, processing personal data for research, consumer protection purposes, and more. If processing falls within an enumerated exception, that processing must be “necessary, reasonable, and proportionate” in relation to a specified purpose. Controllers and processors are also not restricted from collecting, using, or retaining data for specific purposes such as conducting internal product research, improving product and service functionality, or performing internal operations reasonably aligned with consumer expectations. 

9. ENFORCEMENT

The Act would not grant consumers a private right of action. Instead, it would give the Attorney General exclusive authority to enforce the Act. The Act would cap civil penalties for controllers and processors in violation of the Act at $7,500 per violation. A “Consumer Privacy Account,” in the state treasury, would contain funds received from the imposition of civil penalties. Those funds would be used for the sole purpose of the office of privacy and data protection. The Attorney General would also be tasked with compiling a report evaluating the effectiveness of enforcement actions, and any recommendations for changes. 

10. COMMERCIAL FACIAL RECOGNITION

In addition to its baseline requirements, the Act contains provisions specifically regulating commercial uses of facial recognition. The Act would require affirmative, opt in consent as a default requirement, and place heightened obligations on both controllers and processors of commercial facial recognition services, particularly with respect to accuracy and auditing, with a focus on preventing unfair performance impacts. A limited exception is provided for using this technology for uses such as to track the unique number of users in a space, when data is not maintained for more than 48 hours and users are not explicitly identified.

Source: https://fpf.org/2020/01/13/its-raining-privacy-bills-an-overview-of-the-washington-state-privacy-act-and-other-introduced-bills/

As these regulations are finalized, companies will need support to execute on the consumer rights outlined in the WPA, and that is where RIVN can help.

Contemporary organizations are searching for a module-based solution such as RIVN to meet this need: an easy-to-use, single-function SaaS offering that allows brands worldwide to meet business needs and be ready for what is next.
