Fifty Shades of Consumer Health Data: How a Risk-Based Approach Provides More Clarity

Jun 14, 2023 by Divya Sridhar, Ph.D., Director, Privacy Initiatives, BBB National Programs

As previously discussed, the FTC’s recent efforts to build up a record about the problematic activities happening in the overlapping digital consumer health and ad tech spaces are a laudable plan of action, likely to generate both positive outcomes and unintended negative consequences in the near term. 

On one hand, we know that industry will need to act in an agile manner, reviewing the themes from the enforcement actions and the Washington My Health, My Data Act to determine the stronger protections it will need to have in place regarding the treatment, processing, sharing, and sale of consumer health data. On the flip side, in the absence of guidance and clarity about where the lines are drawn, industry faces the challenge of reading between the lines and extrapolating a commonsense path forward. 

Rather than stand by idly, letting the regulatory activity set off alarm bells – leading to companies barring data processing in certain states, creating distinct consumer experiences within individual states, or halting the marketing, research, development, and product deployment in the consumer health space – industry can look to third-party accountability agents such as BBB National Programs to provide privacy compliance expertise, clarity about potential priorities and next steps, and governance support to shape industry’s plans to reconcile all of the regulatory activities in the fast-changing consumer health privacy environment. 


Is there a way to categorize the risk of consumer health data?

We have pulled together a brief list of routine examples of consumer health information that, at face value, may appear to carry one level of risk. But depending on the context and risk associated with the use of that data, whether it is combined with other data sources and data elements, and whether it is made available in the public domain, it could attract differing levels of regulation and enforcement activity. 

In addition, we have categorized some “prohibited” use cases of sensitive consumer health data. These reflect findings from recent FTC enforcement actions and the guidelines developed by the industry thought leaders behind the digital advertising self-regulatory principles, including their perspective on the intersection of consumer health and advertising.

Diagram: Categorizing Sensitive Consumer Health Data

Diagram: Prohibitions on Sensitive Consumer Health Data

The data in these diagrams should be further reviewed for its risk-based nature, the appropriate consumer consent being gathered, and any actual harm it brings to the consumer based on a variety of factors, including:

  1. The risk of the data’s use case;
  2. The primary purpose for which the data is collected and the use of the data for one or more secondary purposes (and appropriate consents obtained, particularly in the realm of third-party advertising or tracking);
  3. Whether the data is combined with other data elements in a “higher risk” category; and
  4. Context in which the data is used, including use of the data in a sensitive health-related location, such as a reproductive healthcare clinic. 


Other factors, such as how the data is retained and how securely it is stored, also matter and need to be taken into consideration when determining the data’s level of sensitivity. 
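Taken together, the factors above amount to a rough triage exercise. The sketch below is purely illustrative: the field names, weights, and tier cutoffs are assumptions invented for this example, not drawn from any statute, enforcement action, or accountability program.

```python
from dataclasses import dataclass

# Hypothetical illustration only: all weights and tiers below are assumptions.

@dataclass
class DataUse:
    use_case_risk: int               # factor 1: risk of the use case (0=low .. 2=high)
    secondary_use_consented: bool    # factor 2: consent obtained for secondary purposes
    combined_with_higher_risk: bool  # factor 3: combined with "higher risk" data elements
    sensitive_context: bool          # factor 4: e.g., a reproductive healthcare clinic
    securely_stored: bool            # retention and secure storage also matter

def sensitivity_tier(d: DataUse) -> str:
    """Roll the factors up into a coarse tier for triage purposes."""
    score = d.use_case_risk
    if not d.secondary_use_consented:
        score += 2
    if d.combined_with_higher_risk:
        score += 2
    if d.sensitive_context:
        score += 2
    if not d.securely_stored:
        score += 1
    if score >= 5:
        return "high"
    if score >= 2:
        return "elevated"
    return "low"
```

Under this sketch, a low-risk use case with proper consent and secure storage lands in the “low” tier, while the same data combined with higher-risk elements or used in a sensitive location climbs quickly, which is the core of a risk-based rather than blanket approach.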

The bottom line is that, all things being equal, not all consumer health data should be viewed the same – there are clearly differing levels of risk associated with general purpose data use, its primary vs. secondary purpose, the risk profile or harms it generates, and more.


How can a risk-based framework specific to consumer health data support regulators?

Enforcement agencies have limited capacity to enforce the wide range of laws and regulations now manifesting in the patchwork of state consumer privacy laws. 

In the context of consumer health, companies need guidance that explains and refines best practices for handling sensitive consumer health data, incorporating examples of where these data elements would likely fall into scope, as well as contexts in which, even when the data is in scope, it presents such a low risk profile that it would be unnecessary to closely enforce the preliminary expectations set out in the Washington My Health, My Data Act or similar laws. 

To supplement this guidance, industry accountability programs that review, reinforce, and verify these best practices can provide an early review and a potential backstop for enforcement.

The idea is not for the United States to follow in the footsteps of the European Union and other countries, but instead to enshrine a reasonable consumer standard for privacy law and demonstrate that accountability matters. More robust data protection can be achieved by identifying the data’s risk profile, coupled with actual harms imposed, rather than hypothetical expectations of what could potentially cause or predict harm.

Concurrently, industry should obtain the appropriate levels of consent, fulfill its duty of care toward all consumer health data processed, and take reasonable steps – including incorporating impact assessments and risk mitigation techniques – to ensure the highest standard of protections for all consumer health data.

In the age of wearables, wellness apps, generative AI, the metaverse, and other quickly changing “moving targets” of the emerging tech world and its enthusiastic consumers, we must seek to build reasonable frameworks around data use based on its risk profile rather than focusing on a blanket approach that will chill innovation and limit new markets in research and development. 

Otherwise, we limit the potential to meet admirable goals, whether it is to crowdsource a cure for cancer, make consumer health applications and services accessible to special needs populations, or other innovations that are now baked into the 247-year-old pursuit of the American dream.

For a related perspective, read Consumer health data: A risk-based approach to digital privacy at IAPP.org.
