Protecting Consumer Health Data Privacy Beyond HIPAA

Jun 15, 2022 by Eric D. Reicin, President & CEO, BBB National Programs

If you look at the apps on your phone, chances are you have at least one related to your health—and probably several. Whether it is a mental health app, a fitness tracker, a connected health device, or something else, many of us are taking advantage of this technology to keep better track of our health in some shape or form. Recent research from the Organization for the Review of Care and Health Applications found that 350,000 health apps were available on the market, 90,000 of which launched in 2020 alone.

While these apps have a great deal to offer, it is not always clear how the personal information we input is collected, safeguarded, and shared online. Existing health privacy law, such as the Health Insurance Portability and Accountability Act (HIPAA), is primarily focused on the way hospitals, doctors’ offices, clinics, and insurance companies store health records online. The health information that these apps and health data tracking wearables collect typically does not receive the same legal protections.

 

Why This Is Potentially Troubling

Without additional protections in place, companies may share (and potentially monetize) personal health information in a way consumers may not have authorized or anticipated. As Sara Morrison explains in Recode, “The app economy is based on tracking app users and making inferences about their behavior to target ads to them. ... That means health apps collect data that we consider to be our most sensitive and personal but may not protect it as well as they should.”

Take GoodRx, for example—an app that helps users save money on prescription drugs by finding price comparisons and coupons. While this app was helping millions of people save money, in early 2020 Consumer Reports found that GoodRx was sharing users’ personal details with tech and marketing companies. And some of that data was shared further. The company has made changes since then.

More recently, in 2021, Flo Health faced a Federal Trade Commission (FTC) investigation. The FTC alleged in a complaint that “despite express privacy claims, the company took control of users’ sensitive fertility data and shared it with third parties—a broken promise that left consumers feeling ‘outraged,’ ‘victimized’ and ‘violated.’” Flo Health and the FTC settled the matter with a Consent Order requiring the company to get app users’ express affirmative consent before sharing their health information as well as to instruct the third parties to delete the data they had obtained.

 

Current Landscape Of Health Data Protections

Section 5 of the FTC Act empowers the FTC to initiate enforcement action against unfair or deceptive acts, meaning the FTC can only act after the fact if a company’s privacy practices are misleading or cause unjustified consumer harm. While the FTC is doing what it can to ensure apps are keeping their promises to consumers around the handling of their sensitive health information, the rate at which these health apps are hitting the market demonstrates just how immense a challenge this is.

The FTC chair is speaking out on this issue. In April, during her first public remarks on privacy issues since becoming chair last year, Lina Khan said that the agency would continue to use its existing statutory authorities and its power to police unfair and deceptive data practices to "take swift and bold action" against companies that misuse or fail to adequately secure consumers' personal information.

As to the prospects for federal legislation, commentators suggest that comprehensive federal privacy legislation seems unlikely in the short term. States have begun implementing their own solutions to shore up protections for consumer-generated health data. California has been at the forefront of state privacy efforts, first with the California Consumer Privacy Act (CCPA) of 2018, and more recently by establishing the California Privacy Rights Act (CPRA). Virginia, Colorado, and Utah have also recently passed state consumer data privacy legislation, and other states are considering legislation as well.

 

The Path Forward

Recently, my organization was selected to house and administer a self-regulatory program implementing the Consumer Privacy Framework for Health Data, released by the Executives for Health Innovation (EHI) and the Center for Democracy and Technology in February 2021.

I think the most critical step for many businesses is to recognize that they are collecting health data and to become familiar with the legal potholes that exist in the landscape. These companies must have robust privacy practices in place and should always err on the side of caution.

For instance, when collecting and sharing consumer health information of any kind, carefully consider whether your privacy practices require opt-in consent. With a lack of clear guidance on certain non-HIPAA data collection and use, choosing an opt-out model may have negative downstream effects for your organization.

Be specific about the data you are collecting. For example, the Digital Advertising Alliance (DAA) Self-Regulatory Principles require opt-in consent for the collection of data regarding “pharmaceutical prescriptions” or “medical records.” (Disclosure: BBB National Programs’ Digital Advertising Accountability Program serves as an accountability agent for DAA, and we are compensated for our work.) And the FTC, in providing privacy best practices for mobile health app developers, also indicates that apps or devices collecting health data should get a user’s “affirmative express consent” before collecting or sharing that data.

In addition to proper consent procedures, companies collecting health data should ensure that their apps and devices that collect consumer health information comply with the FTC’s Health Breach Notification Rule.

While the pandemic certainly contributed to an increased reliance on technology to track personal health data, the use of digital technology to help us stay in tune with our health is not likely to slow down. With any new technology, there is always a period of assessment by the market and a watchful eye cast by regulators. By building in greater privacy protections from the outset, companies can avoid having to make changes down the road in response to future regulation.

Originally published in Forbes.
