
A confusing patchwork of rules governs the privacy and security of health information in the United States. The Health Insurance Portability and Accountability Act, for example, applies only to certain “covered entities” holding “protected health information.” Student health records are generally governed by federal student privacy laws, while commercial health apps and wearable devices operate under an entirely different set of rules.

This complicated legal framework takes its toll on average Americans. As a privacy lawyer at the Center for Democracy and Technology, I constantly grapple with understanding which privacy and security standards apply or even whether certain health data are protected by law. The Department of Health and Human Services has described the situation as one where “large gaps in policies around access, security, and privacy continue, and confusion persists among both consumers and innovators.”


The European Union has taken a different approach. It now has a single regulation, the General Data Protection Regulation, that applies across the board. The GDPR, which went into effect at the end of May, gives EU residents more control over their personal data. It also makes clear that health information is sensitive regardless of whether it is generated by a wearable, inferred by an app or platform, or conveyed to a medical professional.

Rather than see merit in this idea, U.S. policymakers and industry players have instead criticized the GDPR on the grounds that it will somehow harm health care. The U.S. secretary of commerce recently insisted without evidence that European law will stop lifesaving drugs from coming to market. Others with a stake in the industry have suggested that the GDPR’s limits on data sharing will actually hurt people seeking medical care.

Critics of the GDPR are too quick to suggest that the regulation demands explicit patient consent for any access to health information. They worry that consent requirements will get in the way of treatment, or somehow bury hospitals and medical researchers in paperwork. But this concern is overblown and not in line with what the GDPR actually says.


Under the regulation, an entity must have a “legal basis” for processing health data, which does not always require explicit consent. Like HIPAA, the GDPR allows health data to be processed for medical treatment or health system management. European hospitals do not need consent to collect and use health data when doing so is necessary for basic health care, medical diagnosis, health system management, broader public health, or ensuring the safety and quality of health care and of medical products and devices. Scientific research conducted with appropriate safeguards is another area where collecting and using biometric or health data does not require explicit consent. Medical researchers argue that, for research done in the public interest, the GDPR will have a minimal impact on biomedical work.

None of the GDPR’s requirements should surprise any company handling health information on either side of the Atlantic. Many of its requirements are comparable to privacy and security rules found in HIPAA. In fact, HIPAA’s rules are arguably more rigorous than what is provided for in the GDPR. European law does not reinvent the privacy wheel.

Misleading broadsides against the GDPR are unfortunate because the U.S. could benefit from having a single, comprehensive health privacy law. Industry groups recognize that companies need more guidance. In 2015, the Consumer Technology Association put forward a set of best practices for the privacy and security of personal wellness devices, which HHS highlighted in a report identifying gaps in American health privacy laws. The big problem with these guidelines: No company ever adopted them.

Instead, industry groups have mobilized to fight even modest privacy regulations. Trade associations derailed efforts by the Department of Commerce in 2015 to create a self-regulatory code of conduct around biometric facial recognition, a technology that anyone who has seen the movie “Minority Report” ought to be concerned about. Last year, California lawmakers considered requiring internet service providers to get consent before monetizing browsing information related to medical conditions and health care; industry groups killed the proposal by insisting that asking permission before using some data would expose children to “pop-ups and hackers” and leave the internet vulnerable to a “terror attack.”

Regulatory efforts to protect data are easily killed. That is especially true for health information, where everyone from marketers to data brokers insists that easier access to more data will somehow make health care more efficient. But how many companies actually deserve that sort of trust when so few are willing to forgo any profitable use of sensitive information?

Rather than reveling in the confusion that exists for Americans or attacking laws that attempt to address this problem, U.S. companies should figure out how to turn compliance with the GDPR into a way to give people confidence that sensitive health information is protected and used ethically no matter the source. EU privacy rules can be an opportunity to smooth out our patchwork of requirements.

Joseph Jerome is a policy counsel for the Center for Democracy and Technology’s Privacy and Data Project.
