A new report helps U.S. federal agencies protect your privacy. Companies should use it, too.

Last week the U.S. National Institute of Standards and Technology (a.k.a. “NIST,” for those who aren’t up on their acronyms) published a report that contains valuable advice for companies seeking to protect the sensitive data they collect. NIST aims to “advanc[e] measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.” Its report, An Introduction to Privacy Engineering and Risk Management in Federal Systems, is aimed at government agencies. But its guidance is useful for any entity that collects information, and it should be implemented across the board.

This is a boon for the private sector. Now, more than ever, it’s important for the companies that handle our most sensitive data to start implementing meaningful mechanisms to protect it.  

The central question of the report is simple: can we approach data privacy the same way we approach data security? The authors answer in the affirmative and propose a method for examining privacy risks and implementing protections that aims to be both scalable and testable.

So… what does the report teach?

For starters, when it comes to privacy, many lawyers and professionals focus only on “complying” with laws and regulations. If “cyber” is the overused word of the security world, “compliance” is its counterpart in privacy. Particularly in the U.S., many work on privacy only to the extent they are required to, whether by law (there is no blanket privacy law in the U.S., though there are laws in certain sectors) or by regulation (particularly the Federal Trade Commission’s authority over “unfair or deceptive” trade practices). The NIST report pushes government agencies to get past the “compliance” mindset and instead dig into the real privacy threats individuals face when their personal information is processed.

The report helps develop a common lexicon for discussing privacy threats and opportunities in an engineering frame, so that people can understand those threats and implement protections throughout a system’s “life cycle.” That shared vocabulary is necessary for organizations to fully embrace the “privacy by design” principle, because it lets product designers, engineers, and policy people speak the same language about what’s happening and what needs to change.

The most important piece of the report is its specific call for risk assessments to focus on threats to users. Adopting a user-centric risk approach is a true sea change from current practice, and it comes as a happy surprise to those of us familiar with the organizational risk approach spelled out in an earlier draft of the report, released in 2015.

Focusing on the user seems like common sense, but the norm has been to focus exclusively on the entity collecting data, not the person whose data is being collected. That meant considering users only by proxy, in the form of legal or reputational costs to the organization. This approach is wholly inadequate for capturing the wide range of threats we face when our data are collected and processed, and the damage breaches can cause (such as the emotional impact of having our personal photos revealed to the world).

Unfortunately, the NIST report also falls short in a couple of other places. First, and most importantly, it coins a strange new term: “Problematic Data Actions,” or “PDAs.” We’re meant to use the PDA concept in the privacy space the way we use “threats” in information security. But if we don’t outright call them threats, we minimize the serious implications of a compromise or loss of privacy. After all, in some instances, loss of privacy can get you killed. The report does recognize these implications, citing economic loss, discrimination, and physical harm. But those implications stem from threats, not “PDAs,” and we would do well to underscore that.
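To make the threat analogy concrete, here is a minimal sketch, in Python, of what treating PDAs as threats could look like in practice: a register that scores each data action by likelihood and impact, echoing the report’s framing of privacy risk as roughly the likelihood of a problematic data action times its impact. The class, the entries, and the scoring scales are entirely hypothetical illustrations, not anything the report prescribes.

```python
from dataclasses import dataclass

@dataclass
class PrivacyThreat:
    """One "problematic data action," treated the way infosec treats a threat."""
    data_action: str   # what the system does with personal data
    problem: str       # the harm it can create for individuals
    likelihood: float  # 0.0-1.0: estimated chance the harm occurs
    impact: float      # 0-10: severity for the affected individuals

    def risk(self) -> float:
        # Mirrors the report's framing: privacy risk as likelihood x impact,
        # assessed per data action rather than per organizational asset.
        return self.likelihood * self.impact

# Hypothetical register entries, for illustration only.
register = [
    PrivacyThreat("retain raw location traces indefinitely",
                  "re-identification and stalking", 0.3, 9.0),
    PrivacyThreat("share purchase history with ad partners",
                  "unanticipated profiling", 0.7, 4.0),
]

# Rank threats so the gravest are addressed first; when the risk is grave
# enough, the right mitigation may be to drop the data activity entirely.
for t in sorted(register, key=PrivacyThreat.risk, reverse=True):
    print(f"{t.risk():4.1f}  {t.data_action} -> {t.problem}")
```

The point of such a register isn’t precision in the numbers. It’s that framing these as threats, not “PDAs,” keeps the gravity of the harms in view and forces the question the report leaves open: what do you do when the risk is too high to proceed?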

The report also fails to establish that the implications of some threats (not PDAs) in data processing can be so grave that an organization should decline to engage in those activities at all. A privacy engineering process won’t live up to its promise without a way to limit data processing when necessary. Without guidance on limits, organizations are likely to fall back into the “compliance” hole described above, spending time thinking through the privacy implications for users but always moving forward with a plan so long as it isn’t unlawful, even when it’s bad for privacy.

So what happens now? Government agencies are setting out to implement the recommendations in this document, and it’s time for the private sector to step up in a big way for individual privacy. After all, new European data protection rules will require this kind of approach from any U.S. company that targets European users. Every day, more of our personal information ends up in corporate hands, and we need ways not only to limit the data companies collect but also to increase our control over what they keep and process. This report is a solid road map for tackling those increasingly urgent privacy issues. We call on all companies to rise to the occasion.