Over the past year, debates about what, if anything, policymakers should do to protect data privacy have become increasingly fervent. Tech scandals and legislation like the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) have drawn attention from policymakers and the media, and it’s easy to get caught up in the feeling that something must be done. Still, it is important to examine the potential implications of privacy rules before crafting a response to a perceived crisis.
I recently analyzed several federal data privacy proposals from the last year and have also discussed the concerning potential impact of state data privacy proposals. These proposals are typically well-intentioned, and policymakers believe they are responding to the concerns of their constituents as consumers. However, rather than rush to make policy that could change the permissionless nature of the internet, policymakers should take a step back and ask themselves a few questions.
1. What problem are we actually trying to address and does this policy address it?
Too often, the data privacy debate gets mixed together with everyone’s fears about technology or grievances with “Big Tech.” Even within the privacy debate itself, it is not always clear what the underlying problems truly are, or that the proposed solutions would actually address them. In the rush to prevent the next scandal, policies may either fail to address the actual concerns or define covered entities, data, or personal information so broadly that they could impact virtually every sector of the American economy.
For example, many of the concerns involve the security of sensitive information such as credit card numbers or medical information. The concern is not in the initial transaction of that information, but rather in the data breach or lax data security that may expose it. Addressing broader data privacy does not always solve these more specific data security problems. As I mention below, existing laws already address many of these more sensitive areas.
2. What tradeoffs would this involve for consumers, current companies, and future innovation?
Oftentimes, debates about data privacy get caught up in the broader “techlash” conversation. Many seem to see increasing data privacy requirements as a way to regulate big players like Facebook, Twitter, and Google.
But such “techlash” is not clearly felt by most Americans and many enjoy exploring a variety of online options. Requiring certain data handling practices could eliminate options that consumers currently enjoy as well as future, better options. Data affects much more of the American economy than just tech giants, and many small businesses have expressed concerns about the potential impact of laws like the CCPA. Creating a restrictive regulatory environment could prevent new competitors from launching and providing new services with different privacy options. America’s typically permissionless approach in this area is what has allowed new products that change our lives to start in dorm rooms or garages.
The GDPR provides an example of how ex ante data regulation can affect small companies and new competitors. Several small companies, from online games to certain newspapers, chose to make their products unavailable in Europe rather than undertake costly compliance or change their service offerings. When it comes to targeted advertising, large companies like Google and Facebook have been able to retain or even gain market share while smaller players shrank.
The potential tradeoffs are not just to innovation or options but potentially include rights such as free speech. For example, a “right to be forgotten” would be difficult to square with American notions of free speech, as it would inevitably require silencing one person or entity’s speech in favor of another’s right to have such information removed. Policymakers should carefully consider how such rights might come into friction before assuming that a right to privacy should trump other potential concerns.
3. Do existing policies address the concerns or could they be updated to do so?
Contrary to what some might say, the United States is not without any laws governing data privacy. Much of the most sensitive personal information is already covered by specialized law. Financial information, health information, and children’s privacy are already covered by federal laws that govern data.
But it is not just federal laws that currently govern data privacy and related issues. The Federal Trade Commission (FTC) has also used its existing authority to combat unfair and deceptive trade practices to handle data privacy and data security concerns. Since its first case against Geocities in the late 1990s, the FTC has brought over 65 cases involving personal consumer data.
When it comes to concerns about data security and data breaches, states have also helped consumers make more informed decisions. Every state has a data breach notification law. While these laws vary in timeline and what information is covered, existing law in every state provides a framework for handling such concerns.
4. What public entities are best suited to respond to these concerns?
States, Congress, and administrative agencies like the FTC are all grappling with what they should do to address concerns about data privacy. While there’s immense pressure to “do something,” policymakers should also consider the role their particular part of government ought to play.
Beginning with California, several states have passed or considered their own data privacy laws. But splintering the internet into different regulatory regimes could harm both current and future businesses on and offline and limit the options consumers have. Such laws would also likely raise constitutional concerns under the Dormant Commerce Clause.
In the US, the legal right to privacy is generally understood in relation to government bodies, as an outflow from our protections against things like unreasonable search and seizure and self-incrimination. Many of the data problems over which technology commentators agonize have well-developed bodies of analogous precedent in common law.
Another concern is how and when privacy rulemaking should be delegated to administrative agencies. A recent GAO report examined the advantages and disadvantages of delegating data privacy to the FTC, the Federal Communications Commission (FCC), or a new agency entirely. My colleague Andrea O’Sullivan and I discussed this report and concluded that with its experience and consumer welfare focus the FTC is likely the best option.
In any case, if Congress chooses to delegate data privacy issues to an agency, it should be specific in how it does so. Since digital data affects practically every industry from agriculture to the internet, an overly broad grant of authority could fuel a massive expansion of the administrative state, producing a heavy-handed regulatory scheme written by an agency rather than a legislature.
Clearly, these are not the only questions or concerns that policymakers should consider, but hopefully they will encourage a thoughtful examination of the issues. The decisions made about data privacy will likely shape the future of many industries and should not be rushed into under duress.