It seems that no week passes by without yet another revelation of how the most successful tech companies play with our data. When your business is driven by data, you are driven to ignore possible data leaks, to question the rights of your users to their own data, and to use double-speak whenever the question of data privacy arises.

Here are just a few recent examples: Representing Facebook, attorney Orin Snyder argued in court that Facebook users should not have any “reasonable expectation of privacy.” Owners of the popular Google Nest Cam Indoor home security camera found out they could be spied on in their own homes. Apple, which has claimed its seriousness about data privacy as a differentiator (after all, its business is driven more by devices than by data), was also found wanting: without iPhone users’ knowledge, their apps pass data to third parties, even while users sleep. The prevailing Silicon Valley mentality is still “damn the privacy whiners, full data grab ahead.”
While privacy regulation seeks to make tech companies better stewards of the data they collect and their practices more transparent, in the end it is a deception to think that users will have more “privacy.” For one thing, large tech companies have grown huge privacy compliance organizations to meet their new regulatory obligations.
Increasingly, this attitude stands in sharp contrast to what is happening around the world, where “stringent privacy regulations went mainstream in 2018” and “more will come in 2019.” So says Forrester in a new survey of data privacy rights and regulations in 61 countries.

In addition to the EU’s GDPR (which has already generated over €56 million in fines since its implementation in May 2018) and to California’s CCPA and Brazil’s LGPD—both passed last year and set to go into effect in 2020—more data privacy legislation is in the works, in other US states such as Massachusetts as well as in countries such as India and Japan. Across the EU, individual countries are legislating or implementing their own local tweaks to the GDPR, including additional provisions and requirements.
Forrester’s interactive global map provides information about each country’s data and privacy protections. In addition, by clicking on “view by,” you can compare specific categories across countries, including “scope of protection,” “data transfer to other countries,” and “government surveillance.” On this last category, Forrester says: “Government surveillance is a worldwide phenomenon that cuts across geographies, economic development, societal well-being, and institutional design. In fact, we recorded alarming levels of government surveillance in Austria, Colombia, India, Kuwait, and the UK, just to mention a few.” According to Forrester Senior Analyst Enza Iannopollo, “compliance with data protection regulation continues to be the top priority for firms globally. 35% of firms globally are ‘GDPR-ready,’ and new, stringent privacy regulations are being discussed and adopted every day.” The common principle driving these new regulations, says the Forrester report, is “the idea that individuals own their personal data and they should control it at all times, if they wish to do so.”
But given the combined forces of government snooping and data-driven companies’ business requirements, will it be enough to apply our traditional notions of privacy and individual property to data, or do we need to rethink what data needs to be protected? Do we need to redefine the scope of the problem?

Maciej Cegłowski recently suggested that we should focus the discussion—and regulation—on “ambient privacy.” He wrote:
The question we need to ask is not whether our data is safe, but why there is suddenly so much of it that needs protecting. The problem with the dragon, after all, is not its stockpile stewardship, but its appetite.
This requires us to talk about a different kind of privacy, one that we haven’t needed to give a name to before. For the purposes of this essay, I’ll call it ‘ambient privacy’—the understanding that there is value in having our everyday interactions with one another remain outside the reach of monitoring, and that the small details of our daily lives should pass by unremembered.

Because our laws frame privacy as an individual right, we don't have a mechanism for deciding whether we want to live in a surveillance society. Congress has remained silent on the matter, with both parties content to watch Silicon Valley make up its own rules. The large tech companies point to our willing use of their services as proof that people don't really care about their privacy. But this is like arguing that inmates are happy to be in jail because they use the prison library. Confronted with the reality of a monitored world, people make the rational decision to make the best of it.
That is not consent.
Ambient privacy, says Cegłowski, is particularly hard to protect where it extends into social and public spaces outside the reach of privacy law: “If I'm subjected to facial recognition at the airport, or tagged on social media at a little league game, or my public library installs an always-on Alexa microphone, no one is violating my legal rights. But a portion of my life has been brought under the magnifying glass of software.” Most importantly, “telling people that they own their data, and should decide what to do with it, is just another way of disempowering them.” The scope of automated data gathering is so large, and the monitoring so constant, that we need to ask—and answer—new questions about how machines and machine learning impact our lives. But, says Cegłowski, “that is not the conversation Facebook or Google want us to have. Their totalizing vision is of a world with no ambient privacy and strong data protections, dominated by the few companies that can manage to hoard information at a planetary scale. They correctly see the new round of privacy laws as a weapon to deploy against smaller rivals, further consolidating their control over the algorithmic panopticon.”
In short, more stringent privacy regulations may help stem the tide of surveillance a bit, but it’s the larger issue of data-driven business models—and data-driven government actions—that needs to be resolved.