Privacy is paramount to us, in everything we do. So today, we are announcing a new initiative to develop a set of open standards to fundamentally enhance privacy on the web. We’re calling this a Privacy Sandbox.
Technology that publishers and advertisers use to make advertising more relevant to people is now being used far beyond its original design intent, to the point where some data practices don’t match up to user expectations for privacy. Recently, some other browsers have attempted to address this problem, but without an agreed-upon set of standards, attempts to improve user privacy are having unintended consequences.
First, large-scale blocking of cookies undermines people’s privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed, to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.
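To illustrate the technique described above, here is a minimal sketch of how a fingerprinting script can combine small, individually harmless signals into a stable cross-site identifier. The attribute names (`user_agent`, `fonts`, `screen`) are illustrative; real fingerprinting scripts harvest dozens of such signals.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine device/browser attributes into a stable identifier."""
    # Serialize attributes deterministically so the same device
    # always produces the same string, regardless of dict order.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    # Hash to a compact token that can be matched across websites.
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical signals collected from one visitor's browser.
device = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "fonts": "Arial,Helvetica,Noto Sans",
    "screen": "1920x1080x24",
}
print(fingerprint(device))
```

Because the identifier is recomputed from the device itself on every visit, clearing cookies does nothing to reset it, which is why fingerprinting sits outside the user’s control.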
Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web. Many publishers have been able to continue to invest in freely accessible content because they can be confident that their advertising will fund their costs. If this funding is cut, we are concerned that we will see much less accessible content for everyone. Recent studies have shown that when advertising is made less relevant by removing cookies, funding for publishers falls by 52% on average.[1]
So we are doing something different. We want to find a solution that both protects user privacy and helps content remain freely accessible on the web. At I/O, we announced a plan to improve the classification of cookies and give clarity and visibility to cookie settings, as well as plans to more aggressively block fingerprinting. We are making progress on this, and today we are providing more details on our plans to restrict fingerprinting. Collectively, we believe these changes will improve transparency, choice, and control.
But we can go further. Starting with today’s announcements, we will work with the web community to develop new standards that advance privacy, while continuing to support free access to content. Over the last couple of weeks, we’ve started sharing our preliminary ideas for a Privacy Sandbox: a secure environment for personalization that also protects user privacy. Some ideas include new approaches that keep ads relevant for users while minimizing the user data shared with websites and advertisers, for example by anonymously aggregating user information and keeping much more of it on-device only. Our goal is to create a set of standards that is more consistent with users’ expectations of privacy.
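The post does not specify how anonymous aggregation would work, but one classic way to aggregate user information without exposing any individual’s data is randomized response: each device perturbs its own answer locally before reporting, and only the population-level rate is recoverable. The function names and parameters below are hypothetical, purely to illustrate the principle.

```python
import random

def report_interest(interested: bool, p_truth: float = 0.75) -> bool:
    """Run on-device: report the true answer only with probability
    p_truth, otherwise report a coin flip. Any single report is
    plausibly deniable."""
    if random.random() < p_truth:
        return interested
    return random.random() < 0.5

def estimate_rate(reports: list, p_truth: float = 0.75) -> float:
    """Run by the aggregator: recover the population rate from the
    noisy reports. Individual answers stay hidden; only the
    aggregate is learned."""
    observed = sum(reports) / len(reports)
    # Invert the expectation: observed = p_truth * true + (1 - p_truth) * 0.5
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 10,000 users, 30% of whom are genuinely interested in a topic.
random.seed(42)
reports = [report_interest(i < 3000) for i in range(10000)]
print(round(estimate_rate(reports), 2))  # close to 0.30
```

The design choice here mirrors the stated goal: the raw signal never leaves the device in identifiable form, yet advertisers can still learn aggregate interest levels accurate enough to keep ads relevant.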
We are following the web standards process and seeking industry feedback on our initial ideas for the Privacy Sandbox. While Chrome can take action quickly in some areas (for instance, restrictions on fingerprinting), developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time. They require significant thought, debate, and input from many stakeholders, and generally take multiple years.
To move things forward as quickly as possible, we have documented the specific problems we are trying to solve together, and we are sharing a series of explainers with the web community. We have also summarized these ideas today on the Chromium blog.
We look forward to getting feedback on this approach from the web platform community, including other browsers, publishers, and their advertising partners. Thank you in advance for your help and input on this process - we believe that we must solve these problems together to ensure that the incredible benefits of the open, accessible web continue into the next generation of the internet.

[1] Google Ad Manager data; n=500 global publishers; analysis based on an A/B experiment where cookies are disabled on a randomly selected fraction of each publisher's traffic; May-August 2019. More information available on the Google ads blog.