Privacy Budget

Limit the amount of individual user data exposed to sites to prevent covert tracking.

Implementation status

This document outlines a new proposal for preventing covert tracking: the Privacy Budget.

Why do we need this proposal?

As browsers continue to change how cookies are treated, some user-tracking efforts have moved to harder-to-detect methods which subvert cookie controls. These methods, known as fingerprinting, rely on a variety of techniques to uniquely identify a browser, and are hidden from users.

The Privacy Budget proposal suggests limiting the amount of individual user data that can be exposed to sites, so that in total it is insufficient to track and identify individuals. This requires quantifying how much information users share with third parties, which may be determined through:

  • K-anonymity: a property of anonymized data where each individual is indistinguishable from at least k−1 others, with "k" the number of users who share identical information
  • Entropy: a measure from information theory that quantifies how much identifying information a value reveals; the more entropy exposed, the easier it is to single out a user (see the sketch after this list)
  • Differential privacy: a technique for sharing aggregated data while ensuring that no individual's data can be inferred from the aggregate
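
As a minimal sketch of the first two metrics, the following TypeScript computes k-anonymity and Shannon entropy for a single fingerprinting surface from the values observed across a population. The surface name and sample values are invented for illustration; browsers have not published how they would measure this.

```ts
// Count how many users report each distinct value of a surface.
function countValues(observations: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const value of observations) {
    counts.set(value, (counts.get(value) ?? 0) + 1);
  }
  return counts;
}

// k-anonymity: the number of users (including this one) who share the
// user's exact value. Higher k means the user blends into a larger crowd.
function kAnonymity(observations: string[], userValue: string): number {
  return countValues(observations).get(userValue) ?? 0;
}

// Shannon entropy in bits: the average identifying information the surface
// reveals. A surface where every user reports the same value yields 0 bits.
function entropyBits(observations: string[]): number {
  const total = observations.length;
  let bits = 0;
  for (const count of countValues(observations).values()) {
    const p = count / total;
    bits -= p * Math.log2(p);
  }
  return bits;
}

// Example: a hypothetical screen-width surface observed for four users.
const screenWidths = ['1920', '1920', '1366', '2560'];
console.log(kAnonymity(screenWidths, '1920')); // 2: this user plus one other
console.log(entropyBits(screenWidths).toFixed(2)); // "1.50" bits
```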

The maximum tolerable amount of information revealed about each user is the privacy budget. The fewer fingerprinting surfaces available to a site, and the lower the granularity of the information revealed, the lower the possibility of identifying any single user.
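
For a sense of scale, uniquely identifying one person among roughly 8 billion requires only about log2(8 × 10⁹) ≈ 33 bits of information, so a handful of high-entropy surfaces can be enough to fingerprint a user.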

Measure fingerprinting data

The success of the Privacy Budget proposal relies on browsers estimating the information revealed by each fingerprint surface. Browsers will also need to measure the total information exposed to a site. These measurements will need to be reported back to a single service.
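
To make the accounting concrete, here is a minimal sketch of how a per-site ledger might accumulate the bits exposed by each surface a site reads. The surface names, bit estimates, and ExposureLedger class are assumptions for illustration, not measured data or a real browser API.

```ts
// Illustrative per-surface estimates, in bits. These numbers are invented.
const SURFACE_BITS: Record<string, number> = {
  'screen-resolution': 4.8,
  'timezone': 3.0,
  'installed-fonts': 13.9,
};

class ExposureLedger {
  private surfacesSeen = new Set<string>();
  private totalBits = 0;

  // Record a surface access; each surface is charged at most once, since
  // re-reading the same value reveals no new information.
  record(surface: string): void {
    if (this.surfacesSeen.has(surface)) return;
    this.surfacesSeen.add(surface);
    this.totalBits += SURFACE_BITS[surface] ?? 0;
  }

  exposedBits(): number {
    return this.totalBits;
  }
}

const ledger = new ExposureLedger();
ledger.record('screen-resolution');
ledger.record('timezone');
ledger.record('timezone'); // charged only once
console.log(ledger.exposedBits()); // 7.8
```

Note that simply summing per-surface bits assumes the surfaces are statistically independent; correlated surfaces would overcount, which is one reason measurement remains an open problem.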

There are a number of possible ways to measure this data, and Chrome is actively exploring solutions.

Reduce total information exposed to sites

Once the total information exposed is measured across the web, we expect to analyze API surfaces to prioritize which information sites genuinely need and which does not need to be shared.

In accounting for the privacy budget, data revealed by passive fingerprinting would be assumed to be used by a site. It's important that passive fingerprinting surfaces are reduced, as achieved by User-Agent reduction and as proposed by IP Protection.
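
As a hedged sketch of that assumption, the snippet below charges passive surfaces to the budget at page load, whether or not the site's scripts ever read them. The surface names and bit values are invented.

```ts
// Invented surface names and bit estimates, for illustration only.
type SurfaceKind = 'passive' | 'active';

interface Surface {
  name: string;
  kind: SurfaceKind;
  bits: number;
}

const SURFACES: Surface[] = [
  { name: 'user-agent', kind: 'passive', bits: 5.0 },
  { name: 'ip-address', kind: 'passive', bits: 10.0 },
  { name: 'screen-resolution', kind: 'active', bits: 4.8 },
];

// Passive surfaces are sent with every request, so the site receives them
// whether or not it asks: charge them to the budget as soon as the page loads.
function passiveBitsAtLoad(surfaces: Surface[]): number {
  return surfaces
    .filter((s) => s.kind === 'passive')
    .reduce((sum, s) => sum + s.bits, 0);
}

console.log(passiveBitsAtLoad(SURFACES)); // 15: charged before any script runs
```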

How could a privacy budget be enforced?

Once the average site is reduced to accessing a reasonable amount of data, a budget could be meaningfully enforced by the browser. The Privacy Budget proposal suggests that above a set threshold, the budget could be enforced in a number of ways. For example:

  • API calls which violate the budget could cause an error;
  • If possible, API calls could be replaced with a privacy-preserving call which returns noised results, or generic results not tied to a single user (both options are sketched after this list);
  • Storage and network requests could be declined, so that the site cannot exfiltrate new information.
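
The sketch below illustrates the first two options under stated assumptions: BUDGET_BITS, the per-call costs, and guardedRead are invented for illustration, and no browser implements anything like this today.

```ts
// BUDGET_BITS, the per-call costs, and guardedRead are all invented.
const BUDGET_BITS = 20;
let spentBits = 0;

type Enforcement = 'error' | 'generic';

// Wrap a fingerprintable read: charge its cost while the budget allows,
// otherwise either throw (option 1) or return a generic value that is not
// tied to a single user (option 2).
function guardedRead<T>(
  costBits: number,
  read: () => T,
  genericValue: T,
  mode: Enforcement,
): T {
  if (spentBits + costBits <= BUDGET_BITS) {
    spentBits += costBits;
    return read(); // within budget: return the real value
  }
  if (mode === 'error') {
    throw new Error('Privacy budget exceeded');
  }
  return genericValue;
}

// Example: reading screen width, falling back to a very common value.
const width = guardedRead(4.8, () => screen.width, 1920, 'generic');
```

Returning a common generic value (here, a widespread screen width) keeps the API usable while revealing nothing that distinguishes this user.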

Exceptions to the budget

Some applications, such as 3D games and video conferencing, may never be able to run within a reasonable privacy budget. There are some options, including a permissions prompt for users, which could allow those applications to run. How these exceptions will be handled is open to discussion.

When will the Privacy Budget be available?

The earliest date of scaled availability, that is, the earliest date when the Privacy Budget could be enforced, will not be before 2024.

At this time, the Privacy Budget is a proposal and has not been implemented in any browser.

Engage and share feedback

The Privacy Budget proposal is under active discussion and subject to change.