Introduction to debug reports
Part 1 of 3 on debugging Attribution Reporting. Learn why debugging matters and when to use debug reports in testing.
This article is part of a series on debugging the Attribution Reporting API. The series covers client-side debugging for event-level reports and aggregatable reports. Server-side debugging guidance for summary reports, and a handbook for API integration, are available as separate resources.
Why you need debug reports
If you're testing the Attribution Reporting API, you need to:
- Check that your integration is working properly.
- Understand gaps in measurement results between your cookie-based implementation and your Attribution Reporting implementation.
- Troubleshoot any issues with your integration.
Debug reports are required to complete these tasks. Therefore, if you're participating in the origin trial, we strongly recommend you set up debug reports.
The reporting origin is the origin that sets the Attribution Reporting source and trigger headers. All reports generated by the browser are sent to this origin. In this guidance, we use https://adtech.example as the example reporting origin.
An attribution report (report for short) is the final report—event-level or aggregatable—that contains the measurement data you’ve requested.
A debug report contains additional data about an attribution report, or about a source or trigger event. Receiving a debug report does not necessarily mean that something is working incorrectly. There are two types of debug reports, success and verbose, described later in this article.
A transitional debug report is a debug report that requires a cookie to be set in order to be generated and sent. Transitional debug reports are unavailable when that cookie is not set, and will become unavailable once third-party cookies are deprecated. All debug reports described in this guide are transitional debug reports.
Key aspects of debug reports
Two types of debug reports
Two types of debug reports are available. Use both, as they fulfill different use cases.
Success debug reports track successful generation of an attribution report. They relate directly to an attribution report. Success debug reports have been available since Chrome 101 (April 2022).
Review example reports in Part 2: Set up debug reports.
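As a rough sketch of how success debug reports are enabled in Chrome at the time of writing (all values below are placeholders; verify the details in Part 2), both the source and the trigger registration responses carry a debug key:

```http
Attribution-Reporting-Register-Source: {"source_event_id": "777", "destination": "https://advertiser.example", "debug_key": "647775351539539"}

Attribution-Reporting-Register-Trigger: {"event_trigger_data": [{"trigger_data": "3"}], "debug_key": "156477351396255"}
```

When both debug keys are present (and the debug cookie is set), the browser sends a success debug report alongside scheduling the corresponding attribution report.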
Verbose debug reports track missing reports and help you determine why they're missing. They indicate cases where the browser did not record a source or trigger event (which means it will not generate an attribution report), and cases where an attribution report can't be generated or sent for some reason. Verbose debug reports also include a type field that describes why a source event, trigger event, or attribution report was not generated.
Verbose debug reports are available starting in Chrome 109 (Stable in January 2023).
Review example reports in Part 2: Set up debug reports.
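Verbose debug reporting is opted into by adding `"debug_reporting": true` to the source or trigger registration header. The report itself is a JSON list where each element carries a type and a body. A hedged sketch (the body fields shown here are illustrative and vary by type; see Part 2 for real examples):

```json
[
  {
    "type": "trigger-no-matching-source",
    "body": {
      "attribution_destination": "https://advertiser.example",
      "source_site": "https://publisher.example"
    }
  }
]
```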
Debug reports are cookie-based
Generating a debug report requires a cookie set by the origin configured to receive reports. If that origin is a third party, this cookie is a third-party cookie. This has a few key implications:
- Debug reports are only generated if third-party cookies are allowed in the user's browser.
- Debug reports will no longer be available after third-party cookies are phased out.
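In Chrome's implementation at the time of writing, the cookie in question is named ar_debug and must be set by the reporting origin. A sketch of the response header (verify the exact requirements against current Chrome documentation):

```http
Set-Cookie: ar_debug=1; SameSite=None; Secure; HttpOnly; Path=/
```

The SameSite=None and Secure attributes matter here because the cookie is typically read in a third-party context.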
Debug reports are sent immediately
Success debug reports are generated and sent as soon as the corresponding attribution report is generated: that is, on trigger registration.
Verbose debug reports are sent immediately upon source or trigger registration.
Debug reports have different endpoint paths
- Endpoint for success debug reports, event-level
- Endpoint for success debug reports, aggregatable
- Endpoint for verbose debug reports, event-level and aggregatable
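At the time of writing, Chrome sends these reports to well-known paths on the reporting origin. Double-check the paths against current documentation before relying on them:

```text
# Success debug reports, event-level
https://adtech.example/.well-known/attribution-reporting/debug/report-event-attribution

# Success debug reports, aggregatable
https://adtech.example/.well-known/attribution-reporting/debug/report-aggregate-attribution

# Verbose debug reports, event-level and aggregatable
https://adtech.example/.well-known/attribution-reporting/debug/verbose
```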
Learn more in Part 2: Set up debug reports.
Basic real-time integration check
Debug reports are sent to your endpoint immediately, unlike attribution reports which are delayed to protect user privacy. Use debug reports as a real-time signal that your integration with the Attribution Reporting API is working.
Learn how to do this in Part 3: Debugging cookbook.
Unlike third-party cookies, the Attribution Reporting API includes built-in privacy protections that are designed to strike a balance between utility and privacy. This means that with the Attribution Reporting API, you might not be able to collect all of the measurement data that you currently collect with cookies. Not all the conversions that you can track with third-party cookies will generate an attribution report.
One example: for event-level reports, you can register at most one conversion per impression. This means that for a given ad impression, you will only get one attribution report, no matter how many times the user converts.
Use debug reports to gain visibility into the differences between your cookie-based measurement results and the results you get with the Attribution Reporting API. Pinpoint which conversions are reported, how many are not, and which ones are missing and why.
Running a loss analysis means using debug reports to quantify differences between measurement results obtained with cookies and measurement results obtained with Attribution Reporting. A detailed loss analysis also identifies the causes for these differences.
Learn how to run a loss analysis in Part 3: Debugging cookbook.
While loss caused by privacy or resource protections is expected, other loss may be unintended. Misconfigurations in your implementation or bugs in the browser itself can cause reports to go missing.
You can use debug reports to detect and fix an implementation issue on your side, or to report a potential bug to browser teams. Learn how to do this in Part 3: Debugging cookbook.
Advanced configuration check
Some features of the Attribution Reporting API allow you to customize the API's behaviors. Filtering rules, deduplication rules and priority rules are some examples.
When using these features, use debug reports to check that your logic leads to the intended behavior in production, without waiting for attribution reports. Learn how to do this in Part 3: Debugging cookbook.
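For example, a deduplication rule lets a trigger set a deduplication key so that repeated conversions with the same key produce only one report. A sketch of such a trigger registration (the key value is a placeholder):

```http
Attribution-Reporting-Register-Trigger: {"event_trigger_data": [{"trigger_data": "1", "deduplication_key": "2345698765"}]}
```

With verbose debug reports enabled, a deduplicated trigger surfaces as a debug report rather than disappearing silently, which is what makes these configuration checks possible without waiting for attribution reports.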
Local testing with aggregatable reports
Unlike aggregatable attribution reports, whose payloads are encrypted, aggregatable debug reports include the payload in cleartext.
Use aggregatable debug reports to validate the contents of aggregatable reports, and to generate summary reports with the local aggregation tool for testing.
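Concretely, when debugging is enabled, the aggregatable debug report carries the cleartext histogram contributions alongside the encrypted payload. A rough sketch of the relevant fragment (field shapes may differ across browser versions; the placeholder values are not real):

```json
{
  "aggregation_service_payloads": [
    {
      "payload": "<encrypted payload>",
      "key_id": "<aggregation key id>",
      "debug_cleartext_payload": "<base64-encoded cleartext of the payload>"
    }
  ]
}
```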