Conversion Measurement API

Overview

Google's Conversion Measurement API is designed to provide marketers with last-click attribution.[1]

Under the current proposal, Google adds noise to the reported data 5% of the time, i.e., random values are reported instead of the actual conversion data.[2]
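A minimal sketch of how such a noise mechanism might behave. The 5% probability comes from the proposal above; the 3-bit metadata space and all names here are illustrative assumptions, not the API's actual schema.

```python
import random

# Illustrative assumption: with 5% probability the browser substitutes a
# random value for the true conversion metadata. The 3-bit value space
# (0-7) is a simplification for this sketch, not the published spec.
NOISE_PROBABILITY = 0.05
METADATA_VALUES = 8

def report_conversion(true_value: int, rng: random.Random) -> int:
    """Return the value the browser would report for one conversion."""
    if rng.random() < NOISE_PROBABILITY:
        return rng.randrange(METADATA_VALUES)  # random substitute value
    return true_value

# Over many events, roughly 5% of reports carry random data
# (slightly less differs from the true value, since the random
# substitute can coincide with it).
rng = random.Random(42)
reports = [report_conversion(3, rng) for _ in range(10_000)]
noised_fraction = sum(1 for r in reports if r != 3) / len(reports)
```

Because the noise rate is known, an aggregator working with large event counts can partially correct for it statistically, but individual reports remain unreliable.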

The API will send time-delayed feedback associated with each conversion event, which Google suggests should be multiple days after the true event.[3] Google will provide the data a minimum of 48 hours after the true event occurs.[4]
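The delay floor described above can be sketched as follows; the 48-hour minimum is from the cited proposal, while the function names and any schedule beyond that floor are assumptions for illustration.

```python
import datetime

# The 48-hour minimum delay is taken from the proposal cited above;
# everything else in this sketch is an illustrative assumption.
MIN_DELAY = datetime.timedelta(hours=48)

def earliest_report_time(event_time: datetime.datetime) -> datetime.datetime:
    """Earliest moment a conversion report may be sent."""
    return event_time + MIN_DELAY

def may_report(event_time: datetime.datetime, now: datetime.datetime) -> bool:
    """True once the mandatory delay has elapsed."""
    return now - event_time >= MIN_DELAY
```

For marketers, this means intraday optimization decisions cannot rely on same-day conversion feedback.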

Google will send the reporting data only if a sufficient number of browsers also perform the same conversion event.[5] Thus for conversion events associated with rare products or services (e.g., mid-tail retail catalog items, home listings), Google will not send this data until a sufficient number of events occur.
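The threshold-gated reporting described above can be sketched as a buffer that withholds an event's data until enough distinct browsers have reported it. Google has not published the exact threshold, so the class design and the threshold value below are assumptions.

```python
from collections import defaultdict

class ThresholdAggregator:
    """Illustrative sketch: buffer conversion events per key and release
    counts only once enough distinct browsers have reported the same
    event. The threshold value is an assumption, not a published number."""

    def __init__(self, min_browsers: int = 100):
        self.min_browsers = min_browsers
        self.pending = defaultdict(set)  # event key -> set of browser ids

    def record(self, event_key: str, browser_id: str):
        """Record one event; return the count once the threshold is met,
        otherwise None (the data is withheld)."""
        self.pending[event_key].add(browser_id)
        count = len(self.pending[event_key])
        return count if count >= self.min_browsers else None
```

Under such a scheme, conversions for rare catalog items may sit below the threshold indefinitely and never be reported.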

Impact

While Google's Conversion Measurement API gives Google's software access to granular cross-site event information to match conversions on a marketer's property to a prior click event, it lacks access to the other events available via Google's Aggregate Reporting API and therefore cannot support multi-touch or view-through attribution. Thus, all other ad clicks or views before the conversion are given no credit. By eliminating view-through attribution, marketers will reduce the value they currently associate with publishers on which they run upper-funnel branding campaigns.

Acknowledging this issue, Google recently proposed adjusting the API to allow its web client "to associate a priority with each attribution source" (i.e. event).[6] This adjustment would allow events not associated with a click to trigger attribution reporting. Google would also allow marketers to measure up to three conversions associated with prior exposures.[7] Google has not addressed the inaccurate-feedback or time-delay issues associated with this proposal.
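A minimal sketch of the proposed priority mechanism: each stored attribution source (click or view) carries a marketer-assigned priority, and the conversion is attributed to the highest-priority matching source. The field names and tie-breaking rule below are assumptions for illustration, not the API's actual schema.

```python
from dataclasses import dataclass

@dataclass
class AttributionSource:
    """Illustrative stand-in for a stored attribution source; field names
    are assumptions, not the API's real schema."""
    source_id: str
    event_type: str   # "click" or "view"
    priority: int     # marketer-assigned
    timestamp: float

def attribute(sources):
    """Pick the winning source: highest priority wins, with the most
    recent source breaking ties (tie-break rule is an assumption)."""
    if not sources:
        return None
    return max(sources, key=lambda s: (s.priority, s.timestamp))
```

A marketer could, for example, assign a view a higher priority than a click to obtain a form of view-through credit for a single exposure.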

Because noise is added to the reported information, it will impair not only the value marketers ascribe to publisher inventory, but also the accuracy of the models marketers use to optimize their budget allocations, bid pricing, and messaging. This will likely lead to a degraded user experience, as marketers will waste more media spend sending the wrong message to the wrong audience in the wrong context at the wrong time.

Moreover, the data from each browser is still sent to Google-controlled servers, which perform the aggregation prior to sending the information to publishers, marketers, and their supply-chain vendors.[8] Google's aggregation service would compute metrics such as reach (distinct users that viewed a given ad). Google is exploring solutions involving Multi-Party Computation and Differential Privacy.[9] However, the time delay and incremental processing cost may make this impractical for programmatic advertising use cases.
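A standard differentially private release of a reach count can be sketched with the Laplace mechanism: the server adds noise scaled to 1/epsilon before publishing the count (sensitivity is 1, since adding or removing one user changes the count by at most 1). The epsilon value and function names are assumptions; Google has not specified its mechanism.

```python
import math
import random

def noisy_reach(true_reach: int, epsilon: float, rng: random.Random) -> float:
    """Release a reach count with Laplace(1/epsilon) noise.
    Sensitivity 1: one user changes the count by at most 1.
    Epsilon here is an illustrative parameter, not a published value."""
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) by inverse transform from u in [-0.5, 0.5).
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_reach + noise

# Individual releases are perturbed, but the noise is zero-mean,
# so averages over many releases converge to the true value.
rng = random.Random(0)
samples = [noisy_reach(1000, epsilon=1.0, rng=rng) for _ in range(5000)]
```

Smaller epsilon values give stronger privacy but noisier counts, which is the trade-off behind the accuracy concerns discussed above.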

Note: Google has made no commitment that it will rely only on the Conversion Measurement API to measure and optimize its own web properties. To the contrary, Google suggests marketers should send their conversion data directly to Google (via Customer Match) to improve the measurement of advertising value across Google's owned and operated properties[10] and the advertising Google manages across rival publishers.[11]

Regulator Perspectives

The UK CMA noted (5.30-5.37) that if Google impairs the accuracy of the scaled measurement data necessary for "(a) publishers to sell ad inventory to advertisers in competition with Google’s ad inventory; and (b) ad tech providers to sell services to publishers and advertisers in the open display market in competition with Google’s ad tech service", this would harm competition.[12]

The CMA noted that the "extensive reach of Google’s user-facing services and its ability to connect data with greater precision (because of its large base of users logged into their Google account) provide Google with a significant data advantage over others."[13]

The CMA also noted that Google's marketing materials emphasize the value of accuracy and scale in the measurement and optimization of media across not only multiple contexts, but also people's use of multiple applications and devices: "What's more, richness and reach of data remain must-haves for reliable modelling. This means leveraging high-quality data with a comprehensive view across platforms, devices, browsers, and operating systems. Scale should be your top priority when evaluating the right measurement provider for modelling accuracy."[14]

The CMA concluded that there are overarching concerns that Google's "Privacy Sandbox tools will not be effective substitutes for the different forms of functionality provided by TPCs and other information deprecated by the Privacy Sandbox Proposals."[15]

Specifically, the CMA noted that Google's proposal would:

  1. " limit rival ad tech providers’ ability to demonstrate the effectiveness of their services to advertisers and optimise their campaign spend"
  2. impair marketers' ability "to understand which publishers provide better value."

Open Questions

  • What is the impact of the time delay in providing the data necessary for publishers to optimize their revenue?
  • What is the impact of the time delay in providing the data necessary for marketers to optimize their media spend?

References