Microsoft’s Multi-party Computation of Ads on the Web (Macaw) aims to improve the accuracy of publisher inventory monetization and advertiser engagement without requiring participants to share confidential information (e.g., “bid models, model parameters, or auction functions with the browser service”), while balancing this against the “granularity of contextual signals and reducing the granularity of user features to make it difficult for any participant in the system to map a specific ad request to a specific user.”
Macaw adopts Google’s Privacy Sandbox goal of disintermediating publishers from the marketers that fund their digital properties.
Macaw builds on Parakeet’s trusted-server model, which Fledge also requires. Unlike Fledge, Macaw allows real-time computation over user and current-context signals to influence bidding logic. This improved accuracy lets marketers benefit from brand-safety rules, and also enables publisher “quality” filters on the ads that will appear on their digital properties. Unlike Sparrow or Parakeet, which merely proxy the computation logic or the information in a real-time ad request, Macaw relies on multi-party computation to push the processing to each supply-chain vendor chosen by the digital marketplace of publishers and marketers.
Macaw sits between DSPs and SSPs to remove any user identifiers and to compute over the responses received from both the marketers’ DSPs and the publishers’ SSPs for a given ad request. Macaw relies on Parakeet for the first step: removing the identifier and passing the context and user attributes through to DSPs so they can calculate eligible ads and bids. However, instead of the bid response being shared directly in that first step, a second step encodes the information and sends the encoded information to the same recipients. The recipients then return DSP bid-response data, and optional SSP filtering logic, for the encoded information to the same service.
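The two-step flow described above can be sketched as follows. This is an illustrative assumption of the shape of the protocol, not the actual Macaw implementation; all type names, field names, and the encoding scheme are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AdRequest:
    user_id: str          # removed before anything leaves the browser service
    context: dict         # contextual signals (e.g., page category)
    user_features: dict   # coarsened user attributes

def step_one(request: AdRequest,
             dsps: list[Callable[[dict, dict], dict]]) -> list[dict]:
    """Step 1 (Parakeet-style): strip the identifier, then forward context
    and user attributes so each DSP can compute eligible ads and bids."""
    anonymized = {"context": request.context,
                  "user_features": request.user_features}
    return [dsp(anonymized["context"], anonymized["user_features"])
            for dsp in dsps]

def step_two(bids: list[dict],
             encode: Callable[[dict], bytes],
             ssp_filter: Callable[[bytes], bool]) -> list[bytes]:
    """Step 2: encode each bid response before returning it, so the SSP's
    optional quality filter operates on encoded data rather than on a
    response it can link to a specific user."""
    encoded = [encode(bid) for bid in bids]
    return [e for e in encoded if ssp_filter(e)]
```

The point of the split is that no single party sees both the raw user attributes and the final, linkable bid response.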
Because the browser service can link the bid request to the winning ad (e.g., to generate Aggregate Reporting API output), it is unclear how DSPs can ensure the browser service is not reverse engineering the confidential model parameters associated with the features that influence the value of a given ad.
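To make the concern concrete: if an observer can pair feature vectors with winning bids across enough requests, a simple linear bid model's confidential weights fall out of a system of equations. The bid model, features, and weights below are invented purely for illustration.

```python
# Toy demonstration: recover the secret weights of a linear bid model
# bid = w1 * f1 + w2 * f2 from two linearly independent observations
# of (feature values, winning bid). Real bid models are more complex,
# but more observations similarly shrink the space of consistent models.

def recover_weights(observations):
    """Solve w1*x1 + w2*x2 = bid given two (x1, x2, bid) observations."""
    (x1a, x2a, ba), (x1b, x2b, bb) = observations
    det = x1a * x2b - x1b * x2a
    w1 = (ba * x2b - bb * x2a) / det
    w2 = (x1a * bb - x1b * ba) / det
    return w1, w2

# Hypothetical secret DSP model: bid = 0.8 * recency + 1.2 * intent
secret_bid = lambda recency, intent: 0.8 * recency + 1.2 * intent
obs = [(1.0, 0.0, secret_bid(1.0, 0.0)),
       (0.0, 1.0, secret_bid(0.0, 1.0))]
print(recover_weights(obs))  # prints (0.8, 1.2)
```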
Given the incremental cost and latency overhead of multi-party computation, there is a natural limit on the number of participants that can take part in a given auction. This will reduce competition and impair both advertiser performance and publisher revenues.
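A back-of-envelope calculation shows how the latency budget caps participation. The numbers here are assumptions for illustration only, not measured Macaw figures.

```python
# Sketch: if an ad request must resolve within a fixed latency budget and
# each additional MPC participant adds roughly fixed per-party overhead,
# the budget bounds how many DSPs/SSPs can join the auction.

def max_participants(budget_ms: float, base_ms: float,
                     per_party_ms: float) -> int:
    """Largest n such that base_ms + n * per_party_ms <= budget_ms."""
    return int((budget_ms - base_ms) // per_party_ms)

# Assumed: 100 ms total budget, 40 ms fixed cost, 5 ms per added party.
print(max_participants(budget_ms=100, base_ms=40, per_party_ms=5))  # prints 12
```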
- What is the process for determining which organizations' servers are trusted?
- What is the time-delay in providing the data necessary for publishers to optimize their revenue?
- What is the time-delay in providing the data necessary for marketers to optimize their media spend?