IAB State of Data 2026: 75% of Marketers Say Ad Measurement Is Broken, AI Is the Fix

The IAB's annual State of Data report surveyed 400+ buy-side leaders and found that up to 75% believe attribution, incrementality, and MMM fall short on rigor, timeliness, and trust. The industry body estimates AI-driven measurement improvements could unlock $26.3 billion in reallocated media spend.

By Marcus Rivera · 7 min read

The advertising industry has spent years building increasingly sophisticated measurement stacks — attribution models, incrementality tests, marketing mix models — and according to a landmark IAB report released in February 2026, most of them are not working well enough to justify the investment decisions they inform.

The IAB State of Data 2026 report, subtitled "The AI-Powered Measurement Transformation," surveyed more than 400 senior decision-makers at brands and agencies with planning or analytics expertise. The headline finding is stark: up to 75% of buy-side leaders say that core measurement approaches — attribution analysis, incrementality testing, and marketing mix modeling — underperform on rigor, timeliness, trust, and efficiency.

That is not a fringe complaint. Between 67% and 76% of respondents currently use at least one of these three methodologies. They have adopted the tools. The tools are just not delivering.

The Channel Representation Problem

One of the report's most specific and damaging findings concerns marketing mix models. Not a single respondent believes that all paid channels are well represented in their MMM today. The gaps are predictable but severe: 77% say gaming is underrepresented, roughly half acknowledge that commerce media and the creator economy are overlooked, 46% flag traditional media, and 41% say connected TV is missing from their models.

This is not an abstract modeling problem. When channels are absent from an MMM, the model cannot attribute value to them, which means it systematically overvalues the channels it does include. As PPC Land reported, the result is billions in misallocated spending — dollars flowing to channels that look effective in incomplete models while genuinely incremental channels go unfunded.
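The mechanism at work here is classic omitted-variable bias. A minimal sketch with made-up numbers (hypothetical data, not from the report): two channels both drive sales, but when one is dropped from the regression, part of its credit leaks into the channel that remains.

```python
# Toy illustration of omitted-variable bias in a marketing mix model.
# All spend and sales figures are synthetic; the point is the mechanism,
# not the magnitudes.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Weekly spend on two channels. Co-planned media flights are often
# correlated, which is what lets the bias flow between channels.
search = rng.normal(100, 20, n)
ctv = 0.5 * search + rng.normal(50, 10, n)

# True data-generating process: both channels contribute equally
# (true coefficient 1.0 each), plus noise.
sales = 1.0 * search + 1.0 * ctv + rng.normal(0, 10, n)

def ols(X, y):
    """Least-squares coefficients for y ~ X, intercept dropped from output."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

full = ols(np.column_stack([search, ctv]), sales)  # both channels modeled
partial = ols(search.reshape(-1, 1), sales)        # CTV omitted

print(f"full model:  search={full[0]:.2f}, ctv={full[1]:.2f}")
print(f"CTV omitted: search={partial[0]:.2f}")
```

With CTV omitted, the search coefficient absorbs roughly half of CTV's true effect (about 1.5 instead of 1.0 in this setup), so the model recommends shifting budget toward search even though CTV is equally incremental.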

The IAB puts a number on the cost of this dysfunction: $26.3 billion in total ad spend that could be reallocated to underrepresented channels if AI-enhanced measurement closed the gaps. An additional $6.2 billion in productivity value could be unlocked by shifting planning time from data preparation to actual strategic analysis.

AI Adoption: Moving Fast, Governing Slowly

The buy-side is not waiting for the measurement ecosystem to fix itself. According to the IAB's findings, approximately half of respondents are already scaling AI within their current measurement frameworks. Among those not yet scaling, more than 70% expect to begin within one to two years. Only 6% said they have no plans to adopt AI in measurement.

But the adoption curve has a telling asymmetry: analytics teams lead at 69% scaling AI, compared to just 30% of planning teams. Current AI use concentrates heavily on data preparation — cleaning, stitching, and normalizing inputs — rather than the higher-value applications of designing incrementality tests, building attribution models, or running scenario analyses. The implication is that AI is mostly being used to fix the plumbing, not to improve the architecture.

As The Measure reported, the IAB warns that AI without proper governance risks reinforcing the same black-box decision-making that already plagues measurement. The technology can "unify data, automate analysis, and increase measurement speed and frequency," but it can also amplify errors in opaque, unauditable ways if data quality and transparency standards are not in place.

The Governance Gap

The report surfaces a troubling disconnect between AI enthusiasm and AI readiness. Roughly half of respondents cite legal and compliance concerns (51%), AI accuracy and transparency issues (49%), and data security risks (49%) as significant or critical obstacles. Yet fewer than 40% say they currently have or plan to implement solutions to address these risks.

This governance gap has a contractual dimension. AI-related clauses now appear in approximately 40% of brand-agency and partner contracts, a figure the report projects will reach 70-80% within one to two years. The shift suggests that brands are increasingly unwilling to trust AI-driven measurement outputs without contractual accountability mechanisms — audit rights, accuracy guarantees, data governance provisions — baked into their vendor agreements.

Dstillery CEO Michael Beebe, whose company sponsored the report, framed it as an opportunity: measurement improvements create "a flywheel of greater scale, media efficiency, and performance." But the flywheel only works if the data feeding the AI is clean and the models are transparent enough to be challenged.

What This Means for Measurement Teams

The State of Data 2026 report lands at a moment when the measurement industry is already in flux. The IAB used the report's findings to justify the launch of Project Eidos, its ambitious cross-industry initiative to standardize measurement frameworks. Google's Meridian has made open-source MMM accessible. Meta is overhauling its attribution methodology. And incrementality testing has crossed the mainstream adoption threshold.

Against that backdrop, the report's implications for day-to-day measurement work are concrete:

  • Audit your MMM inputs. If your model does not include CTV, commerce media, gaming, or creator spend, it is producing misleading output. The 77% figure on gaming alone should alarm any brand with a meaningful gaming audience.
  • Move AI beyond data prep. Using machine learning to clean and stitch data is table stakes. The competitive advantage comes from applying AI to incrementality test design, causal inference, and real-time model calibration — areas where fewer than a third of teams are currently active.
  • Get governance ahead of adoption. If half the industry is scaling AI in measurement but fewer than 40% have governance frameworks, the gap will produce a wave of measurement failures. Establish audit protocols, accuracy benchmarks, and data lineage tracking before scaling, not after.
  • Watch the contract language. As AI clauses become standard in agency-vendor agreements, measurement teams will need to understand what commitments they are making about model accuracy, data handling, and transparency. This is an area where measurement operations and legal are converging fast.
The $32.5 Billion Question

The combined $26.3 billion in reallocated media spend and $6.2 billion in productivity gains represents the IAB's estimate of what the industry leaves on the table by tolerating broken measurement. Whether AI can actually capture that value depends on factors the report identifies but cannot resolve: data quality across walled gardens, cross-platform identity resolution, privacy-compliant signal recovery, and the willingness of platforms to open their measurement black boxes.

The State of Data 2026 report is useful precisely because it quantifies the cost of inaction. Three-quarters of the buy-side knows its measurement is inadequate. Half are already deploying AI to fix it. The question is whether the industry can build the governance and standards infrastructure fast enough to ensure AI makes measurement better rather than just faster at being wrong.