Data & Analytics · March 13, 2026 · 7 min read

Why Ad Platform Data Is Often Inaccurate (And What to Use Instead)

Meta reports 2x your actual conversions. Google claims credit for organic sales. Here's why ad platform dashboards are unreliable — and how independent tracking gives you the real numbers.

Saud

Co-Founder, ClickPattern


The Numbers Don't Add Up

Every advertiser who has run campaigns across multiple platforms and compared the numbers has encountered this: the platforms report significantly more conversions than your tracker does. Meta claims 87 purchases. Google claims 64. Your tracker shows 91 total. The maths does not work, unless Meta and Google are each claiming credit for many of the same conversions, which is exactly what is happening.
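The double-counting implied by those figures can be made explicit with a few lines of arithmetic, using the example numbers above:

```python
# Figures from the example above: platform-claimed vs independently tracked.
meta_claimed = 87
google_claimed = 64
tracker_total = 91

claimed_total = meta_claimed + google_claimed       # what the platforms sum to
min_double_counted = claimed_total - tracker_total  # conversions claimed twice, at minimum

print(claimed_total)       # 151
print(min_double_counted)  # 60
```

At least 60 of the 151 platform-claimed conversions must have been credited by both Meta and Google, and the true overlap could be higher still if some tracked conversions came from channels neither platform touched.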

The discrepancy is not a bug. It is a predictable consequence of how ad platforms are structured. Their attribution systems are designed to maximise the number of conversions they can plausibly claim credit for, because the number of reported conversions directly affects how advertisers evaluate the platform's value and how much budget they allocate. Every platform is simultaneously the ad seller, the ad server, and the measurement system for those ads. That conflict of interest produces systematic over-reporting.

Understanding why this happens, mechanism by mechanism, is essential for any advertiser making real budget decisions. The practical conclusion that follows is that an independent tracker is the only neutral source of truth.

Attribution Window Differences

One of the most straightforward sources of discrepancy is attribution window misalignment. Different platforms use different default windows, and those windows determine which conversions get credited to which ads.

Meta's default attribution window is 7-day click and 1-day view. This means Meta will claim credit for any purchase that happens within 7 days of a click on one of your ads, or within 1 day of a user simply viewing one of your ads without clicking. Google Ads defaults to a 30-day click attribution window for most conversion types. Your third-party tracker typically uses last-click, real-time attribution. A conversion is attributed to the click that immediately preceded it, with no lookback window beyond the session.

The practical implication: a user who clicked your Meta ad on Monday, saw a Google ad on Thursday, and converted on Friday will be claimed by both platforms. Meta counts it because the purchase happened within 7 days of the click. Google counts it because it happened within 30 days of a click and the Google ad was the last platform touchpoint. Your tracker attributes it to whichever click it last saw, probably the Google click, or the direct session if the user typed the URL directly.

This overlap is structural and unavoidable when multiple platforms each run their own attribution logic independently. Neither platform is lying. They are each applying their own rules. But the sum of their claimed conversions will consistently exceed your actual conversion count by whatever proportion of your users touch multiple platforms before converting. In multi-platform campaigns, that proportion is often 30-50% of converting users.
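The Monday/Thursday/Friday journey above can be sketched in code. This is an illustrative model, not any platform's actual implementation: each platform applies its own click window independently, while a last-click tracker credits only the most recent click.

```python
from datetime import datetime, timedelta

# Hypothetical journey from the example: Meta click Monday,
# Google click Thursday, conversion Friday evening.
touchpoints = [
    {"platform": "meta",   "type": "click", "time": datetime(2026, 3, 2, 9)},
    {"platform": "google", "type": "click", "time": datetime(2026, 3, 5, 13)},
]
conversion_time = datetime(2026, 3, 6, 18)

# Default click windows in days, as described above.
click_windows = {"meta": 7, "google": 30}

def platforms_claiming(touchpoints, conversion_time):
    """Each platform claims the conversion if its own click falls inside
    its own lookback window -- independently of every other platform."""
    claims = set()
    for tp in touchpoints:
        window = timedelta(days=click_windows[tp["platform"]])
        if tp["type"] == "click" and conversion_time - tp["time"] <= window:
            claims.add(tp["platform"])
    return claims

def last_click(touchpoints, conversion_time):
    """A last-click tracker credits only the click nearest the conversion."""
    clicks = [tp for tp in touchpoints if tp["time"] <= conversion_time]
    return max(clicks, key=lambda tp: tp["time"])["platform"]

print(platforms_claiming(touchpoints, conversion_time))  # both platforms claim it
print(last_click(touchpoints, conversion_time))          # tracker credits google only
```

Run the two attribution models over the same journey and the discrepancy falls straight out: the platforms' claims sum to two conversions, the tracker records one.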

Cross-Device Attribution Gaps

Modern customer journeys frequently span multiple devices. A user sees your TikTok ad on their iPhone during the morning commute. They search for the product on their work laptop at lunch and click a Google search ad. They convert on their home Mac that evening. This is a realistic and common sequence.

Your browser pixel, which sets a cookie on the device where the ad was clicked, cannot track this journey. The cookie set on the iPhone (if ATT allows it at all) is not readable on the laptop. The cookie set on the laptop is not readable on the Mac. Each device's session looks like an independent new visitor. The conversion on the Mac either gets attributed to the most recent referral source the Mac session had, or counts as direct, meaning no ad platform gets credit and your tracker shows a direct conversion.

Ad platforms handle cross-device attribution through logged-in user matching. Meta can link the iPhone app session, the desktop browser session, and the Mac session if the user is logged into Facebook on all three devices. Google can do the same with Google account logins. This is genuinely useful and is part of what makes platform attribution more complete than raw pixel data alone. But it also means the platforms have access to a matching graph that your independent tracker does not, and they use it to claim conversions that your tracker would show as direct or unattributed.

The result is a persistent class of conversions that platforms report but trackers don't. These are cross-device conversions where the connecting thread is the platform's identity graph. You cannot validate these. You cannot independently verify whether the connection they made was accurate. You have to take the platform's word for it, which brings us directly to the question of incentives.

Platform Incentives and Self-Serving Reporting

Every major ad platform has a direct financial incentive to report as many conversions as possible. More reported conversions means a lower reported CPA. A lower reported CPA means your campaign looks profitable. A profitable-looking campaign is one you will continue spending on, and ideally scale. The platform earns more revenue. This incentive structure is not a conspiracy. It is just how the economics work.

This creates a systematic pressure toward attribution choices that maximise reported conversion counts. Longer default attribution windows capture more conversions. View-through attribution adds a whole category of "conversions" that would not exist under click-only models. Cross-device matching claims credit for journeys the platform observed but didn't necessarily cause. Modelled conversions fill in data that iOS restrictions made invisible, using statistical estimates that are optimised to preserve the platform's historical conversion rates.

None of this is unique to any one platform. Meta, Google, TikTok, and every other major ad platform all operate under the same incentive structure. The differences are in degree. Meta's view-through attribution defaults are more aggressive than Google's. Google's cross-network attribution (claiming credit for conversions influenced by YouTube, Search, Display, and Gmail simultaneously) creates more overlap than Meta's more consolidated product set. TikTok's relatively shorter attribution windows make it less prone to this specific issue, but its modelled conversion estimates for iOS traffic have their own accuracy limitations.

The appropriate response is not to ignore platform data. It contains real optimisation signals. Treat it as directional rather than factual, and maintain an independent tracker as your canonical source of truth for budget decisions.

View-Through Attribution Inflation

View-through attribution (VTA) is one of the most significant sources of inflated conversion reporting. It is worth understanding in detail because its impact is often invisible if you are not looking for it.

Under view-through attribution, a user who is merely served one of your ads (scrolled past it in a feed, or had a video ad auto-play without any interaction) and who then converts within the attribution window is claimed by the platform as an ad-attributed conversion. The user does not need to have clicked. They do not need to have watched the video. The ad simply needs to have been delivered to their device within the lookback window.

Meta's default setting includes a 1-day view-through window. This means any user who was served your ad in the last 24 hours and then converted via any path, including direct URL, organic search, or email, is counted as an ad-attributed conversion in Meta's reporting. For high-reach campaigns with broad targeting, you may be serving your ad to hundreds of thousands of people per day. The statistical probability that some percentage of those people would have converted regardless of whether they saw the ad is extremely high. View-through attribution counts all of them.

The practical test for view-through inflation is the holdout experiment: pause your ads for 48 hours (after accounting for any urgency effects on conversion rate) and compare the conversion rate during the pause versus during active campaigns. If Meta was correctly attributing all those view-through conversions to genuine ad influence, you would expect conversion rate to drop significantly during the pause. If it stays roughly stable, a large portion of view-through conversions were people who would have converted anyway.
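The holdout comparison above reduces to a simple calculation. A minimal sketch, with hypothetical session and conversion counts:

```python
def holdout_lift(active_conversions, active_sessions,
                 paused_conversions, paused_sessions):
    """Compare conversion rate while ads were live vs during the pause.
    Returns both rates and the share of the live rate that disappears
    when ads stop -- a rough proxy for genuinely incremental conversions."""
    active_cr = active_conversions / active_sessions
    paused_cr = paused_conversions / paused_sessions
    lift = (active_cr - paused_cr) / active_cr
    return active_cr, paused_cr, lift

# Hypothetical figures: 300 conversions on 10,000 sessions while live,
# 270 on 10,000 sessions during the 48-hour pause.
active_cr, paused_cr, lift = holdout_lift(300, 10_000, 270, 10_000)
print(f"live {active_cr:.1%}, paused {paused_cr:.1%}, lift {lift:.0%}")
```

With these numbers, only about 10% of the live conversion rate vanishes when ads pause, which would suggest most view-through conversions were going to happen anyway. A real holdout should run long enough to smooth out day-of-week effects, but the arithmetic is the same.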

For most direct-response campaigns, disabling view-through attribution in your campaign settings gives you a significantly cleaner read on actual ad performance. The platform will report fewer conversions, but the conversions it reports will be more causally connected to your ad spend.

The iOS 14+ Impact on Reported Data

Apple's App Tracking Transparency framework, introduced in iOS 14.5 in April 2021, is the single most significant event in mobile advertising measurement of the past decade. It required apps to ask users for explicit permission before tracking them across other apps and websites. Global opt-in rates settled around 25%, meaning 75% of iOS users opted out.

For Meta specifically, whose pixel-based attribution depended heavily on the IDFA (Identifier for Advertisers) to match app events to ad impressions, this was catastrophic. The signal required for accurate mobile attribution disappeared for the majority of iOS users essentially overnight. Meta's response was to develop modelled conversions: a machine learning system that estimates conversion volumes based on historical data and the signals that remain available from the users who did opt in.

Modelled conversions are not fabricated. They are statistically grounded estimates. But they are estimates, and they are estimates optimised to preserve the platform's historical reporting patterns. If your campaign historically converted iOS users at 2.5% and Meta can only directly observe 25% of those conversions now, the model fills in the rest by extrapolation. The problem is that extrapolation assumes the unobserved users behave like the observed ones, which is not guaranteed. It also assumes Meta's model is correctly calibrated, which is difficult to independently verify.

The result is that a significant portion of Meta's reported conversion data for any campaign with substantial iOS traffic is modelled, not measured. The exact proportion is not disclosed. Meta's Events Manager shows whether events are "measured" or "modelled" for some views, but the campaign-level CPA you see in Ads Manager blends both without differentiation. You are likely making budget decisions on a number that is part measurement, part educated guess.

The mitigation is the Meta Conversions API with hashed first-party data, which improves event match quality and reduces the proportion of modelled conversions. But for advertisers who have not implemented CAPI, or whose funnels do not collect first-party identifiers, the modelling proportion remains high.

Why Independent Tracking Is the Solution

An independent click tracker has no stake in how many conversions it reports. It is not selling you ad inventory. It does not benefit from attributing more conversions to any particular source. Its entire value proposition is accurate, complete data, because that is what you are paying for. This alignment of incentives is fundamentally different from the ad platforms' relationship with their own measurement tools.

A tracker's conversion count is deterministic: a conversion is logged if and only if a postback or pixel fires with a matching click ID. There is no modelling, no view-through padding, and no cross-device inference applied without your explicit configuration. What you see is what was measured. If the tracker reports 91 conversions, there were 91 conversion events with valid click ID matches.
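The "if and only if" matching rule can be sketched in a few lines. This is a toy in-memory model (real trackers persist clicks in a database and receive postbacks over HTTP), but the deterministic logic is the point:

```python
# Toy deterministic tracker: a conversion is logged iff its click ID
# matches a recorded click. No modelling, no view-through inference.
clicks = {}        # click_id -> click metadata
conversions = []   # logged conversion events

def log_click(click_id, source):
    """Record a click the moment the redirect fires."""
    clicks[click_id] = {"source": source}

def handle_postback(click_id, revenue):
    """Log a conversion only when the click ID matches a known click."""
    click = clicks.get(click_id)
    if click is None:
        return False  # no matching click: not counted, period
    conversions.append({"click_id": click_id,
                        "source": click["source"],
                        "revenue": revenue})
    return True

log_click("abc123", "google")
handle_postback("abc123", 49.0)   # counted: click ID matches
handle_postback("zzz999", 49.0)   # ignored: no matching click
print(len(conversions))  # 1
```

Every number in the tracker's report is traceable to a concrete click-to-postback pair, which is exactly what makes it auditable in a way platform-modelled counts are not.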

This makes the tracker your canonical source of truth for ROI and budget allocation. The question "is this campaign profitable?" should be answered by your tracker's cost and revenue data, not by Meta's or Google's reported CPA. When a campaign shows positive ROI in your tracker, you have real signal to scale on. When it shows negative ROI, you have a real signal to investigate or cut, not an artefact of attribution window overlap or view-through padding.

Independent tracking is also essential for cross-platform budget allocation. If you are running spend on Meta, Google, and TikTok simultaneously and using each platform's native reporting to evaluate performance, you are comparing numbers generated by incompatible attribution systems. Only a neutral third-party tracker with consistent attribution logic across all sources gives you a valid apples-to-apples comparison.

How to Reconcile Platform vs Tracker Data

The goal of reconciliation is not to determine which data source is "right" in some absolute sense. It is to understand what each source is measuring and use each appropriately. Platform data and tracker data are both useful. They are just useful for different things.

Use your tracker data for: true ROI calculation, cross-platform budget allocation, creative performance comparisons, CPA benchmarking, and any decision that requires knowing how much money a campaign actually made relative to what it cost. These are decisions where accuracy matters more than signal volume.

Use your platform data for: campaign optimisation signals, audience insights, creative resonance metrics (thumbstop rate, hook rate, watch time), delivery and reach data, and feeding the platform's own algorithm. These are contexts where the platform has data you don't, its own user behaviour signals, and where directional accuracy is sufficient.

When reconciling the numbers directly, expect and accept a gap. A reasonable rule of thumb: if your tracker shows 80-90% of the conversions your ad platform reports (after excluding view-through and adjusting for window differences), your tracking is in reasonable health. If your tracker shows less than 60% of what the platform reports, investigate. You may have a postback configuration issue, a click ID drop in the funnel, or the platform may be heavily padded with modelled conversions.
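The rule of thumb above is easy to turn into a routine check. A small sketch, with the thresholds taken directly from the text:

```python
def tracking_health(tracker_conversions, platform_conversions):
    """Apply the reconciliation rule of thumb: tracker should capture
    roughly 80-90% of the platform's click-only conversion count."""
    ratio = tracker_conversions / platform_conversions
    if ratio >= 0.8:
        return ratio, "healthy"
    if ratio >= 0.6:
        return ratio, "acceptable, monitor"
    return ratio, "investigate: postback issue, click ID drop, or modelled padding"

# Hypothetical: tracker shows 91 conversions, platform (click-only,
# view-through excluded) reports 130.
ratio, status = tracking_health(91, 130)
print(f"{ratio:.0%}: {status}")
```

Remember to compare like with like before computing the ratio: the platform figure should already have view-through excluded and the window aligned, otherwise the ratio will look worse than your tracking actually is.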

To normalise the comparison: in your ad platform, switch the attribution window to match your tracker's model (typically last-click, 1-day or 7-day click only, view-through off). In Meta, this means using the "Compare attribution settings" feature to see conversion counts under click-only windows. This is not the default view. You have to select it. But it brings Meta's reported numbers meaningfully closer to what your tracker shows. The remaining gap is usually explained by cross-device matching that your tracker cannot replicate without platform identity graph access.

Conclusion

Ad platform data is not fraudulent. It is a product of attribution systems that are structurally designed to maximise reported conversion counts, through wide attribution windows, view-through attribution, cross-device inference, and modelled estimates to compensate for iOS data loss. Each of these mechanisms has a plausible rationale, and none of them individually is dishonest. But their combined effect is systematic over-reporting, and making scaling decisions based on platform-reported CPA alone is a reliable way to believe you are profitable when you are not.

The solution is not to ignore platform data. It contains real signals you need for optimisation. The solution is to maintain a clear hierarchy: your independent tracker is the authority on ROI and budget allocation, and platform data is the authority on delivery, audience, and creative performance. Running both together, with a clear understanding of what each measures, is the measurement foundation that serious performance advertising is built on.

ClickPattern provides the independent tracking layer that sits alongside your ad platforms, giving you deterministic, server-side attribution that is not subject to any platform's incentive to over-report. If you want to see what your campaigns actually look like with accurate data, book a demo and we'll show you how it works.

Ready to fix your tracking?

See how ClickPattern gives you accurate, server-side conversion data across every campaign.

Book a demo

Written by

Saud

Co-Founder, ClickPattern

Saud is the co-founder of ClickPattern. He writes about performance marketing, ad tracking, and building data infrastructure that actually works at scale.