What broke, and when
Digital attribution worked for about a decade. A user clicked a Facebook ad, a cookie tagged them, they came back a week later and converted, and the Meta Pixel proudly claimed the revenue. Last-click in a Google Analytics dashboard was broadly trustworthy. Marketers could defend budgets with screenshots.
Four things, stacked over five years, ended that era.
- iOS 14.5 (April 2021) introduced App Tracking Transparency. Over 70 percent of iOS users decline tracking. The IDFA became useless for attribution.
- Safari ITP caps first-party cookies set by JavaScript at 7 days and blocks third-party cookies outright. Chrome has moved more slowly but is heading in the same direction.
- Consent Mode and GDPR mean 20 to 40 percent of EU and AU users now decline tracking at the banner.
- Dark social (Slack, Discord, WhatsApp, podcasts, LinkedIn DMs) drives a huge share of high-intent traffic that shows up as Direct in GA4.
The net effect: a Meta Ads Manager showing AUD $100k in attributed revenue is probably reporting somewhere between $40k and $180k of real incremental revenue. Nobody knows. The error bars got wider every year.
Why multi-touch attribution did not save us
Multi-touch attribution (MTA) tools promised to fix last-click by distributing credit across the full journey. In theory, elegant. In practice, MTA still depends on deterministic cookie tracking across domains, which is exactly the thing that broke.
Most MTA tools in 2026 are re-weighted last-click, dressed up with a prettier UI. The underlying data is as lossy as GA4 or Meta's self-reported numbers. Paying AUD $3,000 a month for a "unified attribution platform" is mostly paying for a dashboard.
We are not saying do not use GA4. We are saying do not bet your budget decisions on it alone.
The new stack: four things that actually work
What we run for clients in 2026 is not one tool. It is a combination of four approaches, each answering a different question. None of them alone is enough.
1. Incrementality testing (geo holdouts)
Incrementality testing is the gold standard because it does not care about tracking. It cares about behaviour.
The setup: split your market into matched regions, turn paid media off in half of them for 4 to 8 weeks, and measure the revenue delta. If conversions drop 15 percent in the holdout regions while the test regions hold steady, your ads caused 15 percent of revenue. Real, testable truth. For an Australian retailer, that might mean running ads in Sydney and Brisbane while pausing them in Melbourne and Perth, then comparing.
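The arithmetic behind a holdout read is simple enough to sketch. Here is a minimal Python example (region groupings and revenue figures are made up for illustration) that normalises each group against its own pre-period baseline, then reads the gap as incremental lift:

```python
# Minimal geo-holdout readout. All figures are illustrative.
# test = regions where ads kept running, holdout = regions where ads were paused.

def incremental_lift(test, holdout):
    """Estimate the share of revenue caused by ads.

    test / holdout: dicts of weekly revenue per group,
    {"pre": [...], "during": [...]} around the holdout window.
    """
    def avg(xs):
        return sum(xs) / len(xs)

    # Normalise each group against its own pre-period so region size cancels out.
    test_ratio = avg(test["during"]) / avg(test["pre"])
    holdout_ratio = avg(holdout["during"]) / avg(holdout["pre"])

    # If the holdout dropped while the test group held steady,
    # the gap is the ads' doing.
    return 1 - holdout_ratio / test_ratio


# Sydney + Brisbane keep ads; Melbourne + Perth pause them for the window.
test = {"pre": [100_000, 102_000, 98_000], "during": [101_000, 99_000, 100_000]}
holdout = {"pre": [100_000, 101_000, 99_000], "during": [85_000, 86_000, 84_000]}

lift = incremental_lift(test, holdout)
print(f"Ads drove roughly {lift:.0%} of revenue in the holdout regions")
```

In this made-up example the holdout regions fall to 85 percent of baseline while the test regions hold steady, so ads drove roughly 15 percent of revenue.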
Meta's Conversion Lift is a lightweight version built into the platform, and it is free. Google's Conversion Lift is similar. For non-platform-specific tests, tools like Haus and INCRMNTAL automate geo experiments end-to-end. For SMBs, a manual holdout run once or twice a year is enough to calibrate everything else.
2. Media Mix Modelling (MMM), lightweight version
MMM used to be consultancy-only. Six-figure engagements, months of work, proprietary models. That changed when Meta open-sourced Robyn and Google released Meridian. Both are open-source MMM frameworks (Robyn uses regularised regression, Meridian is Bayesian) that take weekly spend by channel, weekly revenue, and optional external factors (seasonality, promotions, iOS events) and output a response and saturation curve per channel.
For a small business running Google Ads, Meta, LinkedIn, and organic content, two years of weekly data is enough for a useful model. You run it quarterly. You stop trusting Meta's self-reported ROAS and start trusting the model's estimate, which lands 30 to 60 percent lower and closer to reality.
The trap: MMM requires disciplined data. If your spend data is messy, if you do not track weekly revenue cleanly, if you change strategies every month, the model output will be noise. We spend the first month with a client cleaning the data pipeline before the first run.
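Robyn and Meridian have their own APIs, but the two transforms at the heart of most MMM codebases, carryover (adstock) and diminishing returns (saturation), can be sketched in a few lines. This is an illustration of the mechanics only, not either library's actual interface, and all numbers are made up:

```python
# Toy illustration of the two core MMM transforms.
# Not Robyn's or Meridian's API -- just the mechanics both frameworks model.

def adstock(spend, decay=0.5):
    """Carryover: part of each week's ad effect bleeds into later weeks."""
    carried, out = 0.0, []
    for s in spend:
        carried = s + decay * carried
        out.append(carried)
    return out

def saturate(x, half_sat=50_000, shape=2.0):
    """Hill-style diminishing returns: doubling spend never doubles effect."""
    return x**shape / (x**shape + half_sat**shape)

weekly_spend = [20_000, 40_000, 80_000, 80_000, 10_000]  # AUD, illustrative
effect = [saturate(a) for a in adstock(weekly_spend)]

for spend, e in zip(weekly_spend, effect):
    print(f"spend {spend:>7,} -> relative response {e:.2f}")
```

The fitted model's job is to learn the decay and saturation parameters per channel from your weekly data, which is why the data discipline above matters so much.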
3. Post-purchase surveys
The cheapest, highest-signal tool of the four. A single-question survey on the order confirmation page: "How did you hear about us?" with 6 to 10 options including Google, Facebook or Instagram, Friend or family, Podcast, TikTok, and Other.
On a DTC store, response rates are 40 to 70 percent. On a lead form thank-you page, 30 to 50 percent. The data is self-reported, so imperfect. But it is the only clean signal for dark social: the Slack recommendations, the podcast mentions, the mate who texted "try these guys". Tools like KnoCommerce for Shopify make this trivial to install.
We compare post-purchase survey attribution to GA4 channel attribution monthly. When they disagree substantially (for example, GA4 says 40 percent organic search but surveys say 12 percent), GA4 is wrong. It is catching branded search from people who heard about you elsewhere.
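That monthly comparison is a small script, not a platform. A Python sketch (channel names, shares, and the 10-point threshold are all illustrative assumptions) that flags channels where the two sources disagree:

```python
# Compare GA4 channel shares against post-purchase survey shares.
# All figures are illustrative, not from a real account.

ga4 = {"organic_search": 0.40, "paid_social": 0.15, "direct": 0.30, "referral": 0.15}
survey = {"organic_search": 0.12, "paid_social": 0.20, "podcast": 0.25,
          "friend_or_family": 0.28, "other": 0.15}

def disagreements(ga4_shares, survey_shares, threshold=0.10):
    """Channels where the two sources differ by more than `threshold`."""
    channels = set(ga4_shares) | set(survey_shares)
    flagged = {}
    for ch in channels:
        gap = ga4_shares.get(ch, 0.0) - survey_shares.get(ch, 0.0)
        if abs(gap) > threshold:
            flagged[ch] = round(gap, 2)
    return flagged

for channel, gap in sorted(disagreements(ga4, survey).items()):
    print(f"{channel}: GA4 minus survey = {gap:+.0%}")
```

A positive gap on organic search alongside a large survey share for podcast or word of mouth is the branded-search pattern described above.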
4. First-party CRM data
The quiet foundation. Every form fill, purchase, and email signup is a real event that happened on your infrastructure, independent of any tracking pixel.
Sending that data into your marketing platforms via server-side tagging (we wrote about SST on Gatsby and Next.js) and into the ad platforms via Meta CAPI and Google Enhanced Conversions restores 30 to 50 percent of the conversions that client-side pixels lose. It also lets you build lookalike audiences from actual customers rather than cookied visitors.
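The core of a server-side CAPI send is normalising and hashing user identifiers before they leave your infrastructure: Meta requires identifiers like email to be trimmed, lowercased, and SHA-256 hashed. A minimal Python sketch of building one event payload (the pixel ID, order values, and API version in the comment are placeholders, and a real send also needs an access token):

```python
import hashlib
import time

# Build a Meta Conversions API event payload server-side.
# PIXEL_ID is a placeholder for your real pixel ID.
PIXEL_ID = "YOUR_PIXEL_ID"

def hash_identifier(value):
    """Meta expects identifiers trimmed, lowercased, then SHA-256 hex hashed."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

def build_purchase_event(email, value_aud, order_id):
    return {
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "action_source": "website",
        "event_id": order_id,  # lets Meta dedupe against the browser pixel
        "user_data": {"em": [hash_identifier(email)]},
        "custom_data": {"currency": "AUD", "value": value_aud},
    }

event = build_purchase_event(" Jane.Doe@Example.com ", 129.95, "order-1042")
# POST {"data": [event]} to the Graph API events endpoint for PIXEL_ID,
# e.g. graph.facebook.com/<api-version>/<pixel-id>/events
print(event["event_name"], event["custom_data"]["value"])
```

Sharing the same `event_id` between the browser pixel and the server event is what lets Meta deduplicate the two, so you recover lost conversions without double counting.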
Keep every lead and customer in a first-party CRM. HubSpot, Customer.io, or Klaviyo depending on business shape (see our comparison). That CRM is the ground truth everything else calibrates against.
A practical stack for an SMB in 2026
For an AUD $50k to $500k per year marketing budget, the stack we recommend is:
- GA4 with Consent Mode v2 and server-side tagging
- Meta CAPI and Google Enhanced Conversions for platform-side measurement
- A first-party CRM as the ground truth
- KnoCommerce or similar for post-purchase surveys
- Robyn or Meridian run quarterly for budget allocation
- One Meta Conversion Lift test per year for incrementality calibration
Total tooling cost is AUD $150 to $400 per month plus internal time. The decisions it produces are dramatically better than a $3,000 per month MTA platform.
What to stop paying for
Worth saying plainly.
- "Unified attribution" SaaS that charges four figures monthly and is internally just re-weighted last-click.
- Third-party tracking scripts that add weight to your site without producing defensible numbers.
- Dashboards nobody opens. If a report does not change a decision, kill it.
Getting started
At CodeDrips, we help clients build an attribution stack that survives browser changes and drives better budget decisions. If your current dashboards contradict each other and nobody trusts the numbers, let's fix that together.


