
Your Analytics Are Double-Counting Revenue — And Nobody Noticed

February 23, 2026 · ScaledByDesign

Tags: analytics · attribution · revenue-tracking · ecommerce · data

The Math Doesn't Add Up

A DTC brand showed us their monthly performance report. According to their analytics:

  • Meta Ads attributed: $620K in revenue
  • Google Ads attributed: $410K in revenue
  • Email/SMS attributed: $280K in revenue
  • Organic/Direct attributed: $190K in revenue
  • Total attributed: $1.5M

Their actual Shopify revenue for that month: $890K.

Attributed revenue exceeded actual revenue by 68%. Every channel was taking credit for the same sales, and the brand was making budget decisions based on these inflated numbers.

This isn't an edge case. Across the DTC brands we audit, 15-30% revenue inflation from double-counting is the norm. Some are much worse.

How Double-Counting Happens

The Attribution Overlap Problem

Every ad platform uses its own attribution model, and every one of them is generous to itself:

Customer Journey Example:
  Day 1: Clicks Meta ad → browses, leaves
  Day 3: Searches on Google → clicks Google ad → browses, leaves
  Day 5: Opens email → clicks through → browses, leaves
  Day 7: Types URL directly → purchases $150

Who gets credit?
  Meta:     $150 (7-day click attribution window)
  Google:   $150 (30-day click attribution window)
  Klaviyo:  $150 (5-day click attribution)
  Direct:   $150 (last-touch in GA4)

  Total attributed: $600
  Actual revenue:   $150
  Inflation:        4x

Each platform reports the full purchase value as "their" conversion. Nobody deduplicates.
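The journey above can be sketched as code. This is a simplified model, not any platform's real logic: each platform claims the full order value whenever the purchase lands inside its own click window, with no cross-platform deduplication.

```typescript
// Simplified model of overlapping attribution windows.
// Each platform independently claims the full order value.
type Touch = { platform: string; day: number; windowDays: number };

function attributedTotal(
  touches: Touch[],
  purchaseDay: number,
  orderValue: number,
): number {
  return touches
    .filter((t) => purchaseDay - t.day <= t.windowDays)
    .reduce((sum) => sum + orderValue, 0);
}

const journey: Touch[] = [
  { platform: "meta", day: 1, windowDays: 7 },
  { platform: "google", day: 3, windowDays: 30 },
  { platform: "klaviyo", day: 5, windowDays: 5 },
  { platform: "direct", day: 7, windowDays: Infinity }, // last-touch in GA4
];

const total = attributedTotal(journey, 7, 150); // → 600
const inflation = total / 150; // → 4
```

Four platforms, one $150 order, $600 of reported revenue: exactly the 4x inflation in the table above.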

The Pixel Firing Problem

Even within a single platform, pixels can double-fire:

Common double-fire scenarios:
  1. Thank-you page reload → pixel fires twice
  2. Order confirmation email click → lands on thank-you page → pixel fires again
  3. Multiple pixels on same page (e.g., GA4 + Meta + TikTok all fire "purchase")
  4. Server-side AND client-side tracking both active → two events per purchase
  5. Test/staging orders not filtered → fake revenue counted
// The most common pixel implementation mistake
// This fires on EVERY page load of the thank-you page
// Including refreshes, back-button navigations, and email clicks
useEffect(() => {
  fbq("track", "Purchase", {
    value: order.total,
    currency: "USD",
  });
  // BUG: No deduplication. No check if already fired.
}, []);
 
// Correct implementation
useEffect(() => {
  const eventId = `purchase_${order.id}`;
 
  // Check if this specific purchase was already tracked
  if (sessionStorage.getItem(eventId)) return;
 
  fbq("track", "Purchase", {
    value: order.total,
    currency: "USD",
  }, { eventID: eventId }); // Meta uses this for dedup
 
  sessionStorage.setItem(eventId, "true");
}, [order.id]);

The Server-Side / Client-Side Overlap

If you're running both server-side and client-side tracking (which Meta's CAPI encourages), you need deduplication:

// Server-side (Conversions API)
await fetch("https://graph.facebook.com/v18.0/{pixel_id}/events", {
  method: "POST",
  body: JSON.stringify({
    data: [{
      event_name: "Purchase",
      event_time: Math.floor(Date.now() / 1000),
      event_id: `purchase_${order.id}`, // MUST match client-side eventID
      user_data: hashUserData(customer),
      custom_data: {
        value: order.total,
        currency: "USD",
      },
    }],
  }),
});
 
// Client-side (Pixel)
fbq("track", "Purchase", {
  value: order.total,
  currency: "USD",
}, { eventID: `purchase_${order.id}` }); // Same event_id = Meta deduplicates

Without matching event_id values, Meta counts both events as separate purchases.
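The server-side snippet above calls a `hashUserData` helper we left undefined. Here is one possible sketch: Meta's Conversions API expects identifiers like email (`em`) and phone (`ph`) to be normalized (trimmed and lowercased; phone reduced to digits) and then SHA-256 hashed. The `Customer` shape is a hypothetical example, not a real Shopify type.

```typescript
import { createHash } from "node:crypto";

// Sketch of a hashUserData helper for Meta's Conversions API.
// Identifiers are normalized before hashing so that client-side
// and server-side events hash to the same value.
interface Customer {
  email?: string;
  phone?: string;
}

const sha256 = (value: string): string =>
  createHash("sha256").update(value).digest("hex");

function hashUserData(customer: Customer): Record<string, string[]> {
  const data: Record<string, string[]> = {};
  if (customer.email) {
    // Normalize: trim whitespace, lowercase, then hash
    data.em = [sha256(customer.email.trim().toLowerCase())];
  }
  if (customer.phone) {
    // Normalize: strip everything except digits, then hash
    data.ph = [sha256(customer.phone.replace(/\D/g, ""))];
  }
  return data;
}
```

Normalization matters as much as hashing: `Test@Example.com ` and `test@example.com` must produce the same hash, or Meta can't match the user across events.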

How to Detect Double-Counting

Step 1: The Revenue Reconciliation

Pull total attributed revenue from every channel and compare against actual revenue:

-- Pull actual revenue from your source of truth (Shopify, Stripe, etc.)
SELECT
  DATE_TRUNC('month', created_at) AS month,
  SUM(total_price) AS actual_revenue,
  COUNT(*) AS order_count
FROM orders
WHERE financial_status = 'paid'
  AND created_at >= '2026-01-01'
GROUP BY 1
ORDER BY 1;

Revenue Reconciliation (example month):
  Channel          | Attributed  | % of Actual
  -----------------+-------------+------------
  Meta Ads         | $620K       | 70%
  Google Ads       | $410K       | 46%
  Email/SMS        | $280K       | 31%
  Organic/Direct   | $190K       | 21%
  -----------------+-------------+------------
  Total attributed | $1.5M       | 169%
  Actual (Shopify) | $890K       | 100%
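The reconciliation itself is simple arithmetic. A minimal sketch, using the example figures from this post (the `reconcile` helper is illustrative, not part of any platform's API):

```typescript
// Compare platform-attributed revenue against actual revenue
// from your source of truth (the SQL query above).
function reconcile(attributed: Record<string, number>, actual: number) {
  const rows = Object.entries(attributed).map(([channel, revenue]) => ({
    channel,
    revenue,
    pctOfActual: Math.round((revenue / actual) * 100),
  }));
  const total = rows.reduce((sum, r) => sum + r.revenue, 0);
  return { rows, total, inflationRatio: total / actual };
}

const result = reconcile(
  { meta: 620_000, google: 410_000, email: 280_000, organic: 190_000 },
  890_000,
);
// result.inflationRatio ≈ 1.69 — attributed revenue is ~169% of actual
```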

Step 2: The Order-Level Audit

Pick 50 random orders and trace each one through every analytics platform:

// Order-level attribution audit script
async function auditOrder(orderId: string) {
  const order = await shopify.getOrder(orderId);

  const attributions = {
    meta: await metaAds.getConversions({ orderId }),
    google: await googleAds.getConversions({ orderId }),
    klaviyo: await klaviyo.getAttributedRevenue({ orderId }),
    ga4: await ga4.getConversion({ transactionId: orderId }),
  };

  const totalAttributed = Object.values(attributions)
    .reduce((sum, attr) => sum + (attr?.revenue || 0), 0);

  return {
    orderId,
    actualRevenue: order.total,
    totalAttributed,
    inflationRatio: totalAttributed / order.total,
    platforms: attributions,
  };
}

If the average inflation ratio is above 1.5x, you have a serious problem.
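Aggregating the sample is straightforward. A sketch that consumes the per-order results returned by `auditOrder` above and applies the 1.5x threshold:

```typescript
// Shape matches the object returned by auditOrder above.
interface AuditResult {
  orderId: string;
  actualRevenue: number;
  totalAttributed: number;
  inflationRatio: number;
}

// Average the inflation ratio across the sampled orders.
// Above 1.5x, double-counting is systemic, not noise.
function summarizeAudit(results: AuditResult[]) {
  const avg =
    results.reduce((sum, r) => sum + r.inflationRatio, 0) / results.length;
  return {
    sampleSize: results.length,
    avgInflationRatio: avg,
    seriousProblem: avg > 1.5,
  };
}
```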

The Attribution Model That Works

Stop relying on platform-reported attribution. Build a single source of truth:

Option 1: First-Party Attribution (Recommended)

Track the customer journey yourself using first-party data:

// First-party attribution tracking
const attributionTracker = {
  // Track every touchpoint in YOUR system
  trackTouch: (event: TouchEvent) => {
    const touchpoint = {
      timestamp: Date.now(),
      sessionId: getSessionId(),
      userId: getUserId(),
      source: event.utmSource || "direct",
      medium: event.utmMedium || "none",
      campaign: event.utmCampaign || "none",
      channel: classifyChannel(event),
      landingPage: event.url,
    };
 
    appendToJourney(touchpoint);
  },
 
  // When purchase happens, attribute to the FULL journey
  attributePurchase: (order: Order) => {
    const journey = getCustomerJourney(order.customerId);
 
    return {
      orderId: order.id,
      revenue: order.total,
      firstTouch: journey[0],           // Discovery channel
      lastTouch: journey[journey.length - 1], // Conversion channel
      allTouches: journey,              // Full journey
      // Custom model: weight by position and recency
      attribution: calculateWeightedAttribution(journey, order.total),
    };
  },
};
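The `calculateWeightedAttribution` helper above is left undefined. One possible implementation is a U-shaped model: 40% credit to the first touch, 40% to the last, and 20% split across the middle. The weighting scheme is an illustrative assumption, not the only valid model.

```typescript
interface Touchpoint {
  channel: string;
}

// U-shaped attribution: 40% first touch, 40% last touch,
// 20% split evenly across middle touches. Single- and two-touch
// journeys are handled as special cases so credit always sums
// to the full order value.
function calculateWeightedAttribution(
  journey: Touchpoint[],
  revenue: number,
): Record<string, number> {
  const credit: Record<string, number> = {};
  const add = (channel: string, amount: number) => {
    credit[channel] = (credit[channel] ?? 0) + amount;
  };

  if (journey.length === 1) {
    add(journey[0].channel, revenue);
    return credit;
  }
  if (journey.length === 2) {
    add(journey[0].channel, revenue * 0.5);
    add(journey[1].channel, revenue * 0.5);
    return credit;
  }

  add(journey[0].channel, revenue * 0.4);
  add(journey[journey.length - 1].channel, revenue * 0.4);
  const middle = journey.slice(1, -1);
  for (const touch of middle) {
    add(touch.channel, (revenue * 0.2) / middle.length);
  }
  return credit;
}
```

Whatever model you pick, the invariant that matters is that credit sums to actual revenue, never more.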

Option 2: Incrementality Testing

The gold standard for attribution accuracy — run experiments to measure actual lift:

Incrementality Test Setup:
  1. Split audience randomly: 50% test, 50% control
  2. Test group:    Sees Meta ads as normal
  3. Control group: Sees PSA or no ads (Meta holdout)
  4. Measure:       Conversion rate difference = true incremental lift

  Results example:
    Test group conversion:    2.4%
    Control group conversion: 1.8%
    Incremental lift:         0.6% (25% of conversions are truly incremental)
    Meta's reported ROAS:     4.2x
    Actual incremental ROAS:  1.05x  ← Very different number

This is uncomfortable data. But it's the truth.
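The arithmetic behind that example can be sketched as follows. It assumes platform-reported ROAS scales linearly with the incremental fraction of conversions, which is a simplification but matches the example above.

```typescript
// Derive incremental ROAS from a holdout test.
// Incremental fraction = (test CVR - control CVR) / test CVR;
// incremental ROAS = reported ROAS scaled by that fraction.
function incrementalRoas(
  testCvr: number,
  controlCvr: number,
  reportedRoas: number,
) {
  const lift = testCvr - controlCvr;
  const incrementalFraction = lift / testCvr;
  return {
    lift,
    incrementalFraction,
    incrementalRoas: reportedRoas * incrementalFraction,
  };
}

const r = incrementalRoas(0.024, 0.018, 4.2);
// r.incrementalFraction ≈ 0.25, r.incrementalRoas ≈ 1.05
```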

Fixing Your Tracking Stack

Here's the remediation plan we implement for clients:

Week 1: Audit
  → Run revenue reconciliation across all platforms
  → Audit pixel implementations for double-fires
  → Check server-side / client-side deduplication

Week 2: Fix
  → Add event_id deduplication to all purchase events
  → Fix thank-you page pixel firing (fire once, verify)
  → Standardize UTM parameters across all channels

Week 3: Build
  → Implement first-party attribution tracking
  → Set up order-level attribution dashboard
  → Create single source of truth for revenue attribution

Week 4+: Validate
  → Run incrementality tests on top 2-3 channels
  → Compare first-party attribution to platform attribution
  → Adjust budgets based on real data

The brands making the best budget decisions aren't the ones with the fanciest attribution models. They're the ones who know their data is clean, deduplicated, and reconciled against actual revenue.

Stop trusting platform-reported numbers. Start counting your money yourself.
