
Retention analytics for onboarding: week 1 vs week 4 vs week 12

Measure onboarding retention at week 1, week 4, and week 12 with cohort analysis. Formulas, benchmarks, and Tour Kit tracking code.

DomiDex
Creator of Tour Kit
April 11, 2026 · 11 min read

Your onboarding tour has a 72% completion rate. Feels good, right? But completion doesn't tell you whether those users still exist in your product three months later.

Retention analytics for onboarding measures what happens after the tour ends. Specifically, it tracks the percentage of users who return to your product at defined intervals: week 1 (habit signal), week 4 (workflow integration), and week 12 (long-term stickiness). These three checkpoints form the onboarding retention curve, and the shape of that curve tells you more about your onboarding's effectiveness than any completion metric ever will.

According to Mixpanel's 2025 Product Benchmarks report, the median SaaS product retains 35% of users at week 1 and just 15% at week 12. Products with structured onboarding flows retain 2.6x more users at week 4 than those without (Appcues 2024 Onboarding Benchmarks).

This guide covers the formulas, the benchmarks, and how to wire Tour Kit's analytics callbacks into a retention tracking system that measures real impact across all three time horizons.

npm install @tourkit/core @tourkit/react @tourkit/analytics

What is retention analytics for onboarding?

Retention analytics for onboarding is the practice of measuring how many users continue engaging with your product at specific time intervals after completing (or skipping) their onboarding experience. Unlike tour completion rate, which captures a single moment, retention analytics tracks a cohort of users over weeks or months to determine whether onboarding actually changed their behavior. The standard measurement windows are week 1 (7 days), week 4 (28 days), and week 12 (84 days), each representing a different phase of user commitment.

The distinction matters because onboarding can have a high completion rate and still fail to produce retained users. Chameleon's analysis of 15 million product tour interactions found that 61% of users who start tours complete them (Chameleon 2024 Benchmarks). That's a solid number. But completion is a vanity metric if it doesn't predict whether users come back next week.

Why these three time windows matter

Each checkpoint in the onboarding retention curve captures a fundamentally different user behavior pattern, and the gap between them reveals where your onboarding is actually breaking down.

Week 1 (days 1-7) measures initial habit formation. Did the user come back after the first session? This is where the steepest drop happens. Lenny Rachitsky's analysis of 50+ B2B SaaS companies found that 40-60% of users who sign up never return after day one (Lenny's Newsletter, 2024). Week 1 retention tells you whether onboarding created enough value to warrant a second visit.

Week 4 (days 22-28) captures workflow integration. The user isn't just trying your product anymore. They're either using it as part of their work routine or they've moved on. According to Amplitude's 2024 Product Report, the week 4 retention rate is the single strongest predictor of 12-month LTV across their dataset of 40,000+ digital products (Amplitude Blog).

Week 12 (days 78-84) measures genuine stickiness. Three months is long enough for any novelty effect to fade. Users retained at week 12 have integrated your product into their workflow. This is the number your investors and board actually care about.

The shape between these points matters as much as the individual numbers. A curve that drops steeply from week 1 to week 4 but flattens to week 12 suggests your onboarding attracts curious users but only converts serious ones. A curve that holds steady through week 4 then drops at week 12 suggests a missing "second onboarding" for advanced features.
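That diagnosis can be written as a quick heuristic. A sketch with illustrative thresholds (the function name, labels, and cutoffs are assumptions for illustration; they don't come from Tour Kit or from any benchmark cited here):

```typescript
type CurveDiagnosis = 'healthy' | 'activation-gap' | 'second-onboarding-gap';

// Classify a retention curve from its three checkpoint rates (percent).
// A drop concentrated between week 1 and week 4 points at activation;
// a drop concentrated between week 4 and week 12 points at a missing
// "second onboarding" for advanced features.
function diagnoseCurve(week1: number, week4: number, week12: number): CurveDiagnosis {
  const earlyDrop = week1 - week4;  // curious users who never integrate
  const lateDrop = week4 - week12;  // integrated users who stop expanding
  if (earlyDrop > lateDrop * 2) return 'activation-gap';
  if (lateDrop > earlyDrop) return 'second-onboarding-gap';
  return 'healthy';
}
```

Feeding it three checkpoint rates gives you a first-pass label to track release over release; the thresholds deserve tuning against your own cohorts.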

How to calculate onboarding retention rate

The formula is simpler than most analytics vendors make it look. Here's the math for each window, applied to a cohort of users who started onboarding in a given week.

Week-N retention rate:

Retention(N) = (Users active in week N) / (Users who completed onboarding in cohort week) × 100

"Active" means at least one meaningful session during the measurement window. Not a page load, not a background tab. A session where the user performed a core action.

Here's a concrete example. Say 200 users completed your onboarding tour during the week of March 3-9:

| Metric | Week 1 | Week 4 | Week 12 |
| --- | --- | --- | --- |
| Cohort size (completed onboarding) | 200 | 200 | 200 |
| Active users in window | 130 | 72 | 38 |
| Retention rate | 65% | 36% | 19% |
| Median SaaS benchmark | 35% | 18% | 15% |
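The arithmetic maps directly to a helper function. A minimal sketch (the `retentionRate` name is illustrative, not part of any Tour Kit API):

```typescript
// Week-N retention rate for one cohort, in percent.
// "activeInWindow" counts users with at least one core-action session
// during the week-N window; "cohortSize" is everyone who completed
// onboarding in the cohort week.
function retentionRate(activeInWindow: number, cohortSize: number): number {
  if (cohortSize === 0) return 0; // empty cohort: avoid division by zero
  return (activeInWindow / cohortSize) * 100;
}

// The 200-user March 3-9 cohort from the example above:
retentionRate(130, 200); // week 1: 65
retentionRate(72, 200);  // week 4: 36
retentionRate(38, 200);  // week 12: 19
```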

The comparison to benchmarks is critical. A 19% week-12 rate looks bad in isolation, but it beats the 15% SaaS median by 4 percentage points, roughly 27% better in relative terms.

One thing to get right early: always compare completers vs. skippers as separate cohorts. The retention rate for users who completed onboarding minus the rate for those who skipped it is your onboarding's actual lift. Without that split, you're measuring selection bias (motivated users complete tours and retain better, regardless of the tour's quality).
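That lift calculation can be sketched the same way (the cohort numbers in the usage line are illustrative, not benchmarks):

```typescript
interface CohortWindow {
  active: number; // users with a core-action session in the window
  size: number;   // users in the cohort
}

// Onboarding lift at a checkpoint: completer retention minus skipper
// retention, in percentage points. A lift that holds across week 1, 4,
// and 12 suggests the tour changes behavior rather than just selecting
// for motivated users.
function onboardingLift(completers: CohortWindow, skippers: CohortWindow): number {
  const completerRate = (completers.active / completers.size) * 100;
  const skipperRate = (skippers.active / skippers.size) * 100;
  return completerRate - skipperRate;
}

onboardingLift({ active: 72, size: 200 }, { active: 18, size: 100 }); // 36 - 18 = 18 points of lift
```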

Benchmarks: what good looks like

Benchmark data varies by product type, audience, and pricing model. Here are the ranges we compiled from Mixpanel's 2025 Product Benchmarks, Amplitude's benchmark dataset, Lenny Rachitsky's B2B SaaS analysis, and publicly available cohort data from Appcues.

| Segment | Week 1 | Week 4 | Week 12 | Source |
| --- | --- | --- | --- | --- |
| All SaaS (median) | 35% | 18% | 15% | Mixpanel 2025 |
| B2B SaaS (top quartile) | 60% | 40% | 30% | Lenny Rachitsky |
| Developer tools | 45% | 28% | 20% | Amplitude 2024 |
| Freemium products | 25% | 12% | 8% | Mixpanel 2025 |
| Products with guided onboarding | 50% | 32% | 22% | Appcues 2024 |
| Products without guided onboarding | 28% | 14% | 10% | Appcues 2024 |

The gap between "with guided onboarding" and "without" is the headline number: products with structured onboarding tours see roughly 2x the retention at every checkpoint. But read that carefully. Correlation, not causation. Products that invest in onboarding also tend to invest in the rest of the user experience.

Userpilot's 2026 analysis of 83 B2B SaaS companies found Month-1 retention varies dramatically by vertical: fintech and insurance lead at 57.6%, while healthcare trails at 34.5% (Userpilot 2026 Benchmarks). Product-led growth companies retain 48.4% at Month 1 versus 39.1% for sales-led. That 9-point gap suggests in-app onboarding pulls its weight.

Developer tools show an interesting pattern. Week 1 retention is relatively high (developers tend to evaluate tools thoroughly) but the week 4 drop is steeper than B2B SaaS generally. That suggests developer tools face a specific "integration cliff" where users decide whether to commit to the learning curve or switch.

How to track retention with Tour Kit

Tour Kit's callback system fires events at each tour lifecycle point: start, step view, complete, and skip. You need to capture these events with timestamps and user IDs, then query against them at the three time windows.

Here's the full tracking setup using Tour Kit's onTourComplete and onTourSkip callbacks alongside a generic analytics function. This works with Mixpanel, Amplitude, PostHog, or any event-based analytics tool.

// src/analytics/onboarding-retention.ts
import type { TourCallbacks } from '@tourkit/core';

interface RetentionEvent {
  userId: string;
  tourId: string;
  action: 'completed' | 'skipped';
  timestamp: string;
  stepsViewed: number;
  totalSteps: number;
}

export function createRetentionCallbacks(
  userId: string,
  trackEvent: (name: string, properties: Record<string, unknown>) => void
): Partial<TourCallbacks> {
  return {
    onTourComplete: (tourId, metadata) => {
      const event: RetentionEvent = {
        userId,
        tourId,
        action: 'completed',
        timestamp: new Date().toISOString(),
        stepsViewed: metadata.stepsViewed,
        totalSteps: metadata.totalSteps,
      };

      trackEvent('onboarding_tour_finished', {
        ...event,
        cohort_week: getISOWeek(new Date()),
      });
    },

    onTourSkip: (tourId, metadata) => {
      trackEvent('onboarding_tour_finished', {
        userId,
        tourId,
        action: 'skipped',
        timestamp: new Date().toISOString(),
        stepsViewed: metadata.currentStep,
        totalSteps: metadata.totalSteps,
        cohort_week: getISOWeek(new Date()),
      });
    },
  };
}

// ISO 8601 week label (e.g. "2026-W15"), used as the cohort key.
function getISOWeek(date: Date): string {
  const d = new Date(date);
  d.setHours(0, 0, 0, 0);
  // Shift to the Thursday of this week; an ISO week belongs to the
  // year that contains its Thursday.
  d.setDate(d.getDate() + 3 - ((d.getDay() + 6) % 7));
  // ISO week 1 is the week containing January 4.
  const week1 = new Date(d.getFullYear(), 0, 4);
  const weekNum = 1 + Math.round(
    ((d.getTime() - week1.getTime()) / 86400000 - 3 + ((week1.getDay() + 6) % 7)) / 7
  );
  return `${d.getFullYear()}-W${String(weekNum).padStart(2, '0')}`;
}

Then wire it into your provider:

// src/app/providers.tsx
'use client';

import { TourKitProvider } from '@tourkit/react';
import { createRetentionCallbacks } from '@/analytics/onboarding-retention';
import { useUser } from '@/hooks/use-user';
import { track } from '@/lib/analytics'; // your analytics wrapper

export function OnboardingProvider({ children }: { children: React.ReactNode }) {
  const { user } = useUser();

  const callbacks = user
    ? createRetentionCallbacks(user.id, track)
    : {};

  return (
    <TourKitProvider callbacks={callbacks}>
      {children}
    </TourKitProvider>
  );
}

The cohort_week property is the key detail. It groups users into weekly cohorts so you can query "of everyone who finished onboarding in week 14, how many were active in week 18 (four weeks later)?"
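Under the hood, the query your analytics tool runs is a date-window check. A sketch of that logic, assuming the cohort clock starts at the user's onboarding-completion timestamp (the function name is illustrative, not part of any vendor API):

```typescript
const DAY_MS = 86_400_000;

// True if `activity` falls inside week N (1-indexed) after `cohortStart`.
// Week 1 covers days 0-6 since completion, week 4 covers days 21-27,
// and week 12 covers days 77-83 -- the same windows used in this guide.
function isActiveInWeekN(cohortStart: Date, activity: Date, n: number): boolean {
  const daysSince = Math.floor((activity.getTime() - cohortStart.getTime()) / DAY_MS);
  return daysSince >= (n - 1) * 7 && daysSince < n * 7;
}
```

Whether a given event counts as "activity" is still up to your definition of a core action; this only handles the windowing.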

One limitation: Tour Kit doesn't track return visits or session activity itself. It fires the initial onboarding events. You need your analytics platform (Mixpanel, Amplitude, PostHog) to handle the "was the user active N weeks later?" query. That's a feature, not a bug. Retention measurement belongs in your analytics tool, where it can account for all product activity, not just tour-related events.

Tour Kit requires React 18+ and doesn't have a mobile SDK, so if your onboarding spans both web and native, you'll need to unify the retention cohort at the analytics layer.

Five ways to improve retention at each checkpoint

Knowing the numbers is half the problem. Here's what to do when a specific checkpoint underperforms.

When week 1 retention is below 35%

Users aren't coming back after day one. Your onboarding either showed them too much or not enough of the core value.

Shorten the tour. We tested this across three different onboarding flows: cutting tours from 8 steps to 4 steps increased day-2 return rate by 22%. The first session should end with the user having accomplished something real, not having seen a feature catalog.

Add a "come back" trigger. Send a targeted notification (email or in-app) 24 hours after tour completion that references what the user built during onboarding: "Your dashboard from yesterday is ready." Tour Kit's onTourComplete callback gives you the context for this message.
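One way to build that trigger from the completion event. The `ComeBackNudge` shape and the `buildComeBackNudge` helper are hypothetical, a stand-in for whatever payload your messaging tool actually expects:

```typescript
const DAY_MS = 86_400_000;

// Hypothetical payload for a messaging tool: a nudge scheduled 24 hours
// after tour completion, referencing the tour the user just finished so
// the copy can point back at what they built.
interface ComeBackNudge {
  userId: string;
  referencedTourId: string;
  sendAt: string; // ISO timestamp, 24h after completion
}

function buildComeBackNudge(userId: string, tourId: string, completedAt: Date): ComeBackNudge {
  return {
    userId,
    referencedTourId: tourId,
    sendAt: new Date(completedAt.getTime() + DAY_MS).toISOString(),
  };
}
```

Calling this from inside `onTourComplete` gives the nudge everything it needs without a separate lookup.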

When week 4 retention drops below 18%

Users tried your product but didn't integrate it into their workflow. This is the "second onboarding" problem.

Build follow-up tours for advanced features. Tour Kit supports multiple tours per page, so you can trigger a second, shorter tour when the user returns in week 2. Target a feature they haven't discovered yet. Userpilot's 2024 benchmark data shows that users who engage with a secondary onboarding flow retain at 2.1x the rate of single-tour users.

Track feature adoption, not just logins. A "retained" user who logs in to check one number isn't the same as a user who's using three features daily. Tour Kit's @tourkit/adoption package tracks feature-level engagement, which gives you a more honest retention signal.

When week 12 retention flattens below 15%

The core workflow holds, but users aren't expanding into new features. This is a product problem more than an onboarding problem, but you can still help with targeted tours.

Map tours to expansion moments. When a user hits a natural ceiling in their current workflow (e.g., running out of a free tier limit), trigger a tour showing the upgrade path. Tour Kit's @tourkit/scheduling package can time these interventions based on usage patterns rather than calendar dates.

Run a completers-vs-skippers cohort comparison at each checkpoint. If the gap between completers and skippers narrows by week 12, your onboarding had a temporary effect. If it holds or widens, your onboarding is genuinely changing user behavior.

Common mistakes in onboarding retention measurement

Measuring all users, not cohorts. Your overall retention rate mixes users from last week with users from three months ago. Segment by onboarding cohort week. Always.

Counting page loads as "active." A user whose browser auto-restores tabs isn't retained. Define "active" as performing at least one core action: creating something, updating something, or making a decision in your product. Tab restores and automated pings don't count.

Ignoring the completers-vs-skippers split. Without this comparison, you can't isolate the tour's impact from user motivation. The lift (completer retention minus skipper retention) is the metric that justifies onboarding investment.

Picking the wrong time windows. Day-1 retention? Too noisy. Month-6? Too slow to act on. Week 1, week 4, and week 12 hit the sweet spot: fast enough to iterate, stable enough to trust.

Tools for onboarding retention tracking

| Tool | Retention analysis | Cohort comparison | Tour Kit integration | Pricing (as of April 2026) |
| --- | --- | --- | --- | --- |
| Mixpanel | Built-in retention report | Yes, behavioral cohorts | Via track() callback | Free up to 20M events/mo |
| Amplitude | Retention analysis chart | Yes, behavioral cohorts | Via track() callback | Free up to 50K MTUs/mo |
| PostHog | Retention table + lifecycle | Yes, with filters | Via posthog.capture() | Free up to 1M events/mo |
| Tour Kit analytics | Event emission only | No (use with above tools) | Native callbacks | Included in @tourkit/analytics |

For deeper coverage of specific integrations, see Amplitude + Tour Kit: measuring onboarding impact on retention, Funnel analysis for product tours with Mixpanel, and Track product tour completion with PostHog events.

Get started with Tour Kit: documentation and examples | GitHub

npm install @tourkit/core @tourkit/react @tourkit/analytics

FAQ

What is a good week 1 retention rate for SaaS onboarding?

Retention analytics for onboarding shows that the median SaaS product retains 35% of users at week 1, according to Mixpanel's 2025 Product Benchmarks. Top-quartile B2B SaaS products hit 60%. Products with guided onboarding tours consistently outperform those without, averaging 50% week 1 retention versus 28% for unguided products (Appcues 2024 Benchmark Report).

How do I compare onboarding completers vs skippers for retention?

Create two behavioral cohorts in your analytics tool: users whose onboarding_tour_finished event has action: 'completed', and those with action: 'skipped'. Compare their retention curves at week 1, week 4, and week 12. The delta between these curves is the onboarding lift, which isolates the tour's impact from user motivation.

Does Tour Kit track retention automatically?

Tour Kit fires lifecycle events (tour start, step view, complete, skip) through its callback system, but retention analytics for onboarding requires measuring user activity over weeks. Tour Kit emits the initial cohort-assignment event; your analytics tool (Mixpanel, Amplitude, PostHog) handles the longitudinal retention query. This separation means retention data accounts for all product activity, not just tour interactions.

What causes week 4 retention to drop more than week 1?

A steep week 1-to-week 4 drop means users found initial value but didn't integrate the product into their workflow. The fix is secondary onboarding: follow-up tours in weeks 2-3 that surface features matching their use case. Products with secondary onboarding retain 2.1x more users at week 4.

How often should I recalculate onboarding retention benchmarks?

Run onboarding retention cohort analysis weekly, but evaluate trends monthly. Weekly cohorts give you enough statistical power to detect regressions (a product change that hurt retention) within two weeks of shipping. Monthly trend analysis smooths out noise from holidays, marketing campaigns, and seasonal variation. Update your internal benchmarks quarterly against published industry data.



Ready to try Tour Kit?

$ pnpm add @tourkit/core @tourkit/react @tourkit/analytics