Stop Letting Bot Clicks Lie To You: How To Build a ‘Human‑Only’ Click Metric For Every Short Link
You are not imagining it. Seeing 4,000 clicks in your link shortener and only 1,300 users in GA4 is enough to make any campaign report feel broken. Then someone asks, “So which number is right?” and suddenly a simple performance update turns into a trust exercise. The messy truth is that both tools can be telling the truth at the same time. Short links count every hit to the redirect. GA4 only counts what makes it through to a tracked page visit, with JavaScript running, cookies allowed, and the user not blocked by privacy tools. Add in email security scanners, chat app link previews, social prefetching, and bot traffic, and your click total starts getting padded by machines before a human ever lands. The fix is not to chase one perfect number. The fix is to create a separate, repeatable “human-only” click metric that gives you a cleaner baseline for campaign decisions and stakeholder reporting.
⚡ In a Hurry? Key Takeaways
- Shortener clicks and GA4 sessions measure different things, so they will never match exactly.
- Start a “human-only” metric by filtering out known bots, link previews, security scanners, and ultra-fast duplicate hits.
- You do not need new tools or a data team. Clear UTMs, a few rules, and consistent reporting can clean this up fast.
Why link shortener clicks not matching Google Analytics is so common
This problem shows up everywhere, but it gets especially ugly in paid social and email.
Why? Because those channels are full of non-human clicks that still look like activity to a shortener. Mail clients generate previews. Security tools test links before delivery. Messaging apps fetch the page to build a thumbnail. Social platforms sometimes pre-load a destination to speed up the experience.
Your shortener sees the redirect request and counts it. Fair enough.
GA4 is stricter. It usually needs the landing page to load, the analytics tag to fire, and the browser to allow tracking. If the visit stops at the redirect, blocks scripts, or never behaves like a real page view, GA4 may never count it.
That is why “link shortener clicks not matching Google Analytics” is not a bug by default. It is often a measurement gap between two different stages of the same journey.
What each tool is actually counting
Short link platforms
A link shortener usually counts a click when someone, or something, requests the short URL. That includes:
- Humans on phones and laptops
- Bots
- Email security scanners
- Slack or Teams previews
- Social app prefetching
- Duplicate refreshes
- Sometimes even monitoring systems
GA4
GA4 usually counts users and sessions only after the person reaches the destination page and the analytics setup works correctly. That means GA4 can miss traffic when:
- JavaScript is blocked
- Consent is declined
- Ad blockers stop the tag
- The page loads too slowly
- The redirect chain breaks UTMs
- The “visitor” was never a person to begin with
The practical takeaway
Your shortener is top-of-funnel. GA4 is post-landing. They are related, but not identical.
Stop trying to force a perfect match
The wrong move is trying to beat the numbers into agreement.
The better move is to define three reporting layers:
1. Raw clicks
Everything the shortener recorded. Good for infrastructure volume and early warning signs.
2. Human-only clicks
Your cleaned-up click count after filtering obvious machine traffic and suspicious patterns.
3. Landed visits
GA4 users or sessions that actually made it to the page and got tracked.
Once you split reporting this way, the conversation gets easier. Raw clicks show total demand and noise. Human-only clicks estimate real interest. Landed visits show what reached the site successfully.
How to build a human-only click metric in an afternoon
You do not need a fancy fraud stack for this. You need a clear rule set.
Step 1: Keep your UTMs clean first
If your naming is sloppy, cleanup gets much harder. Before you touch bot filtering, make sure campaign tagging is consistent across teams. This is exactly why Stop Letting Random People Build Your UTMs: How To Take Back Control Of Short Links Across Your Whole Team is worth a read. If one team uses “paid-social” and another uses “paidsocial,” your reports start arguing before the traffic even arrives.
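One lightweight way to enforce consistency is a canonical map that collapses known spelling variants into one value before any reporting happens. This is a minimal sketch; the variant list and canonical spellings here are illustrative assumptions, not a standard.

```python
import re

# Hypothetical canonical map: keys are variants seen in the wild,
# values are the single spelling your reports should use.
CANONICAL_SOURCES = {
    "paidsocial": "paid-social",
    "paid_social": "paid-social",
    "paid-social": "paid-social",
    "newsletter": "email",
    "email": "email",
}

def normalize_utm_source(raw: str) -> str:
    """Lowercase, strip whitespace, and map known variants to one canonical value."""
    key = re.sub(r"\s+", "", raw.strip().lower())
    # Unknown values pass through unchanged so you can spot them in reports.
    return CANONICAL_SOURCES.get(key, key)

print(normalize_utm_source("Paid_Social"))  # -> paid-social
```

Run this over every exported row before filtering, so “paid-social” and “paidsocial” stop showing up as two different campaigns.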
Step 2: Export raw click logs from your shortener
In Redirect My… or any similar platform, pull the most detailed click data available. You want fields like:
- Timestamp
- User agent
- IP or hashed IP if available
- Referrer
- Country
- Device type
- Destination URL
- UTM values
- Redirect response details if offered
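A simple record type keeps the rest of the cleanup code honest about which fields it expects. The column names below are assumptions; every shortener labels its CSV export differently, so rename them to match yours.

```python
import csv
import io
from dataclasses import dataclass, fields

# Illustrative field names -- adjust to your shortener's actual export headers.
@dataclass
class Click:
    timestamp: str
    user_agent: str
    ip: str
    referrer: str
    country: str
    device: str
    destination: str
    utm_source: str

def load_clicks(csv_text: str) -> list:
    """Parse an exported CSV into Click records, defaulting missing columns to ''."""
    reader = csv.DictReader(io.StringIO(csv_text))
    names = [f.name for f in fields(Click)]
    return [Click(**{n: row.get(n, "") for n in names}) for row in reader]

sample = (
    "timestamp,user_agent,ip,referrer,country,device,destination,utm_source\n"
    "2024-05-01T09:00:00Z,Mozilla/5.0,203.0.113.9,,US,mobile,https://example.com/landing,paid-social\n"
)
print(len(load_clicks(sample)))  # -> 1
```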
Step 3: Remove known non-human user agents
This is the low-hanging fruit. Filter out user agents tied to:
- Email scanners
- Uptime monitors
- Known crawlers
- Social preview bots
- Messaging app fetchers
- Antivirus link checkers
You do not need perfection here. Even a basic exclusion list can remove a large chunk of fake clicks.
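A substring check against the user-agent string catches most of this. The marker list below is a starting point, not a definitive registry; grow it from whatever actually shows up in your own logs.

```python
# Substrings commonly found in non-human user agents. Extend from your logs.
BOT_UA_MARKERS = (
    "bot", "crawler", "spider", "preview",
    "slackbot", "facebookexternalhit", "whatsapp",
    "headless", "monitor", "validator",
)

def looks_human(user_agent: str) -> bool:
    """Return False for empty UAs or UAs containing a known bot marker."""
    ua = user_agent.lower()
    return bool(ua) and not any(marker in ua for marker in BOT_UA_MARKERS)

print(looks_human("Slackbot-LinkExpanding 1.0"))               # -> False
print(looks_human("Mozilla/5.0 (iPhone; CPU iPhone OS 17_0)"))  # -> True
```

Treating an empty user agent as non-human is a judgment call; genuine browsers almost always send one, while scripts often do not.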
Step 4: Flag impossible behavior
Machines often behave in ways people do not. Look for patterns like:
- Multiple clicks from the same IP and user agent within a few seconds
- Clicks at odd scale from one network block
- Zero referrer plus very high volume in a tiny time window
- Clicks that hit many campaign links in sequence
- Traffic from countries your campaign was not targeting
A real person might click twice. A scanner might click 40 links in one burst.
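That burst pattern is easy to detect mechanically: flag any IP-plus-user-agent pair that hits more than a handful of distinct links inside a short window. The thresholds below (5 links, 60 seconds) are assumptions to tune, and the tuple-based input format is just a sketch.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def flag_burst_sources(clicks, max_links=5, window_seconds=60):
    """Return (ip, user_agent) pairs that hit more than max_links distinct
    short links within window_seconds -- classic scanner behavior.
    clicks: iterable of (iso_timestamp, ip, user_agent, link) tuples."""
    by_source = defaultdict(list)
    for ts, ip, ua, link in clicks:
        by_source[(ip, ua)].append((datetime.fromisoformat(ts), link))
    flagged = set()
    window = timedelta(seconds=window_seconds)
    for source, events in by_source.items():
        events.sort()
        for i, (start, _) in enumerate(events):
            # Distinct links this source touched within the window after `start`.
            links = {link for t, link in events[i:] if t - start <= window}
            if len(links) > max_links:
                flagged.add(source)
                break
    return flagged

scanner_hits = [
    (f"2024-05-01T10:00:0{i}", "198.51.100.7", "scan-ua", f"link-{i}")
    for i in range(6)
]
print(flag_burst_sources(scanner_hits))  # flags ('198.51.100.7', 'scan-ua')
```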
Step 5: Create a time-based deduplication rule
This one helps more than most marketers expect.
If the same IP and user agent hit the same short link within, say, 5 to 10 seconds, count it once for your human-only metric. Keep the raw total too, but do not treat every one of those hits as a separate person.
This rule catches refreshes, scanner retries, and prefetch-plus-open sequences.
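The dedup rule is a few lines of code once the log is sorted by time. A sketch, assuming the same tuple format as the raw export and a 10-second window:

```python
from datetime import datetime, timedelta

def dedupe_clicks(clicks, window_seconds=10):
    """Count one click per (ip, user_agent, link) within window_seconds.
    clicks: iterable of (iso_timestamp, ip, user_agent, link) tuples,
    assumed sorted by timestamp. Returns the kept (human-only) hits."""
    last_seen = {}
    kept = []
    window = timedelta(seconds=window_seconds)
    for ts, ip, ua, link in clicks:
        t = datetime.fromisoformat(ts)
        key = (ip, ua, link)
        if key not in last_seen or t - last_seen[key] > window:
            kept.append((ts, ip, ua, link))
        # Rolling window: a scanner retrying every few seconds collapses to one.
        last_seen[key] = t
    return kept

hits = [
    ("2024-05-01T09:00:00", "1.2.3.4", "ua", "lnk"),  # kept
    ("2024-05-01T09:00:03", "1.2.3.4", "ua", "lnk"),  # refresh, suppressed
    ("2024-05-01T09:00:30", "1.2.3.4", "ua", "lnk"),  # kept (outside window)
]
print(len(dedupe_clicks(hits)))  # -> 2
```

The rolling window is deliberate: a prefetch followed two seconds later by the real open counts once, which is exactly what you want for a human-only metric.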
Step 6: Compare against landed visits in GA4
Now line up your cleaned click count next to GA4 sessions or users for the same landing page and campaign tags.
You are not looking for a perfect 1:1 match. You are looking for a believable relationship. For many campaigns, human-only clicks should sit above GA4 landed visits, but not wildly above them.
If raw clicks are 4,000, human-only clicks are 1,900, and GA4 shows 1,300 users, that story makes sense. If raw clicks are 4,000 and human-only clicks are still 3,900 while GA4 is 1,300, your filter rules are too weak.
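That sanity check can be codified so the same test runs every reporting cycle. The thresholds here (filters should remove at least ~5% of raw clicks, and cleaned clicks should rarely exceed double the GA4 count) are assumptions, not industry standards; tune them per channel.

```python
def filter_health(raw_clicks: int, human_clicks: int, ga4_users: int) -> str:
    """Rough plausibility check on the raw -> human-only -> GA4 funnel."""
    if raw_clicks <= 0 or human_clicks > raw_clicks or ga4_users < 0:
        return "invalid inputs"
    removed = 1 - human_clicks / raw_clicks
    # Barely anything filtered, yet far above GA4: rules are too permissive.
    if removed < 0.05 and human_clicks > 2 * ga4_users:
        return "filters likely too weak"
    # Cleaned clicks below tracked landings: rules are cutting real people.
    if human_clicks < ga4_users:
        return "filters likely too strict"
    return "plausible"

print(filter_health(4000, 1900, 1300))  # -> plausible
print(filter_health(4000, 3900, 1300))  # -> filters likely too weak
```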
Rules of thumb for common channels
Email
Email is a magnet for security scanning. If your click spike happens right after send time and before most humans would reasonably open, be suspicious. Corporate inboxes are especially noisy.
Paid social
In-app browsers, prefetching, and privacy restrictions can all distort the path from click to session. Expect a gap. Watch for bursts from platform-related infrastructure rather than broad user behavior.
Messaging apps
Slack, Teams, iMessage, and similar apps often create previews. A shared link can rack up “clicks” before anyone really visits.
Affiliate or partner traffic
This one needs extra care. Some inflation may come from quality issues, not just preview bots. If one source has a giant raw-to-human drop compared with others, dig deeper.
What to put in your stakeholder report
This is where people usually get stuck. The answer is simple. Show all three numbers, but label them clearly.
- Raw clicks: all redirect hits recorded by the shortener
- Human-only clicks: cleaned estimate after bot, preview, and duplicate filtering
- Tracked landings: GA4 users or sessions that reached the site and were measurable
Then add one sentence of plain English:
“Raw click totals include scanners, previews, and duplicate requests. Human-only clicks are our best estimate of real interest. GA4 shows the subset that reached the site and allowed analytics tracking.”
That sentence can save a lot of meeting time.
What not to do
Do not use raw shortener clicks as your main success metric
They are useful, but too noisy on their own.
Do not assume GA4 is always the “truth” either
GA4 can undercount because of consent settings, blockers, browser limits, and broken tags.
Do not change your rules every week
Pick a sensible filtering method and stick with it. Consistency matters more than chasing tiny improvements.
A simple framework you can reuse across platforms
Whether you use Redirect My… or another shortener, the framework is basically the same:
- Standardize UTMs and destination URLs
- Pull raw click data
- Exclude known bot and preview user agents
- Deduplicate rapid repeat hits
- Flag suspicious source patterns
- Compare cleaned clicks to GA4 landed visits
- Report raw, human-only, and landed numbers separately
That last point matters most. Separate metrics reduce arguments because each number has a job.
At a Glance: Comparison
| Feature/Aspect | Details | Verdict |
|---|---|---|
| Raw shortener clicks | Counts every redirect hit, including bots, previews, scanners, and humans. | Useful for volume, not reliable as a standalone performance metric. |
| Human-only clicks | Filtered and deduplicated clicks that aim to estimate real human interest. | Best middle-ground number for campaign reporting. |
| GA4 users or sessions | Shows visits that reached the site and successfully triggered analytics. | Best for on-site behavior, but can undercount due to blockers and consent. |
Conclusion
Marketing forums and support queues are full of confused threads from people trying to figure out why their shortener says one thing and GA4 says another. That confusion is real, especially in paid social and email where previews and security scanners can hammer a URL before any person sees the page. The good news is you do not have to choose one tool and declare the other wrong. A practical human-only click metric gives you a cleaner way to judge campaign quality, defend your numbers in stakeholder meetings, and avoid turning off good campaigns because a noisy metric made them look broken. If you tag links consistently, apply a few sensible filters, and normalize reporting across Redirect My… and your analytics stack, you can build a repeatable system in an afternoon. No new software. No data team. Just better numbers and fewer awkward reporting conversations.