By SearchScore Team · 27 March 2026 · 8 min read

Your Click Signals Are Now Your AI Citation Signals

Google has always used click behaviour to rank pages. New evidence shows those same signals now flow into AI-generated answers. If your pages aren't satisfying users after the click, they are being quietly filtered out of AI recommendations too.

For years, SEOs debated whether Google actually used click data to influence rankings. Google publicly denied it. The debate felt academic - interesting for practitioners but ultimately unresolved. Two events changed that in the past 12 months: a leaked batch of Google API documentation, and sworn testimony in the US Department of Justice antitrust trial against Google.

Both confirmed the same thing: Google tracks user behaviour after a search result is clicked, and that behaviour influences which pages rank. But the more significant implication - surfaced by @CyrusShepard at Loganix in detailed research published in March 2026 - is that those same click signals now flow through a mechanism called RankEmbedBERT directly into the AI answers that ChatGPT, Perplexity, and Google's own AI Overviews generate.

Your click signals are your AI citation signals. They have been for some time. Most marketers just don't know it yet.

What the Google API Leak Revealed

In mid-2024, a batch of internal Google API documentation leaked publicly. Among the thousands of variable names and module descriptions, three stood out for anyone paying attention to click-based ranking signals:

- badClicks - clicks where the user quickly abandons the page and returns to the search results
- goodClicks - clicks where the user stays and engages, indicating the page satisfied the query
- lastLongestClicks - the final, longest click of a search session, marking the page that ended the search

The existence of these variables confirmed what many in the SEO community had long suspected: Google is not just counting clicks. It is evaluating the quality of those clicks based on what happens after the user arrives on the page. A click that sends the user straight back to Google is a bad signal. A click that ends the search session entirely is a very good one.
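As a mental model only, the three leaked variable names suggest a bucketing scheme along these lines. Google's actual cutoffs are unknown; every threshold below is an illustrative guess, not a reconstruction of the real system:

```python
from dataclasses import dataclass

@dataclass
class Click:
    dwell_seconds: float    # time on page after the click
    returned_to_serp: bool  # did the user bounce back to the results page?
    ended_session: bool     # was this the last click of the search session?

def classify_click(click: Click) -> str:
    """Hypothetical good/bad click bucketing.

    The leaked variable names (goodClicks, badClicks, lastLongestClicks)
    reveal the buckets, not the cutoffs - the numbers here are guesses.
    """
    if click.ended_session and click.dwell_seconds >= 120:
        return "lastLongestClick"  # the search ended here: strongest positive signal
    if click.returned_to_serp and click.dwell_seconds < 10:
        return "badClick"          # pogo-stick straight back to Google
    if click.dwell_seconds >= 60:
        return "goodClick"
    return "neutral"
```

The point of the sketch is the ordering, not the numbers: a session-ending long click outranks an ordinary good click, and a near-instant SERP return is actively negative rather than merely neutral.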

Research credit: the click signal framework in this analysis was surfaced by @CyrusShepard at Loganix. SearchScore's data on AI citability outcomes is layered on top.

There is also a fourth variable worth noting: unicornClicks. The definition is not public, but context within the leaked documentation suggests Google segments users into trust tiers, with "unicorn" representing a high-trust user whose click behaviour carries elevated weight. The implication is that not all clicks are equal - a click from a seasoned, high-activity Google user may carry more signal value than one from a low-engagement account.

The ABC Framework: Anchors, Body, Clicks

The leak provided internal variable names. The US v. Google antitrust trial provided something more powerful: sworn internal testimony.

During the trial, Google engineers and executives testified about how the core ranking system actually works. One acronym emerged that crystallised the whole framework: ABC - Anchors, Body, Clicks.

All three carry roughly equal weight in Google's core ranking calculation. Clicks are not a minor tweak or a tiebreaker - they are one of three pillars of the entire ranking architecture, sitting alongside the foundational signals that SEOs have optimised for since the beginning.

This also aligns with a long-standing Google patent titled "Modifying search result ranking based on implicit user feedback" - which described exactly this mechanism in theoretical terms years before the trial confirmed it was in active use at scale.

"Clicks aren't just a ranking signal. They're one of three primary ranking pillars - Anchors, Body, Clicks - confirmed under oath in the US v. Google antitrust trial."

RankEmbedBERT: The Bridge to AI Answers

Understanding that Google uses click signals for ranking is important but not new knowledge for most SEOs. The more significant finding is how those signals travel from search rankings into AI-generated answers.

The mechanism is RankEmbedBERT: a Google algorithm that combines click-based ranking signals with BERT-style language model embeddings. In practical terms, RankEmbedBERT takes the satisfaction signals associated with a page (good clicks, long dwell time, end-of-session behaviour) and uses them to weight that page's embedding representation within the model.

Pages with strong post-click satisfaction profiles receive higher embedding weights. Higher embedding weights mean greater probability of being drawn on when an AI system generates an answer to a related query.
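RankEmbedBERT's internals are not public, but the principle just described can be sketched: treat satisfaction as a multiplier on embedding-based relevance, so that of two equally relevant pages, the one with the better click profile surfaces first. Everything here - the function, the weighting scheme - is an assumption for illustration, not Google's implementation:

```python
import numpy as np

def satisfaction_weighted_scores(query_emb, page_embs, satisfaction):
    """Sketch of satisfaction-weighted retrieval.

    Cosine similarity between the query and each page embedding is scaled
    by a per-page satisfaction weight in [0, 1]. RankEmbedBERT's real
    weighting scheme is not public; this only illustrates the principle
    that pages with satisfied clicks win ties.
    """
    query_emb = query_emb / np.linalg.norm(query_emb)
    page_embs = page_embs / np.linalg.norm(page_embs, axis=1, keepdims=True)
    relevance = page_embs @ query_emb   # cosine similarity per page
    return relevance * satisfaction    # satisfaction acts as a multiplier

# Two equally relevant pages; the one with better click signals scores higher.
q = np.array([1.0, 0.0])
pages = np.array([[1.0, 0.0], [1.0, 0.0]])
scores = satisfaction_weighted_scores(q, pages, np.array([0.9, 0.3]))
```

The design choice worth noticing: because satisfaction multiplies relevance rather than adding to it, a page with zero post-click satisfaction cannot be rescued by topical relevance alone.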

This is the link that most marketing teams are missing. They think about AI visibility as a content problem - "is my content good enough for AI to cite?" But the deeper question is whether the people who find your content are actually satisfied by it. The AI doesn't just care whether your content exists. It cares whether your content works.

ChatGPT, when browsing the web, effectively defers to Google's ranked view of the web as a proxy for quality. That ranked view is shaped in significant part by click satisfaction signals. So those signals travel all the way from a user's browser behaviour into a ChatGPT response recommending your competitor instead of you.

Post-Click Satisfaction: The Signal That Actually Matters

Raw CTR - the percentage of users who click your result in the first place - is a limited signal. Two pages can have identical click-through rates but completely different satisfaction profiles. What Google measures, and what ultimately matters for both search rankings and AI citation probability, is post-click behaviour:

Signal | What it tells Google | AI visibility impact
Long dwell time | The page held the user's attention | Strong positive
No SERP return | The user's query was resolved | Strong positive
No query reformulation | The user didn't need to refine their search | Positive
Immediate bounce back | The page didn't answer the query | Strong negative
Paywall bounce | Content was inaccessible - query unresolved | Negative
Rapid query reformulation | User had to try again - page failed them | Negative

Paywalls deserve specific attention here. A page that ranks well but immediately presents a paywall or aggressive interstitial is highly likely to generate a bad click signal - the user sees a gate, bounces back to the SERP, and reformulates. That bounce is a direct negative input into the click signal model. Paywalled content may appear in search results but is quietly accumulating evidence that it doesn't satisfy user intent - and that evidence flows into AI citation probability via RankEmbedBERT.

18 vs 71: the average AI citability score for pages in the bottom quartile for user satisfaction signals (18/100) versus pages passing all post-click satisfaction checks (71/100), across 700,000+ sites scored by SearchScore.

This data from SearchScore's audit database illustrates the scale of the gap. It's not a marginal difference. A fourfold improvement in AI citability is the difference between being invisible to AI answers and being regularly cited. The pages that score well are not necessarily the best-written or most authoritative - they are the ones that users actually find satisfying to read. That is a different optimisation target than most content teams are currently working towards.

What Are unicornClicks - and Why Do They Matter?

The unicornClicks variable in the API leak has attracted less attention than badClicks and goodClicks, but it may be the most revealing of the leaked click variables. The variable name suggests Google segments users into trust tiers, with "unicorn" representing some form of high-trust or high-signal user classification.

The exact definition is not public. But the implication is significant: not all clicks carry the same weight. A click from a user Google classifies as a unicorn - presumably someone with a strong search history, genuine intent signals, and a track record of engagement - may carry more ranking weight than a click from a low-activity or ambiguous account.

This matters for AI visibility because it suggests that the quality of engagement - not just the quantity - is what drives the click signal model. A handful of deeply satisfying visits from genuinely engaged users may outperform thousands of low-engagement clicks. For content strategy, this reinforces the case for depth and specificity over breadth and volume.

What This Means for Your AI Visibility Strategy

The practical implications run across your entire marketing and engineering stack - not just SEO. Here is where to focus:

1. Audit your post-click experience

Review your bounce rates, dwell times, and scroll depth by page. Pages with high bounce rates and short sessions are accumulating bad click signals - and those signals are suppressing both search rankings and AI citation probability. Investigate whether the issue is content quality, page speed, or a mismatch between the search intent your page attracts and the content it delivers.
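A first pass at this audit can be automated. The sketch below flags pages that combine a high bounce rate with short sessions; the field names and thresholds are placeholders to adapt to whatever your analytics export (GA4, Matomo, or server logs) actually provides:

```python
# Flag pages whose analytics profile suggests accumulating bad click signals.
# Thresholds are illustrative starting points, not Google's numbers.

def flag_risky_pages(pages, max_bounce=0.70, min_avg_dwell=30.0):
    """Return URLs with both a high bounce rate and short average sessions."""
    return [
        p["url"]
        for p in pages
        if p["bounce_rate"] > max_bounce and p["avg_dwell_seconds"] < min_avg_dwell
    ]

report = flag_risky_pages([
    {"url": "/pricing", "bounce_rate": 0.82, "avg_dwell_seconds": 11.0},
    {"url": "/guide/ai-visibility", "bounce_rate": 0.35, "avg_dwell_seconds": 140.0},
])
```

Requiring both conditions matters: a high bounce rate with long dwell can simply mean the page fully answered the query, which is a good signal, not a bad one.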

2. Remove or rethink paywall gates

If your highest-traffic content sits behind a paywall or aggressive email gate, you have a structural post-click satisfaction problem. Users who cannot access the content signal dissatisfaction. Consider whether ungating your most valuable informational content and monetising differently (via lead generation, upsell funnels, or brand authority) outweighs the direct revenue from gating.

3. Match content to the specific search intent that attracted the click

A user who searches for a specific answer and lands on a general overview article will bounce. Aligning your content precisely to the query intent that brings users to the page reduces pogo-sticking and increases dwell time. This is more granular than traditional keyword targeting - it requires reviewing actual queries in Search Console and ensuring the content directly addresses them.
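As a crude first-pass filter - not a substitute for actually reading the queries - you can score how much of each Search Console query a page covers. The intent_overlap helper below is a hypothetical illustration using simple term overlap; real intent matching needs embeddings or manual review:

```python
import re

def intent_overlap(query: str, page_text: str) -> float:
    """Share of query terms that appear anywhere in the page text.

    A low score flags a likely intent mismatch (pogo-stick risk); a high
    score does not guarantee a match, so treat this as a triage filter only.
    """
    terms = set(re.findall(r"[a-z0-9]+", query.lower()))
    words = set(re.findall(r"[a-z0-9]+", page_text.lower()))
    return len(terms & words) / len(terms) if terms else 0.0

# A query whose terms barely appear on the page is a pogo-stick risk.
score = intent_overlap("fix cumulative layout shift wordpress",
                       "General overview of web performance metrics and tooling.")
```

Pages scoring near zero against their highest-impression queries are the first candidates for rewriting or retargeting.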

4. Improve page performance for sustained engagement

A slow page is a shortened session. Core Web Vitals affect not just ranking but the probability that a user stays long enough to generate a positive click signal. Prioritise Largest Contentful Paint under 2.5 seconds and Cumulative Layout Shift under 0.1 for every page that carries meaningful organic traffic.
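Those two thresholds are Google's published "good" ranges, which makes them easy to turn into a pass/fail check against your field data (CrUX or your own RUM collection). A minimal sketch:

```python
# Google's published "good" thresholds for the two Core Web Vitals named
# above. Metric values would come from field data (e.g. the CrUX dataset)
# or your own real-user monitoring.

CWV_GOOD = {"lcp_seconds": 2.5, "cls": 0.1}

def passes_cwv(lcp_seconds: float, cls: float) -> bool:
    """True when both LCP and CLS are within Google's 'good' range."""
    return lcp_seconds <= CWV_GOOD["lcp_seconds"] and cls <= CWV_GOOD["cls"]
```

Run this against every page carrying meaningful organic traffic; a single failing metric is enough to shorten sessions and weaken the click signal.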

5. Structure content for clarity, not just length

Walls of unbroken text increase scroll abandonment. Clear headings, short paragraphs, bullet lists for scannable information, and an explicit answer to the question early in the page all contribute to the kind of reading behaviour that generates strong dwell signals. This is not just good UX - it is now directly connected to AI citation probability via the click signal chain.

Key implication for engineering teams: Click signals are not just an SEO or content concern. Page speed, JavaScript render performance, paywall logic, interstitial placement, and scroll depth tracking all directly affect the click signals your pages generate. This is now an engineering priority, not just a content one.

Check your AI citation signals now

SearchScore scores your site across six AI visibility categories - including ai_citability and platform_opt, the two most directly connected to post-click satisfaction. You'll see exactly where your signals are weak and what to fix first. Free, takes 30 seconds.

Run your free SearchScore audit →

Frequently Asked Questions

What are Google click signals and do they affect AI?

Google click signals are behavioural data points collected when users interact with search results. The Google API leak exposed variable names including badClicks, goodClicks, and lastLongestClicks, confirming that Google tracks not just whether a user clicks a result, but how satisfied they were after the click. These signals influence search rankings - and via the RankEmbedBERT algorithm, they now appear to influence AI-generated answers as well, since AI systems like ChatGPT draw on Google-ranked content as a source of ground truth for what constitutes a quality page.

What is RankEmbedBERT?

RankEmbedBERT is a Google algorithm that combines click-based ranking signals with BERT-style language model embeddings. It acts as the bridge between traditional search signals (including user click behaviour) and the AI systems that generate answers. Pages that receive strong post-click satisfaction signals - long dwell time, no immediate return to the SERP, no follow-up query reformulation - are weighted more heavily in the embeddings that AI models draw from when generating answers.

How does dwell time affect AI citations?

Dwell time - the amount of time a user spends on a page after clicking a search result - is one of the key post-click satisfaction signals Google measures. Long dwell time signals that the page answered the user's question. Short dwell time (the pogo-stick pattern) signals the opposite. Via RankEmbedBERT, pages with strong dwell time signals receive higher embedding weights, making them more likely to be cited in AI-generated answers. Paywalls, thin content, and slow page loads all reduce dwell time and therefore hurt AI citation potential.

How do I check my AI citation signals?

Run a free audit at SearchScore. It scores your site 0-100 across six AI visibility categories, including ai_citability and platform_opt - the two categories most directly connected to post-click satisfaction signals. You'll receive a prioritised fix list showing exactly which signals are weak and what to do about them. The whole audit takes 30 seconds.
