The AI Visibility Gap Is Getting Wider
866,000+ websites audited. Average score: 34/100. The more of the web we scan, the worse the picture gets.
In March we audited 350,000 websites. Now we have 866,000. The headline average has dropped from 41.4 to 34. This is not a scoring change – it is the long tail of the web being measured for the first time. Most websites are not built to be found by AI. The gap between those that are and those that are not is structural, and it is widening.
Good SEO Does Not Mean Good AI Visibility. The Data Proves It.
- 866,000+ websites audited – 2.5x the March dataset. Average score fell from 41.4 to 34/100.
- 74.2% are Invisible or Low Visibility. Only 0.2% are AI-Ready.
- Technical scores average 70.1. AI Platform Readiness averages 34.1. Sites are technically healthy but AI-blind.
- WSJ scores 12. Reddit scores 13. A CBD oil retailer in Scotland scores 74. Size is not the advantage – structure is.
What is AI search visibility? AI engines synthesise one answer from sources they trust. They do not rank pages – they select and cite sources. AI visibility measures your likelihood of being selected. The signals are different from traditional SEO. Most businesses have never measured them.
More Data, Worse Picture. The Long Tail Is AI-Blind.
We more than doubled our dataset in 30 days. The average score dropped 7 points. This is what the real web looks like.
The average score dropped from 41.4 in March to 34 in April. This reflects the real web, not a scoring change. As we audit more sites, we reach deeper into the long tail – where most businesses live and most AI-readiness work has never been done.
The businesses you compete with for customers are also competing for AI citations. Most of them are invisible. The first to fix this wins their category.
Sites Are Technically Healthy. But AI-Blind.
Technical foundations average 70.1/100. AI Platform Readiness averages 34.1. A 36-point gap between "working website" and "AI-visible website".
- Technical average: 70.1/100. Sites load fast, have sitemaps, canonical tags, HTTPS, and clean crawl paths. The web is technically competent.
- AI Platform Readiness average: 34.1/100. Most sites lack IndexNow, Bing verification, answer-first content patterns, and AI-specific configuration signals.
- The gap: 36 points between "technically healthy" and "AI-visible". Good SEO is necessary but not sufficient for AI search visibility.
Businesses that invested in technical SEO are not automatically ahead. AI visibility requires a different layer of optimisation. The technical foundation is there – the AI signals are not.
AI Cannot Understand What Most Businesses Do.
Structured data averages 23.1/100 across the dataset. It is the lowest-scoring category – and the most fixable.
- Average structured data score: 23.1/100. Structured data is how AI identifies what a business does, who it serves, and why it should be trusted. Most sites provide none of this context in machine-readable form.
- Missing Organisation schema: roughly 9 in 10 websites. Organisation schema is the basic identifier that tells AI who you are – name, type, location, services.
- Time to fix: hours, not weeks. Adding JSON-LD schema markup is the highest-leverage action available to most businesses. Yet it remains undone.
Structured data is the difference between AI guessing what you do and AI knowing what you do. At 23.1/100, most sites are forcing AI to guess.
Structured data is the single highest-leverage fix available to most websites. It is technically simple, takes hours to implement, and directly determines whether AI can understand and recommend your business.
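As an illustration, a minimal Organisation schema for a hypothetical business (every name, URL, and address here is a placeholder) can be embedded in any page's `<head>`:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Widgets Ltd",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "description": "UK retailer of handmade widgets for small businesses.",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Edinburgh",
    "addressCountry": "GB"
  },
  "sameAs": [
    "https://www.linkedin.com/company/example-widgets"
  ]
}
</script>
```

Run the markup through a schema validator before deploying – a single syntax error makes the whole block unreadable to parsers.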
A Small Business in Scotland Outscores the Wall Street Journal.
AI visibility is determined by structure, not scale. Brand recognition, audience size, and domain authority are irrelevant to AI citation decisions.
cbdoilscotland.com is a small UK retailer with a fraction of the traffic and brand recognition of the sites it outscores. Its AI visibility advantage is structural – proper schema markup, E-E-A-T content signals, and AI crawler access – not budget-driven.
AI search does not know or care that the Wall Street Journal is famous. It cares whether the site provides machine-readable evidence of what it does and why it should be trusted. Structure wins.
Eight Signals Determine AI Visibility
SearchScore evaluates each website across 130+ checks in 8 weighted categories. Scores reflect the criteria AI engines use when selecting sources to cite.
- AI Citability – Avg: 58.8. How directly your content answers AI queries. Includes llms.txt, bot access, citations and quotable statistics.
- Brand Authority – Avg: 32.5. External signals establishing your entity as credible. Wikipedia, LinkedIn, social presence, reviews.
- E-E-A-T Content – Avg: 36.9. Evidence of experience, expertise, authoritativeness and trust. Author bios, bylines, contact info, fact sources.
- Technical Foundation – Avg: 70.1. The strongest category. Crawlability, HTTPS, sitemaps, canonical tags – most sites do this well.
- Structured Data – Avg: 23.1. The weakest category. JSON-LD schema that identifies who you are and what you do. Most sites have none.
- Platform Optimisation – Avg: 46.5. OpenGraph, Twitter Cards, RSS feed, video content. How well your content travels across platforms.
- Topical Authority – Avg: 51.5. Content hub depth, internal linking, structured headings. Signals that establish subject matter expertise.
- AI Platform Readiness – Avg: 34.1. IndexNow, Bing verification, answer-first content, Perplexity access. Direct AI platform integration signals.
What's Changed in 30 Days
The overall average has dropped as we've audited more of the long tail. But within the sites we track consistently, we're seeing movement.
As AI search adoption grows, the cost of AI invisibility increases. With every month that passes, the gap between early movers and the rest compounds further.
Reputation Does Not Transfer to AI
Globally recognised brands in our April dataset. Scale, budget and domain authority provide no advantage in AI search visibility.
| Brand | Score | Tier | Category |
|---|---|---|---|
| WSJ.com | 12 | Invisible | News & Media |
| The Sun (thesun.co.uk) | 12 | Invisible | News & Media |
| Reddit.com | 13 | Invisible | Social Platform |
| Betfair.com | 15 | Invisible | Betting & Gaming |
| Lady Gaga (ladygaga.com) | 19 | Invisible | Artist / Entertainment |
| Fiverr.com | 20 | Invisible | Marketplace / SaaS |
| Sky Bet (skybet.com) | 22 | Invisible | Betting & Gaming |
| cbdoilscotland.com | 74 | Strong | Health / Retail (UK) |
Scores reflect each website's AI visibility signals at time of audit. Scores change as sites are updated. cbdoilscotland.com is included to demonstrate that small businesses with well-structured sites routinely outperform globally recognised brands.
Five Things to Do Now
Based on the most common gaps across 866,000+ audited websites.
Most businesses do not know where they stand. Run a free audit at searchscore.io. Benchmark against your industry average. You cannot fix what you have not measured.
JSON-LD structured data is the single highest-leverage change available to most sites. It tells AI exactly who you are, what you do, where you are, and why you should be trusted. Average structured data score: 23.1. Fix this first.
Place a plain-text file at yourdomain.com/llms.txt listing your site's name, purpose, and key pages. It is an emerging convention for AI engine discovery. Takes 20 minutes. Adoption remains under 25%.
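A minimal llms.txt, following the markdown-based format proposed at llmstxt.org (the business, URLs, and page list below are illustrative):

```markdown
# Example Widgets Ltd

> UK retailer of handmade widgets for small businesses, shipping nationwide.

## Key pages

- [Products](https://www.example.com/products): Full catalogue with pricing
- [About](https://www.example.com/about): Company history and accreditations
- [FAQ](https://www.example.com/faq): Answers to common customer questions
```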
Verify GPTBot, ClaudeBot, PerplexityBot, and Google-Extended are not blocked in your robots.txt. Legacy bot-blocking rules frequently exclude AI crawlers unintentionally. A blocked crawler cannot cite your site.
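One way to check this locally, sketched in Python using only the standard library's `urllib.robotparser` (the robots.txt content below is a made-up example of a legacy rule set):

```python
from urllib.robotparser import RobotFileParser

# Major AI crawler user-agent tokens as documented by their vendors.
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def blocked_ai_bots(robots_txt: str, url: str = "https://example.com/") -> list[str]:
    """Return the AI crawlers that this robots.txt blocks from fetching `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, url)]

# A legacy rule set that singles out GPTBot but leaves everything else open:
legacy_rules = """
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

print(blocked_ai_bots(legacy_rules))  # GPTBot is the only blocked crawler
```

Named user-agent groups take precedence over the wildcard `*` group, so a crawler with its own explicit rules ignores the blanket defaults entirely.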
AI systems extract from the first clear, direct sentence. Add a one or two-sentence definition immediately after your H1 on service and product pages. "SearchScore is a tool that audits websites for AI search visibility." That structure is what AI selects and cites.
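In HTML terms, the answer-first pattern is simply a direct definition as the first paragraph after the heading (the business name and wording here are placeholders):

```html
<h1>AI Search Visibility Audits</h1>
<p>Example Audits is a tool that checks websites for AI search visibility
and scores them against the signals AI engines use to select sources.</p>
```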
Key Findings & Quotable Stats
Copy-ready soundbites for press, presentations, and social media.
The core finding: AI search visibility is an unaddressed gap across every industry. The businesses that fix this in 2026 will hold citation advantages that compound for years.
Methodology & Citation
About the SAVI Report
The SAVI (State of AI Visibility Index) Report is published monthly by SearchScore. This second edition analyses 866,301 websites audited between January and April 2026. Sites are submitted organically by users of the free SearchScore tool, providing a real-world cross-section of the web rather than a curated sample.
Each site is evaluated across 130+ signals in 8 categories: AI Citability (25%), Brand Authority (20%), E-E-A-T Content (20%), Technical Foundation (15%), Structured Data (10%), Platform Optimisation (8%), Topical Authority (10%), and AI Platform Readiness (12%). Scores reflect the criteria AI engines use when selecting sources to cite, not Google ranking factors.
The average score dropped from 41.4 in the March edition to 34 in April. This reflects the expansion of the dataset into the long tail of the web, not a change in scoring methodology. Scores for the same sites are consistent across editions.
Full methodology: searchscore.io/methodology/
Get your industry's full breakdown
We'll send you the detailed AI visibility data for your specific industry – avg scores, top performers, quick wins, and where your sector is falling behind.
No spam. We'll send your report within 24 hours.