
Why Your Website Isn't Showing Up in ChatGPT (And How to Fix It)

Based on data from 700,000+ website audits, the average AI search visibility score is 34 out of 100. The majority of websites are invisible to ChatGPT, Perplexity and Google AI Overviews - and most business owners have no idea. Here are the six specific reasons why, and what to do about each one.

Key Finding

In our analysis of 700,000+ websites, the average AI visibility score is just 34/100. Six fixable technical and content issues account for the vast majority of poor AI search visibility - and most businesses have at least three of them.

34 - average AI visibility score out of 100 across 700k+ audits
700k+ - websites analysed by SearchScore
6 - fixable reasons most sites are invisible to AI search

Why most websites are invisible without knowing it

When ChatGPT answers a question, it doesn't search the internet in real time the way Google does. It uses a combination of its training data and - for tools like ChatGPT Search and Perplexity - live web crawlers that read websites and decide which sources to cite.

Both processes have gatekeeping criteria. Websites that don't meet them get skipped. The problem is that these criteria are different from the ones Google uses to rank pages. A website can be technically excellent for Google - fast, mobile-optimised, rich in links - and still be completely invisible to AI tools because of a handful of missing signals that Google doesn't care about but AI does.

Most business owners assume that if they rank on Google, they're doing fine. That assumption is now dangerously incomplete. AI search and traditional search are separate systems with separate requirements.

The 6 reasons your website isn't showing up in ChatGPT

Reason 1

AI crawlers are blocked in your robots.txt file

Your website has a file at yoursite.com/robots.txt that instructs automated visitors - including search bots - what they're allowed to access. Many websites were configured years ago with broad restrictions that block all automated visitors, or were set up by developers following old security guidelines.

The result: GPTBot (ChatGPT's crawler), PerplexityBot, and Google-Extended (for AI Overviews) simply cannot read your website. If that's the case, nothing else matters - a site the crawler cannot read cannot appear in AI search results.

In our audit data, a significant proportion of business websites have at least one major AI crawler blocked. Most have no idea this setting exists.

The Fix

Open your robots.txt file in a browser. Look for Disallow: / under User-agent: * or explicit rules blocking GPTBot, PerplexityBot, or Google-Extended. Ask your developer to add explicit allow rules for these crawlers. This takes under an hour and has immediate impact.
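If you'd rather check programmatically, Python's standard-library robotparser evaluates the same rules crawlers follow. A minimal sketch - the robots.txt content below is a hypothetical example of a site that blocks GPTBot while allowing everything else:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: general crawlers are allowed,
# but GPTBot (ChatGPT's crawler) is blocked entirely.
robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# GPTBot matches its own rule group and is shut out of the whole site.
print(rp.can_fetch("GPTBot", "https://example.com/"))         # False
# PerplexityBot falls back to the "User-agent: *" group.
print(rp.can_fetch("PerplexityBot", "https://example.com/"))  # True
```

To run this against a live site, point the parser at the real file with rp.set_url("https://yoursite.com/robots.txt") followed by rp.read() before calling can_fetch.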

Reason 2

You have no structured data (schema markup)

Humans read your homepage and understand what you do within seconds. AI systems need machine-readable labels to reach the same understanding with confidence. These labels are called structured data or schema markup - they're code snippets embedded in your website that explicitly declare your business name, type, address, phone number, opening hours, services, and more.

Without schema markup, an AI tool reading your website must infer all these facts from your text alone - a process that's unreliable and frequently results in incomplete or incorrect representation. With schema markup, there's no ambiguity. The data is machine-readable and directly consumable.

For most businesses, LocalBusiness schema is the priority. For e-commerce, Product schema. For content sites, Article schema. The right type depends on your business, but the absence of any schema at all is one of the most consistent findings in low-scoring SearchScore audits.

The Fix

Use Google's Rich Results Test (search.google.com/test/rich-results) to check what schema, if any, your site currently has. Then implement LocalBusiness schema at minimum, with your name, address, phone, hours, geo coordinates, and service area. A developer can add this in a few hours. Use a JSON-LD block in the <head> of each relevant page.
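As an illustration, a minimal LocalBusiness JSON-LD block might look like this. Every value below - name, address, phone, coordinates, hours - is a hypothetical placeholder to replace with your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Harrison Consulting Ltd",
  "description": "Accountancy services for small businesses in Manchester.",
  "url": "https://www.example.com",
  "telephone": "+44 161 496 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Manchester",
    "postalCode": "M1 1AA",
    "addressCountry": "GB"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 53.4808,
    "longitude": -2.2426
  },
  "openingHours": "Mo-Fr 09:00-17:30",
  "areaServed": "Greater Manchester"
}
</script>
```

After adding the block, re-run the Rich Results Test to confirm the markup is detected and free of warnings.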

Reason 3

Your business has weak entity signals

AI search tools don't evaluate your website in isolation. They cross-reference everything they know about your business across the entire web to assess credibility and legitimacy. A business with a well-designed website but almost no external presence - no Google Business Profile, few reviews, minimal directory listings, no press mentions - looks like a thin, unverified entity.

Entity signals are the external footprint that makes your business legible to AI systems. They include: reviews on Google, Trustpilot, and industry platforms; listings in relevant business directories; mentions and citations in third-party content; a complete and active Google Business Profile; and consistent brand information across all platforms.

Weak entity signals don't just reduce your AI search visibility - they can cause AI tools to generate inaccurate information about your business, because they don't have enough verified data to draw from.

The Fix

Audit your external presence: check your Google Business Profile is complete and accurate, ensure you're listed in at least 10 relevant directories, actively collect reviews from customers, and identify any inaccurate information about your business appearing on third-party sites. Treat entity building as an ongoing monthly habit, not a one-off task.

Reason 4

Your content is too thin or vague for AI to extract answers

AI search tools are designed to find pages that directly answer questions. When someone asks ChatGPT "who provides accountancy services for small businesses in Manchester?", the tool is looking for a page that states clearly and specifically what the business does, where it operates, and who it serves - not a page full of marketing language about being "passionate about numbers" and "delivering exceptional client outcomes."

Thin content means content that circles around what you do rather than stating it plainly. AI tools extract direct, factual statements. If your homepage doesn't contain a sentence along the lines of "[Business name] provides [specific service] in [specific location], serving [specific customer type]", you're not giving AI anything concrete to cite.

This is one of the most common and easily fixable issues in our audit data. The fix doesn't require a full website rewrite - it requires ensuring every key page opens with a direct, specific statement of what the business does.

The Fix

Review your homepage and top service pages. Does the opening paragraph state clearly what you do, where you do it, and who for? Add a "Local Answer Bridge" sentence to each key page: "[Business] provides [service] in [city], typically starting at [price]." This is the exact sentence structure AI tools extract for citations.

Reason 5

Your NAP data is inconsistent across the web

NAP stands for Name, Address, Phone number. AI systems cross-reference your business information across multiple sources to verify it's accurate and trustworthy. If your business is listed as "Harrison Consulting Ltd" on your website, "Harrison Consulting" on Google, and "Harrison Consulting Services" on Yell.com, those discrepancies create doubt - and AI tools default to citing sources they can verify confidently.

Inconsistent NAP data is particularly damaging for local service businesses, where location and contact details are among the primary signals AI tools use to identify and recommend you. Even small variations - different phone number formats, abbreviated versus full address, trading name versus registered name - can fragment your entity signals and reduce citation frequency.

The Fix

Conduct a NAP audit: search your business name on Google and check every listing that appears. Document what each source shows for your name, address, and phone number. Update any that differ from your canonical version. Use exactly the same spelling, capitalisation, and format everywhere - your website, Google Business Profile, Bing Places, Apple Maps, and every directory you appear in.
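Comparing listings by eye is error-prone, so a small script can flag mismatches for you. A sketch in Python - the normalisation rules and the Harrison Consulting listings are illustrative assumptions, not a complete matcher:

```python
import re

def normalise_nap(name: str, phone: str) -> tuple[str, str]:
    """Reduce a name and phone number to a comparable canonical form:
    lowercase the name and strip punctuation; keep only the phone's digits."""
    clean_name = re.sub(r"[^a-z0-9 ]", "", name.lower())
    clean_name = re.sub(r"\s+", " ", clean_name).strip()
    digits = re.sub(r"\D", "", phone)
    return clean_name, digits

# Hypothetical listings gathered during a manual audit.
listings = {
    "Website":  ("Harrison Consulting Ltd", "0161 496 0000"),
    "Google":   ("Harrison Consulting", "0161 496 0000"),
    "Yell.com": ("Harrison Consulting Services", "(0161) 496-0000"),
}

# Group sources by their normalised (name, phone) pair.
variants: dict[tuple[str, str], list[str]] = {}
for source, (name, phone) in listings.items():
    variants.setdefault(normalise_nap(name, phone), []).append(source)

for (name, phone), sources in variants.items():
    print(f"{name} / {phone}: {', '.join(sources)}")
# Three distinct name variants surface even though the phone number matches.
```

Each group printed with fewer sources than you have listings points at a discrepancy to reconcile against your canonical version.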

Reason 6

You have no llms.txt file

An llms.txt file is a plain text document you publish at yoursite.com/llms.txt that gives AI systems a structured summary of your business - what you do, your most important pages, and how to understand your content. Think of it as a welcome note for AI visitors: instead of making the AI piece together your business context from scattered web pages, you give it a clear, authoritative starting point.

The llms.txt standard is relatively new, which means the vast majority of websites don't have one. That creates a genuine competitive advantage for businesses that add it now - AI tools reading your site will have immediate, unambiguous context that most competitors aren't providing.

An llms.txt file typically includes your business name and description, your key service pages with brief summaries, your location and contact details, and any specific context that helps AI understand your positioning and audience.

The Fix

Create a plain text file and publish it at yoursite.com/llms.txt. Include: a one-paragraph description of your business, a list of your key pages with brief descriptions, your location and contact details, and the key questions your business answers. This takes under 30 minutes and immediately improves how AI tools understand and represent your business.
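The llms.txt proposal (llmstxt.org) suggests formatting the file as simple markdown: an H1 with the business name, a short blockquote summary, then sections of linked pages. A hypothetical sketch for the accountancy example used earlier - all names, URLs, and details are placeholders:

```markdown
# Harrison Consulting Ltd

> Accountancy services for small businesses in Manchester, covering
> bookkeeping, payroll, VAT returns and year-end accounts.

## Key pages

- [Services](https://www.example.com/services): Full list of accountancy services and typical pricing
- [About](https://www.example.com/about): Team, qualifications and company history
- [Contact](https://www.example.com/contact): Address, phone number and opening hours

## Details

- Location: Manchester, serving Greater Manchester
- Phone: 0161 496 0000
```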

Find out which of these issues your site has

Free SearchScore audit. Instant results. No sign-up required.

Get My Free AI Visibility Readiness Score →

How to self-diagnose your AI search visibility

Before running a full audit, you can do a quick manual check in about five minutes:

1. Visit yoursite.com/robots.txt and look for rules blocking GPTBot, PerplexityBot, or Google-Extended.
2. Run your homepage through Google's Rich Results Test to see whether any schema markup is detected.
3. Ask ChatGPT or Perplexity a question your business should be the answer to, and see whether you're cited.
4. Read your homepage's opening paragraph: does it state plainly what you do, where, and for whom?
5. Search your business name on Google and compare the name, address, and phone number shown across listings.
6. Visit yoursite.com/llms.txt and see whether the file exists.

If you find issues on more than two of these checks, your AI visibility is likely poor. The average score in our audit database is 34 out of 100 - which means most businesses have significant room for improvement, and significant competitive advantage available to those who act.

For a comprehensive breakdown that scores you across all six AI visibility categories - AI citability, brand authority, EEAT content signals, technical accessibility, structured data, and platform optimisation - run a full SearchScore audit.

The priority order for fixes

If you've found multiple issues, here's how to sequence the fixes for maximum impact:

| Priority | Fix | Estimated time | Impact |
|---|---|---|---|
| 1 | Unblock AI crawlers in robots.txt | 30 minutes | Critical - nothing else matters if crawlers are blocked |
| 2 | Add LocalBusiness schema markup | 2-4 hours | High - immediate improvement in AI understanding |
| 3 | Create an llms.txt file | 30 minutes | High - fast win with very low effort |
| 4 | Fix NAP consistency | 2-4 hours | Medium-high - important for local businesses |
| 5 | Rewrite key pages for direct content | 1-2 days | Medium-high - sustained improvement over time |
| 6 | Build entity signals (reviews, listings) | Ongoing | High long-term - compounds over months |


GEO Research & Analysis

The SearchScore editorial team researches and writes about generative engine optimisation, AI search visibility and the signals that determine whether your website gets cited by ChatGPT, Perplexity and Google AI Overviews. Our findings are based on data from 700,000+ website audits.


Frequently Asked Questions

Why is my website not showing up in ChatGPT?

The six most common reasons are: AI crawlers are blocked in your robots.txt file, you have no structured data schema markup, your business has weak entity signals (reviews, directory listings, external mentions), your content is too thin or vague for AI to extract useful answers, your brand name and contact details are inconsistent across the web, and you have no llms.txt file to guide AI systems. Most sites have at least three of these issues. The average AI visibility score across 700,000+ sites audited by SearchScore is 34 out of 100.

How do I check if ChatGPT can find my website?

You can do a manual check by asking ChatGPT or Perplexity questions about your products or services and seeing whether your business is cited. For a technical breakdown, run a free SearchScore audit at searchscore.io - it analyses your robots.txt, structured data, entity signals, content quality, and other AI visibility factors in about 60 seconds.

What is the difference between SEO and AI search optimisation?

Traditional SEO optimises your website to rank on Google's search results page. AI search optimisation (also called GEO - Generative Engine Optimisation) optimises your website to be cited and quoted by AI tools like ChatGPT, Perplexity, and Google AI Overviews. The two disciplines share some foundations (clear content, technical accessibility) but have different specific requirements. A site can rank highly on Google while being essentially invisible to AI search tools.

How long does it take to fix AI search visibility issues?

The three highest-impact technical fixes - unblocking AI crawlers in robots.txt, adding schema markup, and creating an llms.txt file - can typically be completed by a developer in half a day. Fixing NAP consistency across directories takes a few hours of methodical work. Building entity signals through reviews and mentions is an ongoing effort, but you can make meaningful progress within a few weeks. Most businesses see measurable improvement in their SearchScore audit results within 30 days of making these changes.

What is an llms.txt file and do I need one?

An llms.txt file is a plain text file you add to your website (at yoursite.com/llms.txt) that gives AI systems a structured summary of what your business does, your key pages, and how to understand your content. It is the AI equivalent of a robots.txt file - except it guides AI tools rather than restricting them. Most websites do not have one, which means AI systems have to infer your business context from your web pages alone. Adding an llms.txt file is one of the fastest wins available for AI search visibility.

Does my Google ranking affect whether I appear in ChatGPT?

Not directly. ChatGPT and other AI search tools use their own crawlers and ranking signals, which are different from Google's. A site can rank on page one of Google while being invisible to ChatGPT, and vice versa. AI visibility depends on technical accessibility (robots.txt), structured data, entity credibility, and content clarity - factors that overlap with but are distinct from Google ranking signals.

Check your AI visibility

Enter your URL at SearchScore for a free AI visibility score out of 100. See how ChatGPT, Perplexity and Google AI see your site - and exactly what to fix.