GEO

I Audited My Website for AI Search Visibility — Here's What I Found

9 min read

Last month, I did something I should have done a year ago: I typed my own business into ChatGPT and Perplexity and asked them to recommend a Slack bot developer in Madrid.

Neither mentioned me.

Not once. Not in any variation of the question.

This bothered me more than it probably should have. I rank well in Google. I have case studies, testimonials, a clear service description. But to the AI systems that 50 million people now use to find services like mine, I effectively don't exist.

So I ran a full audit. Here's everything I found — and the 8 fixes I implemented in a single day.

Why This Matters More Than You Think

We're in the middle of a search behavior shift that most business owners haven't noticed yet. Google search volume is flat or declining in several categories. Meanwhile, ChatGPT now handles over 1 billion searches per week. Perplexity reached 100 million users faster than TikTok did.

People are no longer just Googling for "Slack bot developer Madrid." They're asking ChatGPT: "I need a Slack bot that creates Jira tickets automatically — who builds this kind of thing?"

That's a fundamentally different query. And it produces a fundamentally different kind of result — a synthesized answer that cites specific sources, not a ranked list of ten blue links.

The business that gets cited in that answer wins the conversation. The others don't even appear.

The Test That Started Everything

I opened Perplexity and typed: "Who are good Slack bot developers for B2B companies?"

It gave me three names. None were me.

I tried variations: "Slack Jira integration developer," "AI automation consultant Madrid," "small business automation consultant." Still nothing.

Then I tried ChatGPT. Same result. I simply wasn't being cited.

When I started digging, I found that the problem wasn't my content quality. It was a series of technical and structural gaps that made my site nearly impossible for AI systems to parse and cite correctly.

The 8 Problems I Found

Here's what the audit revealed, roughly in order of impact:

Problem 1: No llms.txt file. This is the AI equivalent of a sitemap. It's a plain-text file at /llms.txt that tells AI crawlers what your site is about, what services you offer, and what pages matter. Without it, AI systems have to guess — and they often guess wrong.
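There's no formal standard for llms.txt yet, but a minimal version looks something like this (the URLs and details below are placeholders, not my real pages):

```markdown
# TechConcepts
> Slack bots and AI automation for B2B teams, based in Madrid.

## Services
- [Slack Bot Development](https://example.com/services/slack-bots): custom bots, Jira and CRM integrations
- [AI Search Optimization (GEO)](https://example.com/services/geo): audit and implementation

## Key Facts
- Typical delivery: 2–3 weeks
- GEO audit: from $1K; audit plus implementation: $2K
```

The point isn't the exact format. It's that an AI crawler hitting /llms.txt gets a plain, unambiguous summary instead of having to infer your business from scattered pages.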

Problem 2: AI crawlers were in a grey area. My robots.txt wasn't explicitly blocking AI bots, but it wasn't explicitly allowing them either. GPTBot, ClaudeBot, PerplexityBot, and Google-Extended all need to be explicitly permitted. Ambiguity means some crawlers skip you.
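For reference, an explicit allow-list looks like this (crawler names are current as of this audit and may change; the sitemap URL is a placeholder):

```text
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Applebot-Extended
Allow: /

Sitemap: https://example.com/sitemap.xml
```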

Problem 3: No structured data (schema.org). My service pages had no machine-readable description of what I do, who I am, or what I charge. When AI systems try to extract facts about a business, schema markup is the most reliable source. Without it, they're parsing prose — which is much less reliable.
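The fix is a JSON-LD block in the page's head, inside a script tag with type="application/ld+json". A minimal sketch (the values here are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "TechConcepts",
  "description": "Slack bots, AI classifiers, and automation pipelines for B2B teams.",
  "url": "https://example.com",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Madrid",
    "addressCountry": "ES"
  }
}
```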

Problem 4: No FAQ schema on service pages. AI systems love Q&A format because it maps directly to how they respond to user queries. "What does this service cost?" "How long does implementation take?" If those answers exist in FAQ schema, an AI can extract and cite them precisely.
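A single question in FAQPage schema looks roughly like this (the answer text is illustrative; you'd add 4–6 entries to mainEntity):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How long does a Slack bot project take?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Typical delivery is 2–3 weeks from kickoff to production."
      }
    }
  ]
}
```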

Problem 5: Content lacked specific, citable facts. AI citation works differently from Google ranking. Google rewards pages that are authoritative and well-linked. AI systems reward pages that contain specific, extractable claims: numbers, timeframes, outcomes, prices. My service pages described what I do in general terms. They didn't say "I typically deliver in 2–3 weeks" or "clients typically see 60–80% reduction in manual ticket routing."

Problem 6: No freshness signals. AI systems weight recent information more heavily. My pages had no <time> elements, no "Updated March 2026" markers. To a crawler, content with no date looks old — which deprioritizes it in citations.
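The fix can be as small as one line of HTML, giving both readers and crawlers a date:

```html
<!-- Visible freshness marker plus a machine-readable datetime -->
<p>Updated <time datetime="2026-03-10">March 10, 2026</time></p>
```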

Problem 7: Homepage H1 was generic. My hero headline was "Technology That Works. Results You Can Measure." Sounds fine to humans. To AI crawlers trying to categorize my business, it says almost nothing. I changed it to "Slack Bots & AI Automation. Results You Can Measure." — immediately more citable.

Problem 8: No social proof in structured format. I had client testimonials scattered through the page as plain text. AI systems can't reliably extract unstructured testimonials. Adding Review and AggregateRating schema made the social proof machine-readable.
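In JSON-LD, structured social proof looks roughly like this (the rating, count, and quote below are placeholders, not real review data):

```json
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "TechConcepts",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.9",
    "reviewCount": "12"
  },
  "review": [
    {
      "@type": "Review",
      "author": { "@type": "Person", "name": "Example Client" },
      "reviewBody": "The bot cut our manual ticket routing by 70%."
    }
  ]
}
```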

The Fixes — Before and After

Here's what I actually changed:

| Issue | Before | After | Effort |
|---|---|---|---|
| llms.txt | Missing | Created — services, prices, case studies, key facts | 30 min |
| robots.txt | Silent on AI bots | Explicit Allow for 7 AI crawlers | 10 min |
| Schema markup | None | Service, FAQPage, Review, HowTo, BreadcrumbList | 2 hrs |
| Homepage H1 | "Technology That Works" | "Slack Bots & AI Automation" | 5 min |
| Specific facts | Vague benefits copy | Timelines, prices, outcomes in service pages | 1 hr |
| Freshness signals | No dates on pages | <time datetime="2026-03-10"> in hero | 10 min |
| Social proof | Plain text testimonials | Review + AggregateRating schema | 45 min |
| Cache headers | max-age=0 on key pages | 3,600s cache on all service/blog URLs | 20 min |

Total: about 5 hours of focused work. No developer needed — just HTML, a text file, and knowing what to look for.
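The cache-header change, sketched for nginx (this assumes an nginx setup and a /services/ and /blog/ URL structure; adjust to your own server and paths):

```nginx
# Serve service and blog pages with a one-hour cache (3600 s)
# instead of max-age=0, so crawlers get fast, cacheable responses
location ~ ^/(services|blog)/ {
    add_header Cache-Control "public, max-age=3600";
}
```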

What Changed After

I'm not going to pretend the results were instant. AI search citation isn't like PPC — you can't turn on a switch and see clicks the next morning. The crawlers need to re-index your site, and that takes a few weeks.

But the structural work matters a lot. Here's why:

When Perplexity or ChatGPT answers a query about Slack bot developers, they're pulling from a combination of real-time web crawling and their training data. For the real-time crawl to work in your favor, your site needs to be crawlable, parseable, and citable. That's what this audit fixes.

Think of it like this: before the audit, my site was a locked room. All the good stuff was inside, but AI systems couldn't get in. After the audit, the door is open, there's a welcome sign, and there's a clearly labeled cabinet with exactly the information they need.

The Checklist I Wish I'd Had

If you want to run this audit yourself, here are the 8 things to check:

  1. Test yourself in Perplexity and ChatGPT — search for your service category. Do you appear? If not, start here.
  2. Check robots.txt — explicitly allow GPTBot, ClaudeBot, PerplexityBot, Google-Extended, OAI-SearchBot, Applebot-Extended.
  3. Create llms.txt — plain text file at /llms.txt describing your business, services, and key pages. Think of it as a brief for AI systems.
  4. Add Organization schema — machine-readable description of your business, services, and location.
  5. Add FAQPage schema to service pages — 4–6 questions with precise answers. "What does X cost?" "How long does X take?"
  6. Make your content citable — replace vague benefits copy with specific claims: numbers, timelines, outcomes, prices.
  7. Add freshness signals — use <time datetime="..."> elements. Update service pages quarterly.
  8. Add Review/AggregateRating schema — structured social proof that AI systems can extract and cite.

One More Thing

The biggest mental shift from this audit wasn't technical. It was realizing that AI search and Google search require fundamentally different optimization strategies.

Google rewards links and authority. AI systems reward clarity and citability. A page that ranks #1 on Google might be invisible to ChatGPT if it's not written in a way that lets AI systems extract and paraphrase its key claims.

The good news: the technical fixes are not hard. If you can edit HTML, you can do most of this yourself in a day. The hard part is knowing what to look for — which is exactly what this audit taught me.

I build Slack bots, AI classifiers, and automation pipelines for B2B teams. If you want to talk about automation — not AI search — my calendar is here.

Related Service

AI Search Optimization (GEO)

I run the same audit on your site — robots.txt, llms.txt, schema markup, content citability — and implement the fixes. Starting from $1K for audit-only, $2K for audit + implementation.

Learn more →

Related Posts

7 Things to Fix So ChatGPT Can Find Your Business

The non-technical GEO checklist for B2B owners.

Traditional SEO Won't Get You Found by AI

Why ranking on Google no longer means being cited by AI.


Evgeny Goncharov

Founder, TechConcepts

I build automation tools and custom software for businesses. Previously at Yandex (Search) and EY (Advisory). Darden MBA. Based in Madrid.
