Is Your Website Invisible to AI?

ChatGPT, Claude, Perplexity, and Gemini are answering millions of questions about your industry. Find out if technical issues could be preventing your site from appearing in these AI systems.

Free to try (beta) • No signup • No credit card

Before you optimize for AI, make sure AI can actually reach you.

- TTFB: 1,850 ms · Target: <600 ms · ⚠️ Past the timeout wall
- 📐 CLS: 0.18 · Target: <0.1 · ⚠️ Parsing issues
- 🖱️ INP: 420 ms · Target: <200 ms · ❌ Agent friction

1-2 sec: the timeout wall • 600 ms: target TTFB for AI • Free: always free to use

What This Tool Does

This tool identifies technical issues that can prevent AI systems from accessing or using your content.

Performance Signals

We measure TTFB, CLS, and INP using Google's Chrome User Experience Report (CrUX). Faster responses and stable pages reduce fetch failures and increase the likelihood your content is used in AI retrieval workflows.

URL-level & Origin-level • CrUX data may be unavailable for low-traffic sites
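For the curious, an origin-level CrUX lookup is a single POST to Google's public CrUX API. The sketch below is a minimal illustration, not our implementation: `query_crux` assumes you supply your own API key, and the metric key `experimental_time_to_first_byte` is the name CrUX has used for TTFB (metric names can change between API versions, so treat it as an assumption).

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_query(origin: str, metrics: list) -> dict:
    """Request body for an origin-level CrUX lookup."""
    return {"origin": origin, "metrics": metrics}

def p75(record: dict, metric: str):
    """Pull the 75th-percentile value for one metric out of a CrUX response."""
    try:
        return record["record"]["metrics"][metric]["percentiles"]["p75"]
    except KeyError:
        return None  # CrUX has no data for this metric (common on low-traffic sites)

def query_crux(api_key: str, origin: str) -> dict:
    """POST the query and return the parsed JSON response (requires a real key)."""
    body = build_query(origin, [
        "cumulative_layout_shift",
        "interaction_to_next_paint",
        "experimental_time_to_first_byte",  # assumed CrUX key for TTFB
    ])
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If a metric is absent from the response, `p75` returns `None`, which maps directly to the "No data available" state described below for low-traffic sites.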

🤖

AI Bot Access

We check your robots.txt against 33 AI crawlers (GPTBot, ClaudeBot, PerplexityBot, and more) to see which ones you're allowing or blocking. Robots.txt is advisory guidance — not all crawlers honor it.

Origin-level (site-wide)
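The same kind of check is easy to reproduce yourself with Python's standard library. This is a simplified sketch, not our scanner: `AI_BOTS` is a small hypothetical subset of the 33 user agents, and it only tests one path.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical subset of the AI user agents the tool checks.
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def check_ai_access(robots_txt: str, path: str = "/") -> dict:
    """Map each AI crawler name to whether robots.txt permits it to fetch `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, path) for bot in AI_BOTS}

example = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""
print(check_ai_access(example))
# {'GPTBot': False, 'ClaudeBot': True, 'PerplexityBot': True, 'Google-Extended': True}
```

Note that this only tells you what the rules *say*; as above, whether a given crawler obeys them is up to its operator.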

🔍

Content Visibility

We compare content visible after JavaScript runs against what's present in raw HTML. If key text is missing from HTML, many crawlers and extractors won't capture it.

URL-level
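Conceptually, the comparison boils down to "which words exist only after JavaScript runs?" Here is a toy sketch under simplifying assumptions: in a real check the rendered HTML would come from a headless browser (e.g., Playwright), and a production extractor would skip `<script>`/`<style>` bodies; here both documents are static strings.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def visible_words(html: str) -> set:
    extractor = TextExtractor()
    extractor.feed(html)
    return {w.lower() for chunk in extractor.chunks for w in chunk.split()}

def missing_from_raw(raw_html: str, rendered_html: str) -> set:
    """Words present after JS rendering but absent from the raw HTML payload."""
    return visible_words(rendered_html) - visible_words(raw_html)

raw = "<html><body><div id='app'></div></body></html>"
rendered = "<html><body><div id='app'><h1>Pricing plans</h1></div></body></html>"
print(missing_from_raw(raw, rendered))  # {'pricing', 'plans'}
```

Any words in that difference set are invisible to crawlers that read only the raw HTML response.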

📄

PDF Report

Export a complete technical audit you can share with developers, clients, or stakeholders.

Built for Teams Managing Website Visibility

🎯

SEO Professionals & Agencies

Add AI visibility audits to your service offering. The PDF report makes it easy to explain technical issues to clients who don't need to understand robots.txt syntax.

📈

In-House Marketing Teams

Stop wondering if your technical setup is hurting discoverability. Get specific issues you can hand to your dev team.

📋

Digital Consultants

Audit multiple client sites quickly. The reports document problems clearly enough to justify fixes.

💡

Founders & SMB Owners

You don't need to be technical to understand the results. We translate the problems into plain language and tell you what matters.

How It Works

1

Enter Any URL

Your homepage, product pages, blog posts — whatever you want to check.

2

We Run the Analysis

Performance tests, bot access checks, and content visibility comparisons run in parallel.

3

Review the Findings

Results appear across organized tabs covering performance metrics, crawlability, and content visibility.

4

Download the Report

Export everything as a PDF you can send to developers or include in client deliverables.

We Don't Optimize. We Diagnose.

This tool doesn't track your rankings in AI search results or monitor how often ChatGPT mentions your brand.

It answers one question: Could technical issues be preventing AI systems from accessing your content?

If your robots.txt file restricts GPTBot, we tell you. If your JavaScript hides content from crawlers, we show you what's missing.

What you do with that information is up to you.

Trusted Data Source

Chrome UX Report (CrUX)

Official Google dataset with real user measurements from millions of websites

75th Percentile (p75)

Industry standard metric used by Google for Core Web Vitals scoring

Updated Regularly

CrUX data is updated monthly with fresh user experience measurements

Andre Guelmann - SEO Expert

Built by an SEO Expert

Hi, I'm Andre Guelmann, an SEO & Website Growth expert with over 15 years of experience helping companies improve their online visibility.

As AI search engines began dominating how people find information, I noticed a critical gap: traditional SEO metrics don't capture AI visibility. Speed isn't just about ranking anymore — it's about being included at all.

I built this tool to help website owners understand their AI visibility and take action before they become invisible to the next generation of search.

Frequently Asked Questions

Is this really free?

Yes, while the tool is in beta. No signup required, no credit card, no strings attached. Run as many checks as you want.

Which AI systems do you check?

We analyze access rules for 33 AI crawlers including GPTBot (ChatGPT), ClaudeBot (Claude), PerplexityBot (Perplexity), Google-Extended (Gemini), and others.

What if my site has low traffic?

Performance data comes from Google's Chrome User Experience Report (CrUX), which is aggregated real-user field data. Low-traffic sites may not have enough data, which we'll flag as "No data available."

Do robots.txt rules guarantee crawlers will be blocked?

No. Robots.txt is advisory guidance that crawlers are requested to honor — it's not access control. Crawler behavior varies by provider, and not all respect these rules.

Do you store my data?

No. We don't save URLs, results, or any site data. Each check runs fresh.

How long does an analysis take?

Usually 15-30 seconds, depending on your site's response time and which checks run.

Can this hurt my site?

No. We make standard requests just like any other crawler. Nothing invasive.

What does "URL-level" vs "Origin-level" mean?

URL-level checks are specific to the exact page you entered. Origin-level checks apply site-wide, like robots.txt rules and aggregate performance data.
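In other words, origin-level checks collapse the full address down to its scheme and host, which every page on the site shares. A one-line sketch of that split:

```python
from urllib.parse import urlsplit

def origin_of(url: str) -> str:
    """The scheme + host (+ port) shared by every page on the site."""
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}"

print(origin_of("https://example.com/blog/post-1?ref=x"))  # https://example.com
```

robots.txt and aggregate CrUX data are keyed to that origin; everything else in the report is keyed to the exact URL you entered.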