
DO AI BOTS
RENDER JAVASCRIPT?

AI bots overwhelmingly do not execute JavaScript. Vercel measured this against 1.3 billion fetches and published the result. This article corroborates that finding with an independent methodology and lays out the architectural implications - which framework decisions cost you AI visibility and which preserve it.

JS Render Rate: Near Zero
SSR: Effectively Required
Tested Surfaces: 4
Architecture Cost: Significant

WHY THIS QUESTION MATTERS.

If you are debating SSR vs CSR (server-side vs client-side rendering) for a site you want AI engines to cite, the data has become unambiguous over the last 18 months. AI bots do not execute JavaScript at any meaningful rate. Single-page apps without server-side rendering are functionally invisible to most AI crawlers.

This is not a marginal effect. It is a binary one. Pages rendered client-side return empty HTML to the bot, which means an empty extraction, which means zero citation potential. The architecture decision compounds across every page on the site.
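
To make the failure mode concrete, here is a hypothetical transcript of what a non-rendering bot receives from a CSR page. The domain, asset path, and markup are illustrative, not taken from any measured site:

    # Hypothetical transcript - example.com and the markup are illustrative.
    $ curl -s -A "GPTBot/1.0 (+https://openai.com/gptbot)" https://example.com/article
    <!doctype html>
    <html>
      <head>
        <title>Loading...</title>
        <script src="/assets/app.js"></script>
      </head>
      <body>
        <div id="root"></div>
      </body>
    </html>

Every word a human reader sees on that page arrives via app.js, which the bot never runs. The only extractable text is "Loading...".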

Finding 01.
Vercel's published data

1.3B FETCHES. NEAR-ZERO JS.

Vercel published the most comprehensive public dataset on this in 2024-2025: 1.3 billion AI-bot fetches across their hosted sites, with JS-execution telemetry. The headline finding: none of the major AI bots executed JavaScript at meaningful rates. GPTBot, ClaudeBot, PerplexityBot, all near-zero.

The result has held up in other observers' data. In the 47-site research network, the same pattern appears at smaller scale: pages whose content is JS-rendered get fetched (the bot pulls the empty shell HTML) but never cited (because there is no extractable content).

The exceptions: Googlebot does render JS. Bingbot does too, partially. But the AI-specific user-agents (GPTBot, OAI-SearchBot, ClaudeBot, Claude-SearchBot, PerplexityBot) do not.

Finding 02.
How to test on your own site

THE METHODOLOGY.

You can verify this on your own site in 30 minutes; a scripted version of the check follows the steps:

  • Step 1: Pick a page on your site whose main content is rendered client-side. (If you are not sure, view-source the page; if the body content is in a <div id="root"></div> shell, it is CSR.)
  • Step 2: Curl the URL with an AI bot user-agent: curl -A "GPTBot/1.0 (+https://openai.com/gptbot)" https://yoursite.com/page
  • Step 3: Compare what you see in curl output vs what you see in the browser. If the curl output is empty / shell-only, your AI visibility on that page is effectively zero.
  • Step 4: Repeat with the same URL prerendered or with SSR enabled. The curl output should now contain the actual content.
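
If you would rather script the comparison than eyeball it, here is a minimal sketch of steps 2 and 3. URL and MARKER are placeholders you supply; MARKER should be a phrase that only appears once the page's main content is rendered.

    #!/bin/sh
    # Sketch: does an AI-bot user-agent receive this page's real content?
    # URL and MARKER are placeholders - substitute your own values.
    URL="https://yoursite.com/page"
    MARKER="a phrase from the page's main content"
    BOT_UA="GPTBot/1.0 (+https://openai.com/gptbot)"

    html=$(curl -sL -A "$BOT_UA" "$URL")
    echo "bytes received: $(printf '%s' "$html" | wc -c)"

    if printf '%s' "$html" | grep -qF "$MARKER"; then
        echo "PASS: content is server-rendered and bot-visible"
    else
        echo "FAIL: bot receives a shell; AI visibility on this page is effectively zero"
    fi
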
Finding 03.
What this means for frameworks

FRAMEWORK IMPLICATIONS.

Safe by default: Next.js (with default SSR/SSG), Remix, Astro, Eleventy, plain HTML. These produce server-rendered HTML that bots receive immediately.

Conditional: Nuxt, SvelteKit. Both default to SSR but are commonly misconfigured. Verify with the curl test above.

Risky by default: Create React App, Vite + React without SSR, pure Vue SPA, pure Angular SPA. These ship empty HTML by default and need explicit SSR or static prerendering to be AI-visible.

Effectively invisible: any SPA that lazy-loads its core content, infinite-scroll feeds, dashboards. These tend to be invisible to AI bots regardless of framework.
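
If you inherit a site and do not know which bucket it falls in, a rough one-liner can flag the empty-shell case. This heuristic assumes the common <div id="root"> / <div id="app"> mount-point convention; other mount-point ids need the pattern adjusted, and the marker test above is more reliable:

    # Heuristic sketch: is the bot-visible HTML just an empty SPA shell?
    # The mount-point ids (root, app) are a common convention, not a guarantee.
    curl -s -A "GPTBot/1.0 (+https://openai.com/gptbot)" "https://yoursite.com/page" \
      | grep -qE '<div id="(root|app)"></div>' \
      && echo "CSR shell detected: likely invisible to AI bots" \
      || echo "no empty shell found: confirm with the marker test"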

AI BOTS DO NOT RENDER JAVASCRIPT.
ARCHITECTURE FOLLOWS.

THE MIGRATION CALCULUS.

If you are running a CSR-only site and you want AI visibility, you have three options:

  • Option 1: Migrate to a server-rendering framework (Next.js, Remix, Astro). Highest upfront cost, but the output is bot-visible by default from then on.
  • Option 2: Add static prerendering at build time for the content pages that matter. Cheaper than a full migration, and sufficient when content changes infrequently.
  • Option 3: Serve prerendered HTML snapshots to bot user-agents (dynamic rendering). Lowest migration cost, but it adds an infrastructure layer that must be kept in sync with the live app.

THE BOTTOM LINE.

If you are picking a framework today for a new content-driven site, pick one that produces server-rendered HTML by default. If you have an existing CSR-only site you want cited, run the curl test, count the pages affected, and pick a migration path. Architecture cannot be retrofitted later for free.
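
"Count the pages affected" can itself be scripted. A minimal sketch, assuming a flat sitemap.xml of <loc> entries and reusing the empty-shell heuristic from above:

    #!/bin/sh
    # Sketch: count sitemap URLs that return an empty SPA shell to an AI-bot UA.
    # Assumes a flat sitemap.xml; adjust the shell pattern to your mount point.
    SITEMAP="https://yoursite.com/sitemap.xml"
    BOT_UA="GPTBot/1.0 (+https://openai.com/gptbot)"
    affected=0; total=0

    for url in $(curl -s "$SITEMAP" | grep -o '<loc>[^<]*</loc>' \
                 | sed -e 's/<loc>//' -e 's|</loc>||'); do
        total=$((total + 1))
        if curl -s -A "$BOT_UA" "$url" | grep -qE '<div id="(root|app)"></div>'; then
            affected=$((affected + 1))
        fi
    done

    echo "$affected of $total pages return a CSR shell to the bot user-agent"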

The data on this is settled. The decision is not whether to render server-side; it is how to get there.

Stop Guessing What AI Sees

MEASURE THE LEVERS
THAT ACTUALLY EXIST.

If you want this methodology applied to your specific site - your real logs, your real citation data, your real fix list - the audit is the productized way to do it.