WHY THIS QUESTION MATTERS.
If you are debating SSR vs CSR (server-side vs client-side rendering) for a site you want AI engines to cite, the data has gotten unambiguous in the last 18 months. AI bots do not execute JavaScript at any meaningful rate. Single-page apps without server-side rendering are functionally invisible to most AI crawlers.
This is not a marginal effect. It is a binary one. Pages rendered client-side return empty HTML to the bot, which means an empty extraction, which means zero citation potential. The architecture decision compounds across every page on the site.
1.3B FETCHES. NEAR-ZERO JS.
Vercel published the most comprehensive public dataset on this in 2024-2025: 1.3 billion AI-bot fetches across their hosted sites, with JS-execution telemetry. The headline finding: none of the major AI bots executed JavaScript at meaningful rates. GPTBot, ClaudeBot, PerplexityBot, all near-zero.
The finding has held up for other observers. From the 47-site research network, the same pattern is visible at smaller scale: pages whose content is JS-rendered get fetched (the bot pulls the empty shell HTML) but never cited (because there is no extractable content).
The exceptions: Googlebot does render JS. Bingbot does too, partially. But the AI-specific user-agents (GPTBot, OAI-SearchBot, ClaudeBot, Claude-SearchBot, PerplexityBot) do not.
THE METHODOLOGY.
You can verify this on your own site in 30 minutes:
- Step 1: Pick a page on your site whose main content is rendered client-side. (If you are not sure, view-source the page; if the body is little more than a <div id="root"></div> shell, it is CSR.)
- Step 2: Curl the URL with an AI bot user-agent: curl -A "GPTBot/1.0 (+https://openai.com/gptbot)" https://yoursite.com/page
- Step 3: Compare the curl output with what you see in the browser. If the curl output is empty or shell-only, your AI visibility on that page is effectively zero.
- Step 4: Repeat with the same URL prerendered or with SSR enabled. The curl output should now contain the actual content. (A scripted version of this check is sketched after this list.)
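To make that check repeatable, the same comparison can be scripted. The sketch below is a minimal version under a few assumptions: the URL and GPTBot user-agent string are placeholders, the 50-word threshold is arbitrary, and stripping tags with a perl one-liner is only a rough proxy for extractable content.

```bash
#!/usr/bin/env bash
# Minimal sketch of the manual check above. The URL, user-agent string, and the
# 50-word threshold are assumptions -- adjust them for your own site.
URL="${1:-https://yoursite.com/page}"
UA="GPTBot/1.0 (+https://openai.com/gptbot)"

# Fetch the HTML exactly as a non-JS crawler receives it (no rendering).
html=$(curl -s -A "$UA" "$URL")

# Strip scripts, styles, and tags, then count the words that remain.
words=$(printf '%s' "$html" \
  | perl -0777 -pe 's/<script.*?<\/script>//gs; s/<style.*?<\/style>//gs; s/<[^>]*>/ /g' \
  | wc -w)

echo "Words visible without JavaScript: $words"
if [ "$words" -lt 50 ]; then
  echo "Shell-only response: this page is effectively invisible to AI crawlers."
else
  echo "Server-rendered content detected."
fi
```

Run it once against the CSR page and once against the prerendered or SSR version; the word count should jump from near zero to roughly the length of the visible article.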
FRAMEWORK IMPLICATIONS.
Safe by default: Next.js (with default SSR/SSG), Remix, Astro, Eleventy, plain HTML. These produce server-rendered HTML that bots receive immediately.
Conditional: Nuxt, SvelteKit. Both default to SSR but are commonly misconfigured (SSR switched off globally or per route). Verify with the curl test above, or with the shell-marker check sketched after this list.
Risky by default: Create React App, Vite + React without an SSR setup, pure Vue SPAs, pure Angular SPAs. These ship empty HTML by default and need an explicit SSR or static-prerender setup to be AI-visible.
Effectively invisible: any SPA that lazy-loads its core content, infinite-scroll feeds, dashboards. These tend to be invisible to AI bots regardless of framework.
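As a complement to the curl test, a quick heuristic is to look for an empty SPA mount point in the HTML a bot actually receives. This is a sketch, not a definitive test: the marker IDs (root, app) cover Create React App and typical Vue setups, other frameworks use different mount elements, and the URL and user-agent are placeholders.

```bash
#!/usr/bin/env bash
# Heuristic: does the bot-visible HTML contain an *empty* SPA mount element?
# URL, user-agent, and marker IDs are assumptions -- adjust for your stack.
URL="${1:-https://yoursite.com/page}"
UA="GPTBot/1.0 (+https://openai.com/gptbot)"

# Flatten newlines so the open/close tags match even when they span lines.
if curl -s -A "$UA" "$URL" | tr -d '\n' \
   | grep -qE '<div id="(root|app)"[^>]*>[[:space:]]*</div>'; then
  echo "Empty SPA shell detected: this page is client-rendered for non-JS bots."
else
  echo "No empty shell marker found (not conclusive; combine with the curl test above)."
fi
```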
THE MIGRATION CALCULUS.
If you are running a CSR-only site and you want AI visibility, you have three options:
- Full migration to SSR: rebuild on Next.js / Remix / Astro. Highest cost, highest payoff. Right for sites whose primary content needs to be cited.
- Hybrid prerendering: keep the SPA but prerender critical pages at build time (e.g., via react-snap or Vite's SSG plugins). Medium cost, partial payoff. Right when only the public marketing/content pages need citation visibility.
- Dynamic rendering via a prerender service: services like Prerender.io detect bot user-agents and serve them pre-rendered HTML. Low engineering cost, ongoing service cost, and the weakest of the three because it depends on bot-detection accuracy. Right for legacy sites that cannot be migrated. (A verification sketch follows this list.)
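Because the dynamic-rendering option stands or falls on bot detection, it is worth verifying what a bot user-agent actually receives compared with a plain browser user-agent. A minimal sketch, assuming placeholder URL and user-agent strings:

```bash
#!/usr/bin/env bash
# Sketch: confirm a prerender layer is handing rendered HTML to AI user-agents.
# The URL, user-agent strings, and word-count proxy are assumptions.
URL="${1:-https://yoursite.com/page}"
BOT_UA="GPTBot/1.0 (+https://openai.com/gptbot)"
BROWSER_UA="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36"

# Count the words of tag-free text returned for a given user-agent.
count_words() {
  curl -s -A "$1" "$URL" \
    | perl -0777 -pe 's/<script.*?<\/script>//gs; s/<[^>]*>/ /g' \
    | wc -w
}

bot_words=$(count_words "$BOT_UA")
browser_words=$(count_words "$BROWSER_UA")
echo "Bot UA sees $bot_words words; browser UA sees $browser_words words."
```

A working prerender layer should show a large word count for the bot user-agent while the plain curl with a browser user-agent (which still does not execute JavaScript) gets only the shell; if both are near zero, the service is not intercepting the bot.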
THE BOTTOM LINE.
If you are picking a framework today for a new content-driven site, pick one that produces server-rendered HTML by default. If you have an existing CSR-only site you want cited, run the curl test, count the pages affected, and pick a migration path. Architecture cannot be retrofitted later for free.
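If the site has a sitemap, the counting step can be scripted too. A sketch under the same assumptions as the earlier checks (placeholder sitemap URL, GPTBot user-agent, arbitrary 50-word threshold):

```bash
#!/usr/bin/env bash
# Sketch: sweep the sitemap and count pages that return shell-only HTML to a
# non-JS AI crawler. Sitemap URL, user-agent, and threshold are assumptions.
SITEMAP="${1:-https://yoursite.com/sitemap.xml}"
UA="GPTBot/1.0 (+https://openai.com/gptbot)"
affected=0
total=0

for url in $(curl -s "$SITEMAP" | grep -oE '<loc>[^<]+</loc>' \
             | sed -e 's/<loc>//' -e 's|</loc>||'); do
  total=$((total + 1))
  words=$(curl -s -A "$UA" "$url" \
    | perl -0777 -pe 's/<script.*?<\/script>//gs; s/<[^>]*>/ /g' \
    | wc -w)
  if [ "$words" -lt 50 ]; then
    affected=$((affected + 1))
    echo "SHELL-ONLY  $url ($words words)"
  fi
done

echo "$affected of $total sitemap URLs are effectively invisible to non-JS AI crawlers."
```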
The data on this is settled. The decision is not whether to render server-side; it is how to get there.