JavaScript Rendering
This factor compares the raw HTML source with the fully rendered page to detect critical content hidden behind JavaScript. Most AI crawlers don't execute JS.
How to Improve
1. Switch to server-side rendering for content-heavy pages
If you use React, Vue, or Angular, adopt SSR (server-side rendering) or SSG (static site generation) for pages that need AI visibility. Next.js, Nuxt, and Angular Universal all support this. SSR ensures your content is in the HTML before any JavaScript executes, making it immediately visible to AI crawlers.
2. Pre-render critical content in the initial HTML payload
Even without full SSR, ensure headlines, body text, and key data are in the initial HTML. Lazy-load interactive widgets and non-essential UI elements, but never lazy-load the primary content. The first HTML response should contain everything an AI crawler needs to understand the page.
3. Audit your JavaScript dependency with a raw HTML test
View your page's source (not the inspector -- the actual page source). If the main content area is empty or contains only JavaScript bundle references, AI crawlers see the same emptiness. Compare the source to the rendered page and identify which content blocks are missing.
4. Move API-fetched content to server-side data loading
If your page fetches content from an API on the client side (useEffect, fetch on mount), move that data fetching to the server. In Next.js, use server components or getServerSideProps. The data should arrive as HTML, not as a client-side API call that AI crawlers will never execute.
5. Test with JavaScript disabled in your browser
Disable JavaScript in your browser settings and load your page. What you see is what AI crawlers see. If the page is blank or missing core content, you have a JavaScript rendering dependency that needs fixing.
Before & After
<!-- Raw HTML source -->
<div id="root"></div>
<script src="/bundle.js"></script>
<!-- All product info loaded via JS: 85% content difference -->
<!-- Score: 1 -->
<!-- Server-rendered HTML -->
<div id="root">
<h1>Wireless Noise-Canceling Headphones</h1>
<p>Premium audio with 30-hour battery life...</p>
<ul>
<li>Active noise cancellation</li>
<li>Bluetooth 5.3</li>
<li>30-hour battery</li>
</ul>
</div>
<script src="/bundle.js"></script>
<!-- Content in HTML, JS hydrates for interactivity: 5% difference -->
<!-- Score: 10 -->
Code Examples
Next.js server component (content visible to AI crawlers)
// app/products/[id]/page.tsx -- Server Component (default in Next.js App Router)
export default async function ProductPage({ params }: { params: { id: string } }) {
  const product = await getProduct(params.id); // Fetched on server
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
      <ul>
        {product.features.map(f => <li key={f}>{f}</li>)}
      </ul>
    </main>
  );
}
Client-only rendering (invisible to AI crawlers)
// Avoid this pattern for content pages
'use client';
import { useEffect, useState } from 'react';

export default function ProductPage({ params }) {
  const [product, setProduct] = useState(null);
  useEffect(() => {
    fetch(`/api/products/${params.id}`)
      .then(res => res.json())
      .then(setProduct);
  }, [params.id]); // re-fetch if the route param changes
  if (!product) return <div>Loading...</div>;
  return <h1>{product.name}</h1>;
}
Frequently Asked Questions
Do AI crawlers execute JavaScript at all?
Most do not. GPTBot, ClaudeBot, and PerplexityBot primarily read raw HTML. Googlebot has limited JavaScript rendering capability, but it is slower and less reliable than raw HTML parsing. For consistent AI visibility across all engines, your content must be in the initial HTML response.
My site uses React -- am I automatically penalized?
Not if you use server-side rendering (SSR) or static site generation (SSG). The issue is client-side-only rendering where content is absent from the initial HTML. Next.js with server components, Gatsby with static generation, or Remix with SSR all produce HTML that AI crawlers can read. The framework is not the problem -- the rendering strategy is.
What percentage of JS-rendered content is acceptable?
Under 10% difference earns a perfect score of 10. Under 20% is still good at 8 out of 10. Above 30% starts to seriously impact your AI visibility. The goal is to keep all primary content in the server-rendered HTML and limit JavaScript-only content to interactive UI elements like modals, tooltips, and animations.
Check Your GEO Score
Run a free analysis on your website and see how you score across all 52 factors.