
JavaScript Rendering

Compares raw HTML vs rendered content to detect critical content hidden behind JavaScript. Most AI crawlers don't execute JS.

Why It Matters for AI Visibility

Most AI crawlers do not execute JavaScript. GPTBot (ChatGPT), ClaudeBot (Anthropic), and PerplexityBot read the raw HTML your server sends -- nothing more. If your content is rendered client-side by JavaScript, these crawlers see an empty page or a loading spinner where your content should be.

This is a critical gap for any site built with React, Vue, or Angular using client-side rendering. Your users see a fully populated page after JavaScript runs in their browser, but AI engines see the pre-JS HTML skeleton. A product page whose descriptions, reviews, and specifications load via JavaScript is effectively invisible to AI crawlers.

Google AI Overviews have limited JavaScript rendering capability through Googlebot, but it is slower, less reliable, and not guaranteed for every page. Relying on Google's JS rendering means your content may be indexed hours or days later than server-rendered pages, if at all. For AI citation, the content needs to be in the raw HTML: any content that exists only after JavaScript execution is content that AI engines cannot cite.

How We Score It

The analyzer renders your page in a headless browser and compares the rendered text against the raw HTML text, calculating the percentage of rendered words (longer than 3 characters) that are missing from the raw HTML source. Less than 10% difference earns a perfect 10 -- nearly all content is in the initial HTML. Less than 20% scores 8, under 30% scores 6, and under 50% scores 4. Above 50% means most of your content is JavaScript-dependent, and the score drops toward 0. If the browser render fails or times out (15-second limit), the score defaults to 5.

The analyzer also extracts sample sentences where the majority of words appear only after JavaScript execution, so you can see exactly which content blocks are invisible to AI crawlers. A score of 7 or higher passes, 4-6 is partial, and 0-3 is a fail.
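The comparison can be sketched as a simple word-set diff. This is a simplified illustration of the scoring logic described above, not the analyzer's actual implementation; `rawText` and `renderedText` are assumed to be the visible text already extracted from each version of the page:

```typescript
// Words longer than 3 characters, lowercased for comparison
function words(text: string): Set<string> {
  return new Set(
    text.toLowerCase().split(/\W+/).filter(w => w.length > 3)
  );
}

// Percentage of rendered words that never appear in the raw HTML text
function jsDependencyPercent(rawText: string, renderedText: string): number {
  const raw = words(rawText);
  const rendered = words(renderedText);
  if (rendered.size === 0) return 0;
  let missing = 0;
  for (const w of rendered) {
    if (!raw.has(w)) missing++;
  }
  return (missing / rendered.size) * 100;
}

// Map the difference percentage to the score bands described above
function scoreFromPercent(pct: number): number {
  if (pct < 10) return 10;
  if (pct < 20) return 8;
  if (pct < 30) return 6;
  if (pct < 50) return 4;
  return 0; // mostly JavaScript-dependent
}
```

For example, a page whose rendered text contains 100 distinct words, 40 of which never appear in the raw HTML source, falls in the under-50% band and scores 4.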

How to Improve

  • 1

    Switch to server-side rendering for content-heavy pages

    If you use React, Vue, or Angular, adopt SSR (server-side rendering) or SSG (static site generation) for pages that need AI visibility. Next.js, Nuxt, and Angular Universal all support this. SSR ensures your content is in the HTML before any JavaScript executes, making it immediately visible to AI crawlers.

  • 2

    Pre-render critical content in the initial HTML payload

    Even without full SSR, ensure headlines, body text, and key data are in the initial HTML. Lazy-load interactive widgets and non-essential UI elements, but never lazy-load the primary content. The first HTML response should contain everything an AI crawler needs to understand the page.

  • 3

    Audit your JS dependency with a raw HTML test

    View your page's source (not the inspector -- the actual page source). If the main content area is empty or contains only JavaScript bundle references, AI crawlers see the same emptiness. Compare the source to the rendered page and identify which content blocks are missing.
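This audit can also be scripted: fetch the page and check whether key phrases from the rendered page appear in the raw response. A minimal sketch, assuming Node 18+ for the global `fetch`; the URL and phrases below are placeholders for your own:

```typescript
// Return the phrases that do NOT appear in the raw HTML string.
// Anything returned here only exists after JavaScript runs.
function missingFromRawHtml(rawHtml: string, phrases: string[]): string[] {
  const haystack = rawHtml.toLowerCase();
  return phrases.filter(p => !haystack.includes(p.toLowerCase()));
}

// Usage sketch (placeholder URL and phrases):
// const res = await fetch('https://example.com/product/123');
// const rawHtml = await res.text();
// console.log(missingFromRawHtml(rawHtml, [
//   'Wireless Noise-Canceling Headphones',
//   '30-hour battery',
// ]));
```

An empty result means every phrase you checked is in the raw HTML; any phrase that comes back is a content block AI crawlers cannot see.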

  • 4

    Move API-fetched content to server-side data loading

    If your page fetches content from an API on the client side (useEffect, fetch on mount), move that data fetching to the server. In Next.js, use server components or getServerSideProps. The data should arrive as HTML, not as a client-side API call that AI crawlers will never execute.

  • 5

    Test with JavaScript disabled in your browser

    Disable JavaScript in your browser settings and load your page. What you see is what AI crawlers see. If the page is blank or missing core content, you have a JavaScript rendering dependency that needs fixing.
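A rough programmatic version of the same check strips script and style blocks from the raw HTML and looks at what text remains. This is a sketch only -- a real browser with JavaScript disabled (or a proper HTML parser) is more accurate:

```typescript
// Approximate what a no-JS crawler sees: drop script/style blocks,
// strip the remaining tags, and keep the visible text.
function noJsText(rawHtml: string): string {
  return rawHtml
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<style[\s\S]*?<\/style>/gi, '')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}
```

If `noJsText` of your page source comes back empty or contains only boilerplate, AI crawlers see essentially nothing.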

Before & After

Before
<!-- Raw HTML source -->
<div id="root"></div>
<script src="/bundle.js"></script>
<!-- All product info loaded via JS: 85% content difference -->
<!-- Score: 1 -->
After
<!-- Server-rendered HTML -->
<div id="root">
  <h1>Wireless Noise-Canceling Headphones</h1>
  <p>Premium audio with 30-hour battery life...</p>
  <ul>
    <li>Active noise cancellation</li>
    <li>Bluetooth 5.3</li>
    <li>30-hour battery</li>
  </ul>
</div>
<script src="/bundle.js"></script>
<!-- Content in HTML, JS hydrates for interactivity: 5% difference -->
<!-- Score: 10 -->

Code Examples

Next.js server component (content visible to AI crawlers)

// app/products/[id]/page.tsx -- Server Component (default in Next.js App Router)
export default async function ProductPage({ params }: { params: { id: string } }) {
  const product = await getProduct(params.id); // Fetched on server

  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
      <ul>
        {product.features.map(f => <li key={f}>{f}</li>)}
      </ul>
    </main>
  );
}

Client-only rendering (invisible to AI crawlers)

// Avoid this pattern for content pages
'use client';
import { useEffect, useState } from 'react';

export default function ProductPage({ params }) {
  const [product, setProduct] = useState(null);

  useEffect(() => {
    fetch(`/api/products/${params.id}`)
      .then(res => res.json())
      .then(setProduct);
  }, [params.id]); // refetch when the route param changes

  if (!product) return <div>Loading...</div>;
  return <h1>{product.name}</h1>;
}

Frequently Asked Questions

Do AI crawlers execute JavaScript at all?

Most do not. GPTBot, ClaudeBot, and PerplexityBot primarily read raw HTML. Googlebot has limited JavaScript rendering capability, but it is slower and less reliable than raw HTML parsing. For consistent AI visibility across all engines, your content must be in the initial HTML response.

My site uses React -- am I automatically penalized?

Not if you use server-side rendering (SSR) or static site generation (SSG). The issue is client-side-only rendering where content is absent from the initial HTML. Next.js with server components, Gatsby with static generation, or Remix with SSR all produce HTML that AI crawlers can read. The framework is not the problem -- the rendering strategy is.

What percentage of JS-rendered content is acceptable?

Under 10% difference earns a perfect score of 10. Under 20% is still good at 8 out of 10. Above 30% starts to seriously impact your AI visibility. The goal is to keep all primary content in the server-rendered HTML and limit JavaScript-only content to interactive UI elements like modals, tooltips, and animations.

Check Your GEO Score

Run a free analysis on your website and see how you score across all 52 factors.

Analyze My Site