
JavaScript Pages Not Indexed: Fix SPA and Framework Rendering for Google

Your JavaScript application looks perfect in the browser but Google sees an empty page. Understand exactly how Google's two-wave indexing process handles JavaScript content and what you need to change.

Updated: Apr 1, 2026

React, Vue, Angular, Next.js, and other JavaScript frameworks power millions of websites, but there is a fundamental tension between client-side rendering and how Google indexes pages.

Google processes pages in two waves. The first wave parses raw HTML immediately. The second wave, which may happen hours or days later, renders JavaScript. If your HTML is an empty div with a script tag, Google's first wave sees no content. The rendering queue may take days, and failures from API timeouts or JS errors can result in permanent thin-content classification.

This guide covers diagnostics for rendering-based indexing failures and practical solutions for every major framework.

IndexBolt gets your URLs crawled by Google in under 24 hours — no manual submissions, no waiting weeks.

How Google's Two-Wave Indexing Actually Works

Understanding Google's rendering pipeline is essential for diagnosing and fixing JavaScript indexing problems. The process works in distinct stages, and problems at any stage can prevent your content from being indexed.

In the first stage, Googlebot sends an HTTP request for your URL and receives the HTML response. This is identical to what you would see by viewing the page source (Ctrl+U) in your browser. Googlebot parses this HTML to extract text content, links, meta tags, canonical tags, and structured data. Any content present in this initial HTML response is immediately available for indexing. This stage is fast and happens during the normal crawl process.

In the second stage, the URL is placed in a rendering queue. Google operates a Web Rendering Service (WRS) that uses a headless Chromium browser to execute JavaScript and produce the final rendered DOM. The WRS loads your page, executes all scripts, waits for dynamic content to load, and then captures the final page state. This rendered DOM is compared against the initial HTML. Any new content discovered in the rendered DOM is added to Google's index.

The critical insight is that the rendering queue is not real-time. Google has acknowledged that the rendering queue can have significant delays because executing JavaScript is resource-intensive. While Google has improved rendering speed substantially over the years, the queue can still introduce delays of several hours to several days, especially for pages from lower-authority domains or sites that Google does not prioritize for frequent crawling.

During the waiting period between the first wave and the second wave, your page may be evaluated based solely on its initial HTML content. If that initial HTML contains just a root div, a loading animation, and a JavaScript bundle reference, Google evaluates the page as having no meaningful content. This evaluation can result in the page being deprioritized for rendering or classified as thin content before the second wave even has a chance to process it.

The rendering queue is also subject to failures. If your JavaScript throws an error during rendering, if an API call times out, if the page requires authentication, or if a third-party script blocks execution, the WRS may produce an incomplete or empty render. Google does not retry failed renders immediately, which means a transient error during rendering can prevent indexing for an extended period.

Google's WRS runs a modern version of Chromium that supports ES6+, async/await, fetch, Web Components, and most modern JavaScript APIs. However, it does not support all browser APIs. Notably, it does not have access to localStorage, sessionStorage, or IndexedDB in a meaningful way. It does not support user interaction events (scrolling, clicking, hovering), which means content loaded by scroll events, click-to-expand elements, or hover-triggered popups is invisible to Google. The WRS also has a timeout for JavaScript execution. If your page takes longer than about five seconds to render its critical content, the WRS may capture an incomplete state.
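Because slow requests can push a page past the WRS execution budget, critical data fetches can be wrapped so they fall back quickly instead of stalling the render. A minimal sketch — the timeout value and fallback shape are illustrative assumptions, not documented Google limits:

```javascript
// Race a fetch against a timer so a slow API cannot stall rendering
// past the (approximate) execution budget. Timeout and fallback values
// here are illustrative assumptions.
function withTimeout(promise, ms, fallback) {
  const timer = new Promise((resolve) =>
    setTimeout(() => resolve(fallback), ms)
  );
  return Promise.race([promise, timer]);
}

// Usage: render whatever arrives within 3 seconds, else a safe fallback.
// const data = await withTimeout(
//   fetch("/api/content").then((r) => r.json()),
//   3000,
//   { items: [] }
// );
```

The fallback should still contain enough static content for the page to be evaluated as meaningful if the API never responds in time.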

Client-Side Rendering vs SSR vs SSG: The Indexing Impact

The rendering strategy you choose for your JavaScript application has a direct and measurable impact on indexing outcomes. Understanding the tradeoffs between client-side rendering (CSR), server-side rendering (SSR), and static site generation (SSG) is essential for making informed architectural decisions.

Client-side rendering is the default for single-page applications built with Create React App, Vue CLI, or Angular CLI. The server sends a minimal HTML document containing a root element and script tags. All content rendering happens in the browser after JavaScript loads and executes. From Google's perspective, CSR pages are empty until the rendering queue processes them, which creates the delays and failure risks described above. CSR is the highest-risk rendering strategy for SEO and indexing.

Server-side rendering generates the full HTML on the server for every request. When Googlebot requests a page, the server runs the JavaScript framework, produces the complete HTML, and sends it in the response. Google's first wave immediately sees all content without needing to wait for the rendering queue. After the HTML loads in a real user's browser, JavaScript hydrates the page to make it interactive. SSR provides the best of both worlds: immediate content availability for Google and full interactivity for users. Frameworks like Next.js, Nuxt, and SvelteKit provide SSR capabilities built-in.

Static site generation pre-renders pages at build time rather than on each request. The result is a collection of static HTML files that are served directly from a CDN. Google receives complete HTML instantly, and there are no server-side rendering delays or failures. SSG is ideal for content that does not change with every request: blog posts, documentation, marketing pages, and product catalogs that update periodically rather than in real-time. The limitation is that SSG requires a rebuild to update content, which makes it impractical for highly dynamic pages like dashboards, search results, or real-time inventory displays.

Incremental Static Regeneration (ISR), available in Next.js and similar frameworks, combines SSG with periodic re-rendering. Pages are pre-rendered at build time but can be regenerated at defined intervals or on-demand when content changes. This provides the indexing reliability of SSG with the content freshness of SSR. For ecommerce sites with thousands of product pages that change periodically, ISR is often the optimal choice.
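Framework specifics aside, the ISR model can be sketched as a stale-while-revalidate cache: serve the cached page immediately and regenerate it in the background once it is older than the revalidation interval. A minimal in-memory sketch, where `renderPage` is a hypothetical renderer standing in for your build step:

```javascript
// Minimal stale-while-revalidate cache illustrating the ISR model.
// Entries older than revalidateMs are served stale once while a fresh
// render is triggered in the background. (A real implementation would
// also deduplicate concurrent cold renders.)
function createIsrCache(renderPage, revalidateMs) {
  const cache = new Map(); // path -> { html, renderedAt }
  return async function get(path) {
    const entry = cache.get(path);
    const now = Date.now();
    if (!entry) {
      // First request: render synchronously (a cold "build").
      const html = await renderPage(path);
      cache.set(path, { html, renderedAt: now });
      return html;
    }
    if (now - entry.renderedAt > revalidateMs) {
      // Serve the stale copy, regenerate in the background.
      renderPage(path).then((html) =>
        cache.set(path, { html, renderedAt: Date.now() })
      );
    }
    return entry.html;
  };
}
```

In Next.js itself this is just the `revalidate` option; the sketch only shows why ISR keeps responses instant while content still refreshes.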

The practical recommendation is clear: avoid pure client-side rendering for any page you want Google to index. Use SSR for dynamic pages that change per request and SSG or ISR for content that changes periodically. If migrating from CSR to SSR is not feasible in the short term, implement dynamic rendering as a bridge solution.

Skip the manual work — IndexBolt submits URLs directly to Google's crawl queue. Start with 100 free credits.

100 free credits. No credit card required.

Diagnosing JavaScript Rendering Failures

Identifying whether JavaScript rendering is causing your indexing problems requires systematic testing with tools that show you exactly what Google sees at each stage of the indexing pipeline.

The primary diagnostic tool is Google Search Console's URL Inspection feature. Enter the URL of an unindexed JavaScript-rendered page and click "Test Live URL." The tool performs a live crawl and render of the page, simulating Google's actual processing pipeline. Review two outputs: the rendered page screenshot (which shows what Google sees after JavaScript execution) and the tested page details (which show the raw HTML and any resource loading errors). Compare the screenshot to what you see in your browser. If the screenshot shows incomplete content, missing sections, or a blank page, Google is failing to render your JavaScript correctly.

The second diagnostic step is examining the raw HTML source. Open your page URL in a browser and press Ctrl+U to view the unrendered source. This is what Google sees in the first indexing wave. If the source shows meaningful content (titles, headings, text paragraphs, links), your content is available immediately. If it shows only a div with an ID like "root" or "app" and script tags, your content is entirely dependent on JavaScript rendering.
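This check can be automated across many URLs. A rough heuristic sketch — the regex stripping and the 50-character threshold are assumptions, and a real audit should parse the DOM properly:

```javascript
// Rough heuristic: does this HTML look like an empty client-side shell?
// Strips script/style blocks and tags from the body, then checks whether
// any meaningful text remains. Threshold is an arbitrary assumption.
function looksLikeEmptyShell(html) {
  const bodyMatch = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  const body = bodyMatch ? bodyMatch[1] : html;
  const text = body
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .trim();
  return text.length < 50;
}
```

Run this against the raw HTTP response (not the rendered DOM) to approximate what Google's first wave evaluates.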

Third, check for JavaScript errors that might prevent rendering. In your browser's developer tools, open the Console tab and reload the page. Note any red error messages. These same errors will occur in Google's rendering environment and may prevent content from loading. Common culprits include API calls to servers that reject requests without proper authentication headers (Google's renderer does not send cookies or authentication tokens), CORS errors on third-party resources, and references to browser-specific APIs that do not exist in headless Chromium.

Fourth, test rendering speed. Google's WRS has a timeout for JavaScript execution. In your browser's developer tools, open the Performance tab and record a page load. If your critical content takes more than three to five seconds to appear after the initial HTML loads, Google may time out before the content renders. Slow API responses, large JavaScript bundles, and complex rendering logic can push your page past the rendering timeout.

Fifth, check for content behind interaction gates. Some JavaScript applications load content only in response to user interactions: clicking a "Load More" button, scrolling past a threshold, or selecting a tab. Google's renderer does not perform these interactions. Content hidden behind interaction requirements will never be visible to Google. Ensure all content you want indexed is visible on the initial page render without any user interaction required.

Framework-Specific Fixes for Common Indexing Blockers

Each JavaScript framework has its own set of common patterns that cause indexing problems. Knowing the framework-specific pitfalls and fixes can save hours of debugging.

For React applications built with Create React App (CRA), the entire application is client-side rendered by default. There is no built-in SSR capability. The recommended migration path is to move to Next.js, which provides SSR and SSG out of the box while using the same React component model. If migrating to Next.js is not feasible, implement dynamic rendering using a pre-rendering service that generates static HTML snapshots for Google's crawler.

For Next.js applications, most indexing problems stem from pages that use only client-side data fetching (useEffect + fetch) instead of server-side data fetching (getServerSideProps or getStaticProps, or the newer App Router server components). If your page's content is loaded inside a useEffect hook, it will not appear in the server-rendered HTML. Move data fetching to the server layer. In the App Router, use Server Components by default and only add "use client" to components that genuinely need interactivity. In the Pages Router, use getServerSideProps for dynamic data and getStaticProps for static data.
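The Pages Router move from client-side to server-side fetching can be sketched as follows, where `fetchProduct` is a hypothetical data helper standing in for your real API call:

```javascript
// Hypothetical data helper standing in for your real API call.
async function fetchProduct(id) {
  return { id, name: "Example product" };
}

// Pages Router: getServerSideProps runs on the server for every request,
// so the returned props are already rendered into the HTML Google receives.
// Fetching the same data inside useEffect instead would leave the
// first-wave HTML empty.
async function getServerSideProps({ params }) {
  const product = await fetchProduct(params.id);
  return { props: { product } };
}
```

In the App Router the equivalent is simply awaiting the data inside an async Server Component; the principle — fetch before the HTML is sent — is the same.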

For Vue applications using Nuxt, similar principles apply. Use useFetch or useAsyncData in the server context (asyncData in Nuxt 2) rather than client-side-only fetch calls. Nuxt's default SSR mode renders content on the server, but plugins and components that run only on the client side can create holes in the rendered output. Use the ClientOnly wrapper component explicitly for client-side-only content and ensure it is not used around critical indexable content.

For Angular applications, Angular Universal provides server-side rendering. Implement Angular Universal and ensure your main content components are rendered on the server. Watch for components that reference browser-specific objects like window, document, or navigator directly, as these will throw errors during server-side rendering and may prevent the page from rendering at all.
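The same guard pattern applies in any framework: check for the browser environment before touching browser-only globals. A minimal sketch:

```javascript
// Guard browser-only APIs so server-side rendering does not throw.
function isBrowser() {
  return typeof window !== "undefined" && typeof document !== "undefined";
}

// Example: read a stored preference with a safe default on the server,
// where window and localStorage do not exist.
function getStoredTheme(fallback = "light") {
  if (!isBrowser()) return fallback; // SSR path
  try {
    return window.localStorage.getItem("theme") ?? fallback;
  } catch {
    return fallback; // storage may be blocked even in real browsers
  }
}
```

In Angular specifically, the idiomatic equivalent is `isPlatformBrowser` from `@angular/common` combined with the injected `PLATFORM_ID`.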

For all frameworks, pay special attention to lazy loading and code splitting. Dynamic imports that load components on demand (React.lazy, dynamic imports in Vue and Angular) can prevent content from being present in the initial server render. Critical above-the-fold content should never be lazy loaded. Move key content components into the main bundle to ensure they render in the initial HTML response.

Web Components present their own challenges. While Google can render standard Web Components, complex Shadow DOM structures and deeply nested custom elements may not render reliably in the WRS. Test Web Component rendering explicitly using the URL Inspection tool and consider flattening critical content out of Shadow DOM for better indexing reliability.

Dynamic Rendering as a Bridge Solution

Dynamic rendering is a technique where your server detects whether the incoming request is from a search engine crawler or a regular user and serves different content accordingly. Crawler requests receive a fully pre-rendered static HTML version of the page, while user requests receive the normal JavaScript application. Google has explicitly endorsed dynamic rendering as an acceptable approach for sites that cannot implement full SSR.

The setup involves three components. First, a user-agent detection mechanism on your server or CDN that identifies requests from Googlebot and other search engine crawlers. Googlebot identifies itself with a user-agent string containing "Googlebot," which is straightforward to match. Second, a pre-rendering service (like Puppeteer, Rendertron, or a commercial service like Prerender.io) that generates and caches static HTML snapshots of your JavaScript pages. Third, routing logic that serves the pre-rendered HTML to detected crawler requests and the normal JavaScript application to all other requests.
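The detection step can be a simple user-agent match. A minimal sketch — the crawler list is illustrative, not exhaustive, and production setups should additionally verify Googlebot by reverse DNS lookup to avoid spoofing:

```javascript
// Match common search engine crawler user-agents. Illustrative list only;
// extend it for the crawlers you care about.
const CRAWLER_PATTERN =
  /googlebot|bingbot|duckduckbot|baiduspider|yandexbot|applebot/i;

function isCrawler(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || "");
}
```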

Dynamic rendering is explicitly not cloaking, which is against Google's guidelines. Cloaking involves showing completely different content to users and crawlers to manipulate rankings. Dynamic rendering shows the same content in a different technical format. The pre-rendered HTML snapshot should contain exactly the same text, images, links, and structured data as the JavaScript-rendered version. Google has repeatedly confirmed this distinction.

Implementation approaches vary by infrastructure. For Node.js servers, you can use middleware that checks the user-agent header and routes Googlebot requests through Puppeteer or Rendertron. For sites behind a CDN like Cloudflare, you can use edge workers to intercept Googlebot requests and serve cached pre-rendered pages. For static hosting with an API backend, a pre-rendering service can generate HTML snapshots during your build or deployment process.
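For a Node.js server, the routing logic can be a small middleware. A sketch assuming Express-style `(req, res, next)` handlers; `getPrerenderedHtml` is a hypothetical async snapshot lookup backed by your pre-rendering service:

```javascript
// Express-style middleware: serve a cached pre-rendered snapshot to
// crawlers, fall through to the normal SPA for everyone else.
// getPrerenderedHtml(url) is a hypothetical async cache/service lookup.
function dynamicRendering(getPrerenderedHtml) {
  const crawlerPattern = /googlebot|bingbot|duckduckbot/i;
  return async function middleware(req, res, next) {
    const ua = req.headers["user-agent"] || "";
    if (!crawlerPattern.test(ua)) return next();
    const html = await getPrerenderedHtml(req.url);
    if (!html) return next(); // no snapshot yet: serve the normal app
    res.setHeader("Content-Type", "text/html");
    res.end(html);
  };
}
```

Note the fallthrough when no snapshot exists: serving the normal JavaScript app is safer than serving an error to Googlebot.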

The caching strategy for pre-rendered pages matters. Pre-rendered snapshots become stale as your content changes. For static content like blog posts or documentation, snapshots can be cached for days or weeks. For dynamic content like product pages with changing prices and availability, snapshots should be regenerated at least daily. For real-time content like stock tickers or live scores, dynamic rendering may not be appropriate because the cached snapshot will always be outdated.

Dynamic rendering should be viewed as a transitional solution, not a permanent architecture. The long-term best practice is to implement native server-side rendering so that all users and crawlers receive the same response. Dynamic rendering adds complexity, maintenance burden, and a potential point of failure (if the pre-rendering service goes down, Googlebot sees broken pages). Use it as a bridge while migrating to SSR or SSG.

Step-by-Step Guide

1

Determine Your Current Rendering Strategy

Open one of your unindexed pages and view the HTML source (Ctrl+U, not the developer tools inspector). Look at the body content. If you see a single div with an ID like "root" or "app" and one or more script tags, your page uses pure client-side rendering. If you see full HTML content including headings, paragraphs, and structured data, your page is at least partially server-rendered. Document which rendering strategy each section of your site uses. Many modern applications use a mix: server-rendered navigation and layout with client-rendered content areas.

2

Test Pages with Google's URL Inspection Tool

In Google Search Console, enter the URL of an unindexed page and click "Test Live URL." Wait for the test to complete and then review the results. Examine the rendered page screenshot to see if your content appears. Check the "More Info" section for any resource loading errors, JavaScript errors, or blocked resources. If the screenshot shows your full page content but the page is still not indexed, the rendering is working but there may be other issues (thin content, noindex tags, canonical conflicts). If the screenshot shows missing content or a blank page, rendering failures are blocking indexing. Test five to ten pages across different sections of your site to identify patterns.

3

Identify and Fix JavaScript Rendering Errors

In the URL Inspection tool results, check for blocked resources and page resource errors. Common problems include API calls that fail because Google's renderer does not send authentication cookies, CORS-blocked cross-origin resources, references to browser APIs not available in Google's headless renderer (window.localStorage, navigator.geolocation), and third-party scripts that block rendering. For each error, determine if it affects critical content. Fix API authentication by providing public endpoints for content data. Replace browser-specific API calls with server-side alternatives or provide fallback behavior when the APIs are unavailable.

4

Implement Server-Side Rendering or Static Generation

Based on your framework, implement the appropriate server rendering strategy. For React CRA projects, migrate to Next.js with its App Router and Server Components. For Vue CLI projects, migrate to Nuxt with its built-in SSR. For Angular projects, implement Angular Universal. For existing Next.js or Nuxt projects that use client-side data fetching, move data fetching to server-side functions (getServerSideProps, getStaticProps, server components, or asyncData). After implementing SSR or SSG, verify by viewing the page source and confirming that content appears in the raw HTML without JavaScript execution.

5

Implement Dynamic Rendering as a Short-Term Fix If Needed

If migrating to SSR is not immediately feasible, implement dynamic rendering as a bridge solution. Set up a pre-rendering service (Puppeteer, Rendertron, or Prerender.io). Configure your server or CDN to detect Googlebot requests by user-agent string and route them to the pre-rendered HTML. Test by using curl with a Googlebot user-agent string to verify that the pre-rendered HTML is served correctly. Compare the pre-rendered HTML against the normal page to ensure content parity. Set up automated pre-render cache refreshes to keep snapshots current with your content.
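If you prefer scripting the parity check in Node (18+) rather than curl, a sketch like the following fetches the page once as Googlebot and once as a normal browser; `fetchImpl` is injectable so the logic can be exercised offline:

```javascript
// Fetch a URL twice -- once as Googlebot, once as a normal browser --
// and report whether the responses differ. A large difference confirms
// the dynamic-rendering split is active (or reveals a misconfiguration).
const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function compareRenders(url, fetchImpl = fetch) {
  const [asBot, asUser] = await Promise.all([
    fetchImpl(url, { headers: { "User-Agent": GOOGLEBOT_UA } }),
    fetchImpl(url, { headers: { "User-Agent": "Mozilla/5.0 Chrome/120" } }),
  ]);
  const [botHtml, userHtml] = await Promise.all([
    asBot.text(),
    asUser.text(),
  ]);
  return {
    botLength: botHtml.length,
    userLength: userHtml.length,
    differs: botHtml !== userHtml,
  };
}
```

Remember that the goal is content parity: the two responses may differ in markup, but the text, links, and structured data must match.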

6

Verify Fixes and Submit for Indexing

After implementing SSR, SSG, or dynamic rendering, re-test your pages with the URL Inspection tool. Verify that the rendered screenshot shows complete content and that no resource errors are reported. View the page source to confirm content is in the initial HTML. Once verified, use the URL Inspection tool to request indexing for your highest-priority pages. For sites with many pages affected by JavaScript rendering issues, use IndexBolt to submit all fixed URLs in bulk for rapid indexing. Monitor the Pages report over the following two weeks to track the indexing recovery.

7

Set Up Ongoing Monitoring for Rendering Regressions

JavaScript rendering issues can reappear after code deployments, dependency updates, or API changes. Set up automated monitoring to catch regressions early. Use Lighthouse CI or a similar tool in your CI/CD pipeline to test server-rendered HTML output for key pages. Configure alerts for rendering failures in your pre-rendering service. Schedule weekly checks of Google Search Console's Pages report to catch any new spikes in "Crawled - currently not indexed" pages that could indicate a rendering regression. Document your rendering architecture and SSR requirements so that new team members do not accidentally introduce client-side-only content patterns.

Done with the manual steps? Speed things up.

IndexBolt submits your URLs directly to Google — most get crawled in under 24 hours.

Common Issues & How to Fix Them

React app shows a blank page or loading spinner in URL Inspection tool

Cause: The application uses pure client-side rendering (Create React App or similar) where the server sends an empty HTML shell and JavaScript builds the entire page in the browser. Google's first indexing wave sees no content, and the rendering queue may fail to process the page due to API errors, long loading times, or resource-intensive rendering.

Fix: Migrate to Next.js for built-in server-side rendering. In the interim, implement dynamic rendering with Prerender.io or a self-hosted Rendertron instance. Ensure API endpoints used by the application are publicly accessible without authentication so Google's renderer can fetch data. If migration is not possible immediately, at minimum add critical meta tags (title, description, canonical) and key heading text to the static HTML template that the server sends before JavaScript loads.

Next.js pages indexed but showing outdated content in Google results

Cause: Pages using Static Site Generation (getStaticProps) are pre-rendered at build time, and Google has cached the build-time version. If your content changes between builds, the indexed version becomes stale. This is not a rendering failure but a freshness problem. Pages using Incremental Static Regeneration may also show stale content if the revalidation interval is longer than Google's crawl frequency.

Fix: For pages with frequently changing content, switch from SSG to SSR (getServerSideProps or dynamic App Router pages). For pages using ISR, reduce the revalidation interval to match your content update frequency. After a major content update, use IndexBolt or Search Console to request re-crawling of updated pages so Google picks up the latest version. Consider implementing on-demand ISR that triggers page regeneration when content is updated rather than relying on time-based revalidation.

Single-page application with hash-based routing not getting inner pages indexed

Cause: Hash-based routing (URLs like yourdomain.com/#/about, yourdomain.com/#/products) is invisible to Google. Google treats everything after the # as a fragment identifier and does not send it to the server. All hash-based URLs resolve to the same page from Google's perspective. Google's WRS does not navigate between hash routes, so only the initial page content is rendered.
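The fragment behavior is easy to verify with the standard URL API: everything after the # never reaches the server, so all hash routes collapse into one server-side URL:

```javascript
// Hash routes all resolve to the same server path; the fragment is
// client-side only and is never sent in the HTTP request.
const about = new URL("https://example.com/#/about");
const products = new URL("https://example.com/#/products");

about.pathname; // "/"
products.pathname; // "/" -- identical from the server's perspective
about.hash; // "#/about" -- visible only to client-side JavaScript
```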

Fix: Migrate from hash-based routing to history-based routing (clean URLs like yourdomain.com/about, yourdomain.com/products). In React Router, switch from HashRouter to BrowserRouter. In Vue Router, use createWebHistory instead of createWebHashHistory (in Vue Router 3, set the mode to "history"). Configure your server to handle all route paths and return the application's HTML for any URL path. After migration, submit the new clean URLs through Google Search Console or IndexBolt and monitor indexing of each route.

Lazy-loaded components and images not appearing in Google's render

Cause: Content loaded via React.lazy(), dynamic imports, or intersection observer-based lazy loading may not render in Google's WRS if it depends on scroll position or viewport intersection events. Google's renderer loads the page but does not scroll or resize the viewport, so content triggered by scroll events remains unloaded.

Fix: Never lazy-load critical above-the-fold content. For below-the-fold content that you want indexed, use native lazy loading (loading="lazy" attribute on images) which Google supports, rather than JavaScript-based intersection observer lazy loading which requires scroll events. For dynamically imported components that contain indexable content, load them eagerly on the server side and lazy-load only on the client side for performance. Include a noscript fallback with critical content for maximum rendering resilience.

API-driven content pages failing to render because API requires authentication

Cause: Many JavaScript applications fetch content from APIs that require authentication tokens, API keys in headers, or session cookies. Google's WRS does not have access to your authentication system, cannot log in, and does not send authentication cookies. API calls that return 401 or 403 errors during Google's rendering result in empty content areas.

Fix: For content that should be publicly accessible, create public API endpoints that do not require authentication. Serve the page content that Google needs to see from public endpoints while keeping user-specific data (account details, personalization) behind authenticated endpoints. Implement server-side data fetching where the server authenticates with the API and includes the content in the HTML response before sending it to the browser. This eliminates the need for client-side API authentication during rendering.
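The server-side pattern can be sketched as: the server holds the API credential, fetches the content, and bakes it into the HTML before responding, so Google's renderer never needs to authenticate. The API shape, token source, and `fetchImpl` injection below are illustrative assumptions:

```javascript
// Server-side data fetching: the server authenticates with the content
// API and embeds the result in the HTML response. API URL, token source,
// and response shape are assumptions for illustration.
async function renderArticlePage(articleId, apiToken, fetchImpl = fetch) {
  const res = await fetchImpl(
    `https://api.example.com/articles/${articleId}`,
    { headers: { Authorization: `Bearer ${apiToken}` } }
  );
  const article = await res.json();
  // Content lands in the initial HTML -- indexable in the first wave.
  // (Real code must HTML-escape these values before interpolating.)
  return `<!doctype html><html><body><article><h1>${article.title}</h1><p>${article.body}</p></article></body></html>`;
}
```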

Pro Tips

Add essential content (title, heading, first paragraph) directly to the HTML template as a fallback for first-wave indexing.
Use the Rich Results Test for a quick rendered view of your page using Google's own rendering engine (the standalone Mobile-Friendly Test has been retired).
Keep JS bundles small; Google's WRS allows only a few seconds of rendering time (commonly estimated at around five), so use code splitting aggressively.
Disable JavaScript in Chrome DevTools to see exactly what Google's first indexing wave evaluates.
Place JSON-LD structured data in server-rendered HTML, not injected via JavaScript, for immediate processing.
Ensure service workers do not cache blank shell pages and serve them to Googlebot instead of real content.

JavaScript rendering delays mean your pages can wait days in Google's rendering queue before being indexed. IndexBolt bypasses the wait by submitting your URLs directly to Google's indexing pipeline. Whether you use React, Vue, Angular, or Next.js, get your JavaScript-rendered pages indexed immediately instead of hoping Google's renderer processes them correctly.

100 free credits. No credit card required. See results in under 24 hours.

Frequently Asked Questions

Can Google actually render JavaScript pages reliably?

Google's WRS uses modern Chromium and renders most JS content, but it is not instantaneous (queue delays), not 100% reliable (API failures cause incomplete renders), and not universal (no interaction APIs). For important pages, relying entirely on JS rendering is risky. SSR eliminates that risk by ensuring content is always in the initial HTML.

Is server-side rendering necessary for SEO, or is it just recommended?

Not technically necessary. Google can index CSR content. But SSR provides dramatically better reliability, speed, and coverage. CSR pages face rendering queue delays, failure risks, and crawl deprioritization. SSR content is evaluated immediately in the first wave. For any page where organic traffic matters, SSR is strongly recommended. CSR is fine for authenticated dashboards and admin panels.

How do I know if Google is successfully rendering my JavaScript pages?

Use Google Search Console's URL Inspection tool to see exactly what Google renders. Enter a URL, click "Test Live URL," and compare the rendered screenshot with what you see in your browser. If the content matches, rendering is working. Also check the "Crawled page" section for the HTML that Google received and the "More info" section for any resource loading errors. For ongoing monitoring, track the ratio of submitted pages (in your sitemap) versus indexed pages in the Pages report. A large gap suggests rendering or quality issues are preventing indexing.

What version of Chrome does Googlebot use for rendering?

Google's Web Rendering Service runs an evergreen version of Chromium, meaning it stays up to date with the latest stable Chrome release. As of 2026, this means full support for modern JavaScript (ES2022+), CSS Grid, Flexbox, Web Components, dynamic imports, IntersectionObserver, and most modern web platform features. However, the rendering environment does not support user interaction APIs (no click, scroll, hover, or keyboard events), does not support payment or notification APIs, and has limited localStorage/sessionStorage support. Check Google's official documentation for the current list of supported and unsupported APIs.

Should I use dynamic rendering or switch to server-side rendering?

If you have the engineering resources to implement SSR, it is always the better long-term choice. SSR benefits all users (faster initial page loads, better accessibility, improved Core Web Vitals), not just search engine crawlers. Dynamic rendering is a valid short-term solution for teams that cannot migrate to SSR quickly, but it adds maintenance complexity (you must keep the pre-rendering service running and the cache fresh) and creates a system where users and crawlers see technically different responses. Use dynamic rendering as a bridge while planning and executing an SSR migration.

My Next.js app uses Server Components. Why are pages still not indexed?

Next.js Server Components are server-rendered by default, which should provide excellent indexing. Common issues include: importing a component with "use client" that wraps the main content (forcing it to render on the client), fetching data inside a client component instead of passing it as props from a server component, having a loading.tsx file that shows a skeleton instead of content during server rendering, or middleware that redirects Googlebot to a different page. Check your component tree and ensure the primary content-bearing components are Server Components (no "use client" directive) and that data fetching happens at the server level.

Ready to get your URLs indexed?

Start with 100 free credits. No credit card required.