Why public-facing React SPAs can disappear from search engines and how to fix it without a rewrite
We built a 198-page client-rendered React SPA with perfect Lighthouse scores and every SEO best practice we knew. Google indexed 20 pages in six months. This is the story of what went wrong, why HTML still matters in 2026, and the build-time prerendering fix that solved it.
We shipped 198 pages. Google indexed 20.
Not because the content was thin. Not because of a penalty. Not because we forgot a sitemap. Google simply could not see our website.
For six months we watched Google Search Console tell us the same thing. The main statuses were 147 pages stuck in "Discovered - currently not indexed," another 13 labeled "Crawled - currently not indexed," and just 20 pages actually appearing in search results.
We had structured data. We had canonical tags. We had react-helmet-async injecting unique titles and descriptions on every page. We had a sitemap, an RSS feed, internal linking, and Open Graph tags. We checked every box on every SEO checklist we could find.
None of it mattered. Here is what we missed, why it happened, and exactly how we fixed it.
This is not a case against React itself. Next.js is React. Astro can render React components too. The problem is public-facing, client-rendered React SPAs that send almost no meaningful HTML on the first request.
Open your React SPA in a browser. You see a polished website with animations, images, blog articles, service pages. Everything looks perfect.
Now right-click, View Page Source.
You see this:
<!doctype html>
<html lang="en">
<head>
<title>Your Full-Stack Digital Partner for Growth</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.tsx"></script>
</body>
</html>
An empty div and a JavaScript file. That is the entirety of what your server sends to every visitor, every crawler, every search engine, on every single URL.
When a regular browser loads this, JavaScript executes, React mounts, the router reads the URL, the correct component renders, and react-helmet-async swaps in the right title and meta tags. It all happens in milliseconds. You never notice.
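You can script the same View Page Source check. Here is a minimal sketch (the function name and the 50-character threshold are my own choices, not from our build) that flags raw HTML whose body contains almost no visible text:

```javascript
// Heuristic: does this raw HTML carry real content, or just an SPA shell?
// Strips scripts and tags from the <body> and measures what text remains.
function looksLikeEmptyShell(html) {
  const bodyMatch = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  const body = bodyMatch ? bodyMatch[1] : html;
  const visibleText = body
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop script blocks
    .replace(/<[^>]+>/g, " ")                    // drop remaining tags
    .replace(/\s+/g, " ")
    .trim();
  return visibleText.length < 50; // almost no text: crawlers see nothing
}
```

Fetch any route and run the response body through it. A client-rendered SPA returns true for every single URL, which is exactly the problem.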
Googlebot works differently.
Google crawls the web in two phases:
Phase 1: Crawl. Googlebot fetches the raw HTML. It reads the <title>, <meta> tags, headings, text content, and links. If the page is static HTML, this is enough. Google indexes it and moves on.
Phase 2: Render. If the page relies on JavaScript, Google adds it to a rendering queue. A separate service (the Web Rendering Service) eventually spins up a headless Chromium instance, executes the JavaScript, and extracts the rendered DOM. Google's own JavaScript SEO basics and Page indexing report documentation both point to this crawl-then-render split once you know where to look.
That rendering queue is the problem. Google processes billions of pages. JavaScript rendering is expensive. Your 198-page agency site is competing for rendering resources with every other JavaScript-heavy site on the internet.
The result for a React SPA:
Every URL, from the homepage to /blog/your-article, returns the same <div id="root"></div> shell. This is why our Search Console showed "Discovered - currently not indexed" for 147 pages: Google found the URLs in our sitemap, looked at the HTML, saw an empty shell, and put them in a queue it was in no hurry to process.
JavaScript frameworks exist because developers need them. Single-page applications deliver smooth transitions, state management, and fast in-app navigation. These are real benefits for the user experience.
But search engines, social media scrapers, many AI crawlers, and accessibility tools all process HTML first. Many never execute JavaScript at all.
When you share a link on LinkedIn, Slack, or iMessage, the preview card is generated from the raw HTML. If your og:title and og:description are injected by JavaScript, those services will show either nothing or your default fallback meta tags. The same link shared from a static HTML page shows the correct title, description, and image every time.
Accessibility readers process the initial HTML. RSS readers process the initial HTML. Prerender services process the initial HTML. Google's initial crawl processes the initial HTML.
The first 50 milliseconds of a page load — before any JavaScript runs — determine how the entire internet sees your site. If that HTML is empty, your site is invisible to most of the systems that drive traffic and discoverability.
This is not an argument against React or JavaScript frameworks. It is an argument for making sure your HTML is complete before JavaScript enhances it. That same assumption sits underneath our local business website development guide and our technical SEO optimization guide: crawlers have to receive meaningful HTML before any of the rest of your SEO work can matter.
The solution was to generate static HTML for every route at build time. After vite build produces the JavaScript bundle, a post-build script:
- launches a headless browser and visits every route
- captures the fully rendered HTML and writes it into the dist/ folder
- overwrites the empty shell in dist/ so the hosting platform serves real content

The build pipeline becomes:

vite build → dist/ (empty SPA shell)
prerender → dist/ (198 static HTML pages with full content)
deploy → each URL serves complete HTML + JS for interactivity
The key details that made this work:
Monkey-patching IntersectionObserver during prerendering. The prerender script replaces the browser's IntersectionObserver with a version that immediately reports every element as visible. This forces DeferredSection to render all children without scrolling.
await page.evaluateOnNewDocument(() => {
window.__PRERENDER = true;
window.IntersectionObserver = class {
constructor(callback) { this._cb = callback; }
observe(el) {
this._cb([{ isIntersecting: true, target: el }], this);
}
unobserve() {}
disconnect() {}
};
});
Disabling opacity: 0 animations during prerendering. Components like ScrollReveal check a global flag and skip animation styles when prerendering:
const isPrerendering =
typeof window !== "undefined" && window.__PRERENDER === true;
if (isPrerendering) {
return <Tag className={className}>{children}</Tag>;
}
// Otherwise, render with opacity: 0 and scroll animation
Using the networkidle0 wait strategy. The script waits until all network requests settle before capturing HTML. This ensures lazy-loaded chunks and data fetches complete.
Reusable browser tab pool. Instead of opening and closing a tab per page, the script creates 10 tabs upfront and reuses them across all 198 URLs. This cut prerender time significantly.
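The pool pattern itself is independent of Puppeteer: start N workers and let each one repeatedly pull the next URL from a shared queue until it is empty. A minimal sketch (my own illustration, with a plain async function standing in for "render this page in a tab"):

```javascript
// Process all items with at most `limit` workers in flight.
// Each worker acts like a pre-opened tab that keeps taking the next URL.
async function runPool(items, limit, worker) {
  const queue = [...items];
  const results = [];
  async function lane() {
    while (queue.length > 0) {
      // shift() is synchronous, so lanes never grab the same item
      const item = queue.shift();
      results.push(await worker(item));
    }
  }
  const lanes = Array.from(
    { length: Math.min(limit, items.length) },
    () => lane()
  );
  await Promise.all(lanes);
  return results;
}
```

With 10 lanes over 198 URLs, each "tab" stays busy the whole run instead of paying tab startup and teardown costs per page.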
If you deploy to Vercel, Puppeteer will not work out of the box during the build step. Vercel's build containers lack the system libraries Chromium needs (libnspr4.so, libatk-1.0.so, and others).
The fix is to use @sparticuz/chromium, a Chromium binary built for restricted environments. It bundles all required shared libraries and runs without root access.
async function launchBrowser() {
if (process.env.VERCEL || process.env.CI) {
const chromium = (await import("@sparticuz/chromium")).default;
const puppeteerCore = (await import("puppeteer-core")).default;
return puppeteerCore.launch({
args: chromium.args,
executablePath: await chromium.executablePath(),
headless: true,
});
}
// Locally: regular puppeteer works fine
const puppeteer = (await import("puppeteer")).default;
return puppeteer.launch({ headless: true });
}
Install both as dev dependencies:
npm install --save-dev puppeteer-core @sparticuz/chromium
Keep puppeteer for local development. The script detects the environment and uses the right binary.
If you are an agency building client websites, the framework choice determines how much SEO work you will do later. If you build marketing sites, service pages, or content hubs for clients, this is a rendering decision before it is a framework preference. That is true whether you are shipping a brochure site, a location-page strategy, or a larger web development services engagement. Here is a direct comparison:
React SPA (Vite, Create React App): ships an empty HTML shell on every URL, so indexing depends on Google's render queue unless you add prerendering yourself.

Next.js (App Router): server-renders or statically generates real HTML by default, with generateStaticParams for static generation at build time and generateMetadata for per-page SEO tags.

Astro: ships static HTML by default and can render React components as interactive islands.

Static HTML: complete on the first request with nothing for a crawler to wait on, but manual to maintain at scale.
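For contrast, the two Next.js primitives named above look roughly like this. This is a sketch of the App Router conventions with an in-memory array standing in for a real data source; it is not code from our project:

```javascript
// app/blog/[slug]/page.js (sketch). The posts array is a stand-in
// for a real CMS, database, or filesystem source.
const posts = [
  { slug: "react-spa-seo", title: "React SPA SEO", excerpt: "Why raw HTML matters." },
];

// generateStaticParams: which routes Next.js should build as static HTML.
async function generateStaticParams() {
  return posts.map((post) => ({ slug: post.slug }));
}

// generateMetadata: per-page <title> and <meta> emitted in the server HTML.
async function generateMetadata({ params }) {
  const post = posts.find((p) => p.slug === params.slug);
  return { title: post.title, description: post.excerpt };
}
```

In a real project both functions are exported from the page file, and Next.js calls them at build time, so the title, description, and content are in the raw HTML before any JavaScript runs.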
The decision matrix is simple: if a page is public-facing and needs search visibility, render its HTML at build time or on the server with Next.js or Astro; if the app lives behind a login, a client-rendered SPA is fine; if you already have a SPA in production, add build-time prerendering.
We built our 198-page site as a React SPA because we are a development agency and React is our core stack. That decision cost us six months of search visibility. The prerendering fix works, but if we were starting over, we would use Next.js or Astro for every public-facing page.
If you already have a React SPA in production and cannot migrate to a different framework, run through this checklist:
HTML delivery
- Are your page-specific <title> and <meta name="description"> in the raw HTML?

Content visibility
- Is any content gated behind IntersectionObserver? Google does not scroll.
- Do any sections start at opacity: 0 or display: none and rely on JavaScript to become visible?
- Do any animations use fill-mode: both or fill-mode: backwards with a starting opacity of 0?
- Are React.lazy() chunks loading fast enough for the renderer to capture them?

Technical SEO
- Is <link rel="canonical"> in the raw HTML?
- Does your www to non-www redirect (or vice versa) work correctly for all paths?

Prerendering (if implementing)
- Is IntersectionObserver being patched during prerendering?
- Are opacity: 0 styles being stripped during prerendering?

We deployed pre-rendered HTML for all 198 pages.
The fix was not a migration. We kept our React + Vite + React Router stack. We added a build step that generates the HTML Google needs, without changing how the site works for actual visitors.
Does Google execute JavaScript?
Yes. Google's Web Rendering Service (WRS) uses a headless Chromium instance to render JavaScript-heavy pages. But rendering is a separate, delayed step. Pages enter a queue that can take days to weeks. Static HTML is usually much easier for Google to process on the first crawl because the content is already there.
Is react-helmet-async enough for SEO?
No, not on its own. react-helmet-async injects <title> and <meta> tags into the DOM after JavaScript executes. The raw HTML still contains your default fallback tags. Social media crawlers and the initial Google crawl see the fallback, not the page-specific tags.
Can I use a prerender service instead of build-time prerendering?
Yes. Services like Prerender.io detect crawler user agents and serve cached rendered HTML. The trade-off is cost (after the free tier) and a dependency on an external service. Build-time prerendering is free and self-contained.
Will prerendering break my SPA routing?
No. The pre-rendered HTML files are served for the initial page load. Once JavaScript boots, React takes over and client-side routing works normally. For hosting platforms like Vercel, static files are served first; if no static file matches the URL, the SPA fallback rewrite kicks in.
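On Vercel, that fallback is typically a catch-all rewrite in vercel.json. Vercel checks the filesystem before applying rewrites, so pre-rendered files win and only unmatched URLs fall through to the SPA shell. A sketch of the usual shape; adjust to your own routing:

```json
{
  "rewrites": [
    { "source": "/(.*)", "destination": "/index.html" }
  ]
}
```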
Should I migrate from React SPA to Next.js?
If you are starting a new project that needs search visibility, yes. Next.js handles server rendering, metadata, and static generation out of the box. If you have an existing SPA with hundreds of components, prerendering at build time is a faster path to indexability than a full rewrite.
What about Bing, social media crawlers, and AI chatbots?
Bing's crawler has a more limited JavaScript rendering budget than Google. Social media platforms (LinkedIn, X, Facebook) and messaging apps almost never execute JavaScript — they rely entirely on raw HTML meta tags. Many AI crawlers and answer-engine fetchers also appear to rely heavily on raw HTML. Pre-rendered HTML helps with all of these, not just Google.