If you built your website with an AI tool like Lovable or Base44, there's a good chance Google can't read it. This free guide walks you through a permanent fix in three steps — no technical experience required.

Think of your website like flat-pack furniture. When a visitor arrives, your server sends them a box of parts (the code) and a set of instructions (the JavaScript). The visitor's web browser then assembles the furniture on the spot, and the beautiful, functional website appears.
This is efficient, but it creates a major problem for search engines. Google's automated bots are like impatient inspectors — they quickly glance at what you've sent. When they see a box of parts instead of a finished product, they don't wait for the assembly. They see an "empty" page, mark it as having no content, and move on.
This is why your site struggles to rank on Google and why links shared on social media often appear as blank cards.
Instead of shipping a box of parts, we're going to ship the fully-assembled furniture.
This guide sets up a process that automatically assembles every page before your site goes live. It waits for all the content to load, then saves a complete snapshot of the finished page as a simple, universally readable HTML file.
When Google's bot visits, it receives this pre-assembled, instantly recognisable page. It sees all your content, understands what your site is about, and can rank it accordingly.
<div id="root">
  <!-- empty -->
</div>
Google sees nothing.
<h1>Your Title</h1>
<p>Your content...</p>
<meta name="description" content="..." />
Google reads everything.
It takes about 30 seconds to check. Run these two tests — if either one shows the problem, the fix in this guide is exactly what you need.
Open your website in Chrome or Safari. Right-click anywhere on the page and choose View Page Source (or press Ctrl+U on Windows, Cmd+Option+U on Mac).
Press Ctrl+F (or Cmd+F) to search and type your homepage headline.
<div id="root">
  <!-- empty -->
</div>
Your text is nowhere in the source.
<h1>Your headline here</h1>
<p>Your content</p>
Your text appears in the source.
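If you are comfortable with a terminal, the same check can be scripted. The sketch below simulates it on two sample HTML strings; with a real site you would fetch the raw page (for example with curl or fetch) and search the response for your headline. The helper name and sample strings here are illustrative, not part of any tool:

```javascript
// Simulates the view-source check: is the headline present in the raw HTML
// the server sends, before any JavaScript runs?
function headlineInSource(html, headline) {
  return html.toLowerCase().includes(headline.toLowerCase());
}

// A client-rendered SPA ships an empty shell...
const emptyShell = '<div id="root"><!-- empty --></div>';
// ...while a pre-rendered page ships the finished content.
const prerendered = '<div id="root"><h1>Your headline here</h1></div>';

console.log(headlineInSource(emptyShell, 'Your headline'));  // false
console.log(headlineInSource(prerendered, 'Your headline')); // true
```

This is exactly what Google's first pass does: a plain-text search over the unrendered response.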
Go to Google Rich Results Test, paste your website URL, and click Test URL.
Wait for the result. Google will show you a screenshot of what it actually sees when it visits your site.
The screenshot shows a blank or mostly empty page — your content isn't visible.
The screenshot shows your full page with text, headings, and content clearly visible.
If either test shows a problem, scroll down to The Fix and follow the three steps. If both tests show no problem, your site is already visible to Google and you don't need this guide — but the What's Next section below still has useful tips for improving your rankings.
Choose your track below based on which tool you used to build your site. Then follow the three steps. Your AI assistant does the hard work — you just copy and paste.
Track A: You'll paste three short prompts into your AI assistant — one after the other. Each prompt handles a different part of the SEO setup. No downloads, no terminal, no ZIP files. Googlebot renders JavaScript, so this approach covers 95%+ of real search traffic.
Paste these three prompts into your AI assistant's chat one at a time. Wait for each one to finish before pasting the next. Your AI will ask you for your domain and site type — just answer and press Enter.
I need you to set up SEO on my React project so it is visible to search engines. This app runs as an SPA — there is no server-side rendering or static site generation available.

STEP 1 — Install react-helmet-async:
Install the package react-helmet-async.

STEP 2 — Wrap the app in HelmetProvider:
Open my main.tsx (or main.jsx). Wrap the entire app tree in <HelmetProvider> from react-helmet-async. Do NOT remove any existing providers — add HelmetProvider around them.

STEP 3 — Set up index.html as the SEO baseline:
My index.html needs these elements inside <head>. If any already exist, update them. If missing, add them:
- <title> — Primary keyword + brand name, under 60 characters
- <meta name="description"> — Compelling summary, 150-160 characters, includes primary keyword
- <meta name="robots" content="index, follow">
- <link rel="canonical" href="https://DOMAIN">
- Open Graph tags: og:title, og:description, og:type, og:url, og:image, og:locale
- Twitter tags: twitter:card, twitter:title, twitter:description, twitter:image
Ask me for my website domain and primary keyword before filling these in.

STEP 4 — Add JSON-LD structured data:
Add a <script type="application/ld+json"> block in index.html with schema appropriate to my site type. Ask me what type of business/site this is, then use the correct schema (LocalBusiness, Organization, WebSite, Product, etc.).

STEP 5 — Set up robots.txt:
Create or update public/robots.txt with:

User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://DOMAIN/sitemap.xml

Replace DOMAIN with my actual domain.
STEP 6 — Audit semantic HTML: Review my page components and ensure: - There is exactly ONE <h1> tag per page - Heading hierarchy is correct (h1 → h2 → h3, no skipping levels) - Content sections use semantic elements: <main>, <section>, <article>, <nav>, <header>, <footer> - Images have descriptive alt text - Links use meaningful anchor text (not "click here") Fix any issues you find. Show me what you changed.
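For reference, the baseline <head> that the first prompt produces looks roughly like this. Every value below (the business name, domain, copy, and image path) is a placeholder that your AI replaces with your real details:

```html
<head>
  <!-- Placeholder values throughout; your AI fills in your real domain and copy -->
  <title>Handmade Ceramics in Leeds | Clayworks</title>
  <meta name="description" content="Small-batch ceramics studio in Leeds. Browse handmade mugs, bowls, and vases, or book a beginner's wheel-throwing class." />
  <meta name="robots" content="index, follow" />
  <link rel="canonical" href="https://example.com" />

  <!-- Open Graph: controls how links look on social media -->
  <meta property="og:title" content="Handmade Ceramics in Leeds | Clayworks" />
  <meta property="og:description" content="Small-batch ceramics studio in Leeds." />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="https://example.com" />
  <meta property="og:image" content="https://example.com/og-image.jpg" />
  <meta property="og:locale" content="en_GB" />

  <!-- Twitter card tags -->
  <meta name="twitter:card" content="summary_large_image" />
  <meta name="twitter:title" content="Handmade Ceramics in Leeds | Clayworks" />
  <meta name="twitter:description" content="Small-batch ceramics studio in Leeds." />
  <meta name="twitter:image" content="https://example.com/og-image.jpg" />

  <!-- JSON-LD structured data (LocalBusiness shown as one possible schema) -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Clayworks",
    "url": "https://example.com"
  }
  </script>
</head>
```

Your AI will pick the schema type that matches your answers, so your JSON-LD block may use Organization, WebSite, or Product instead.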
Now add unique SEO meta tags to every page component in my app using react-helmet-async.
STEP 1 — Find all page routes:
Look at my App.tsx (or wherever routes are defined). List every route path and its corresponding page component.
STEP 2 — Add Helmet blocks to each page:
For every page component, add a <Helmet> block at the top of the JSX return. Each page needs:
import { Helmet } from "react-helmet-async";
<Helmet>
<title>Unique Page Title | Brand Name</title>
<meta name="description" content="Unique 150-160 char description for this specific page." />
<link rel="canonical" href="https://DOMAIN/page-path" />
<meta property="og:title" content="Same as title" />
<meta property="og:description" content="Same as description" />
<meta property="og:url" content="https://DOMAIN/page-path" />
<meta property="og:type" content="website" />
</Helmet>
Rules:
- Every page MUST have a unique title and description — no duplicates
- Titles should be under 60 characters and include relevant keywords
- Descriptions should be 150-160 characters
- The canonical URL must match the route path exactly
STEP 3 — Add page-specific JSON-LD where appropriate:
- Blog/article pages: Add Article schema with headline, datePublished, author
- Product pages: Add Product schema with name, price, availability
- FAQ pages: Add FAQPage schema
Show me the first page's result before doing the rest, so I can approve the style.

Create a static sitemap.xml file for my site.
STEP 1 — Generate sitemap.xml:
Look at my App.tsx and find every route. Create a file at public/sitemap.xml with this format:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://DOMAIN/</loc>
<lastmod>TODAY_DATE</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>
<!-- One <url> block for each route -->
</urlset>
Rules:
- Homepage gets priority 1.0
- Main content pages get priority 0.8
- Secondary pages get priority 0.6
- Use today's date for lastmod in YYYY-MM-DD format
- Replace DOMAIN with my actual domain
- Do NOT include catch-all/404 routes
STEP 2 — Verify robots.txt:
Check that public/robots.txt contains a Sitemap: line pointing to https://DOMAIN/sitemap.xml. If not, add it.
STEP 3 — Remind me about Google Search Console:
After deployment, I will need to:
1. Go to https://search.google.com/search-console
2. Add and verify my domain
3. Submit my sitemap URL (https://DOMAIN/sitemap.xml)
4. Use the URL Inspection tool to request indexing of key pages

Verify it worked before you deploy
Once your AI has finished all three prompts, use the Google Rich Results Test to confirm the fix is in place before you deploy. Paste your site's preview URL (from Lovable or Base44) into the tool. If you see a screenshot of your fully-loaded page, Google can now read your content.
If the test shows a blank page or missing content, go back to Prompt 1 and check your AI answered the domain and site-type questions correctly.
What your AI just added to your project
react-helmet-async
Manages your page titles and meta tags
Per-page meta tags
Unique title and description on every page
Open Graph tags
Controls how your links look on social media
JSON-LD structured data
Helps Google understand your site type
robots.txt
Tells search engines and AI bots they're welcome
sitemap.xml
Lists every page so Google can find them all
None of this changes how your site looks or works for visitors. It only changes what search engines and AI tools can read. You're now ready to deploy.
Deploy settings: the build command is npm run build and the publish directory is dist. Open Source Control (Ctrl+Shift+G), write a short commit message, and click Publish to GitHub. Netlify will detect the netlify.toml file and configure the build automatically.
Verify it worked: Once your site is live, go to the Google Rich Results Test and paste your URL. If you see a screenshot of your fully-loaded website, the fix worked. Google can now read your site.
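If your project does not already have a netlify.toml, a minimal one matching those settings looks roughly like this (a sketch; the exact contents are an assumption based on the build command and publish directory named above):

```toml
[build]
  command = "npm run build"   # builds the site, including any pre-render step
  publish = "dist"            # the folder of finished files Netlify serves
```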
You should also submit your sitemap to Google Search Console — see the "What's Next" section below.
This error only occurs with Track B (the npx react-seo-fix command), which adds a pre-rendering step that runs outside the browser. If you used Track A, you will not see this error. If your Netlify build fails, it is almost always the same issue — and it has a straightforward fix. Copy the prompt below and paste it into your AI assistant.
ReferenceError: window is not defined
What it means: Part of your site is trying to use a feature that only works in a web browser, but the pre-build process runs outside the browser. Your AI assistant will find the cause and fix it automatically.
My Netlify build failed with the error "ReferenceError: window is not defined". I understand this is a common issue with this process.
Please look at the build log to see which component is causing the error. Then, open the scripts/prerender.mjs file and add a polyfill for the missing browser feature at the top of the file, before the import statements. Use this pattern:
if (typeof window === 'undefined') {
global.window = { innerWidth: 1200, innerHeight: 800, location: { href: '' } };
}
If the error is about a different browser feature (like document, navigator, or localStorage), add the appropriate property to the global.window object above.
After adding the polyfill, the build should succeed. Let me know what you changed.

If your question isn't answered here, the troubleshooting prompt above will direct your AI assistant to diagnose and fix most issues automatically.
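To make the fix concrete, here is a broadened version of the polyfill from the prompt above, with document and localStorage stubs added. This is a sketch only: stub exactly the browser features your build log names, nothing more.

```javascript
// Minimal browser-API stubs so browser-only code doesn't crash during pre-rendering.
// These lines belong at the very top of the pre-render script, before any imports run.
if (typeof window === 'undefined') {
  // A tiny in-memory stand-in for localStorage
  const storage = {
    store: {},
    getItem(key) { return this.store[key] ?? null; },
    setItem(key, value) { this.store[key] = String(value); },
    removeItem(key) { delete this.store[key]; },
  };
  global.window = {
    innerWidth: 1200,
    innerHeight: 800,
    location: { href: '' },
    localStorage: storage,
  };
  // Stub only what the failing component actually touches
  global.document = { title: '', querySelector: () => null };
  global.localStorage = storage;
}

console.log(window.innerWidth);               // 1200
console.log(localStorage.getItem('missing')); // null
```

The stubs only need to be good enough to stop the crash; the real browser values take over when a visitor loads the page.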
Getting indexed is the starting line, not the finish line. Here are six actions that will help Google understand your site and start showing it to the right people.