Audit, Fix, and Optimize Your Website's Technical Foundation for Maximum Search Performance
Master technical SEO for your local business website. Learn how to conduct comprehensive SEO audits, fix critical technical issues, optimize page speed for Core Web Vitals, and ensure search engines can crawl and index every page.
Your website is built, your pages are live, and your content is in place. Now it is time to make sure search engines can actually find, crawl, and understand every page you have created. If you have not completed the earlier phases, go back to Phase 3: website development or the full local SEO series overview. Technical SEO is the foundation that everything else sits on. Without it, even the best content in the world will struggle to rank.
Time Estimate: 2-4 hours (spread over multiple sessions)
Think of technical SEO as the plumbing behind the walls of a house. Visitors never see it, but nothing works properly without it. In our experience at Luminous Digital Visions, roughly 40% of local business websites we audit have critical technical issues that are actively suppressing their rankings. When we rebuilt Freshly Folded's laundry booking site, fixing schema markup and page speed alone moved them from buried in search results to #1 in multiple cities. Common problems include missing meta tags, broken schema markup, slow page loads, and duplicate content that confuses Google's crawlers.
This guide covers every technical optimization you need, with specific Claude prompts to implement each one. If you want to use Claude Code for the implementation, the prompts below are ready to paste. By the end, your site will be technically sound, fast, and ready to compete for top positions in local search results.
Before fixing anything, you need to know what is broken. A thorough SEO audit gives you a prioritized list of issues so you can tackle the highest-impact items first.
These are the fundamentals. If Google cannot crawl your pages, nothing else matters.
You do not need expensive tools for a thorough audit. Google's own Search Console documentation walks through the basics, and free tools such as Search Console, PageSpeed Insights, and the Rich Results Test cover everything you need. You can also hand the audit to Claude:
claude "Perform a comprehensive technical SEO audit on my local business website.
My website is built with Next.js 15 and deployed on Vercel.
Check these areas in detail:
1. CRAWLABILITY
- Does robots.txt exist at /public/robots.txt?
- Is there an XML sitemap being generated?
- Are there any orphan pages (pages not linked from anywhere)?
- Check for redirect chains or broken links
2. ON-PAGE SEO
- Review every page.tsx file for title tags and meta descriptions
- Check heading hierarchy (H1, H2, H3 usage)
- Verify all images have alt text
- Review URL structure for all routes
3. TECHNICAL ELEMENTS
- Check for canonical URL implementation
- Verify Open Graph tags exist on all pages
- Look for any duplicate content issues
- Check if JSON-LD schema markup is implemented
4. PERFORMANCE
- Review Next.js Image component usage
- Check for any large unoptimized assets
- Look for render-blocking resources
- Verify lazy loading is implemented
Create a prioritized list of every issue found, categorized as:
- CRITICAL (fix immediately)
- HIGH (fix this week)
- MEDIUM (fix this month)
- LOW (fix when convenient)
Include specific file paths and line numbers for each issue."
Tip: Run this audit every time you make significant changes to your site. In our experience at Luminous Digital Visions, new issues creep in surprisingly fast, especially after adding new pages or updating dependencies.
Schema markup is structured data that tells search engines exactly what your business is, what services you offer, and where you operate. When implemented correctly, it can qualify your pages for rich snippets (those enhanced search results with star ratings, business hours, FAQ dropdowns, and more).
Google uses schema to populate the Knowledge Panel, Local Pack, and rich results. The schema.org vocabulary defines exactly which properties search engines understand. A local business without schema is leaving money on the table. In our experience at Luminous Digital Visions, adding proper schema markup to a client's site increased their rich snippet appearances by over 300% within 6 weeks.
This goes on your homepage. It tells Google the essential details about your business. Make sure these details match your Google Business Profile exactly, since inconsistencies between your site and GBP listing confuse Google.
{
"@context": "https://schema.org",
"@type": "Plumber",
"name": "ABC Plumbing",
"image": "https://www.yoursite.com/logo.png",
"url": "https://www.yoursite.com",
"telephone": "+1-555-123-4567",
"email": "info@abcplumbing.com",
"address": {
"@type": "PostalAddress",
"streetAddress": "123 Main Street",
"addressLocality": "San Diego",
"addressRegion": "CA",
"postalCode": "92101",
"addressCountry": "US"
},
"geo": {
"@type": "GeoCoordinates",
"latitude": 32.7157,
"longitude": -117.1611
},
"openingHoursSpecification": [
{
"@type": "OpeningHoursSpecification",
"dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
"opens": "07:00",
"closes": "18:00"
},
{
"@type": "OpeningHoursSpecification",
"dayOfWeek": ["Saturday"],
"opens": "08:00",
"closes": "14:00"
}
],
"areaServed": [
{ "@type": "City", "name": "San Diego" },
{ "@type": "City", "name": "La Jolla" },
{ "@type": "City", "name": "Chula Vista" }
],
"priceRange": "$$",
"sameAs": [
"https://www.facebook.com/abcplumbing",
"https://www.yelp.com/biz/abc-plumbing-san-diego"
]
}
Info: Replace the @type value with your specific business type. Google supports dozens of local business subtypes including Electrician, Plumber, HVACBusiness, LocksmithService, RoofingContractor, and many more. Use the most specific type that matches your business.
Place this on each individual service page.
{
"@context": "https://schema.org",
"@type": "Service",
"name": "Drain Cleaning Service",
"description": "Professional drain cleaning and clog removal for residential and commercial properties in San Diego.",
"provider": {
"@type": "Plumber",
"name": "ABC Plumbing",
"url": "https://www.yoursite.com"
},
"areaServed": {
"@type": "City",
"name": "San Diego"
},
"serviceType": "Drain Cleaning"
}
Add this to any page with a FAQ section. It makes your questions and answers eligible to appear directly in search results as expandable dropdowns.
{
"@context": "https://schema.org",
"@type": "FAQPage",
"mainEntity": [
{
"@type": "Question",
"name": "How much does drain cleaning cost in San Diego?",
"acceptedAnswer": {
"@type": "Answer",
"text": "Drain cleaning in San Diego typically costs between $150 and $350 depending on the severity of the clog and the location of the blockage. We offer free estimates before starting any work."
}
},
{
"@type": "Question",
"name": "How long does drain cleaning take?",
"acceptedAnswer": {
"@type": "Answer",
"text": "Most drain cleaning jobs take 30 to 60 minutes. More severe blockages or main sewer line clogs may take 2 to 3 hours."
}
}
]
}
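If your Q&A content already lives in a data structure, the FAQPage schema can be generated instead of hand-written, which keeps the markup in sync with the visible page. A minimal sketch (`faqSchema` and the `QA` shape are illustrative names, not a library API):

```typescript
// Build FAQPage JSON-LD from question/answer pairs.
// faqSchema is a hypothetical helper name, not a Next.js built-in.
interface QA {
  question: string;
  answer: string;
}

function faqSchema(pairs: QA[]): object {
  return {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: pairs.map((qa) => ({
      "@type": "Question",
      name: qa.question,
      acceptedAnswer: { "@type": "Answer", text: qa.answer },
    })),
  };
}

// Serialize for a <script type="application/ld+json"> tag:
const json = JSON.stringify(
  faqSchema([
    { question: "How much does drain cleaning cost?", answer: "Typically $150 to $350." },
  ])
);
```

Because the schema is built from the same data that renders the visible FAQ section, it cannot drift out of sync with the page content, which keeps you inside Google's structured-data guidelines.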
Breadcrumbs help both users and search engines understand the hierarchy of your site.
{
"@context": "https://schema.org",
"@type": "BreadcrumbList",
"itemListElement": [
{
"@type": "ListItem",
"position": 1,
"name": "Home",
"item": "https://www.yoursite.com"
},
{
"@type": "ListItem",
"position": 2,
"name": "Services",
"item": "https://www.yoursite.com/services"
},
{
"@type": "ListItem",
"position": 3,
"name": "Drain Cleaning",
"item": "https://www.yoursite.com/services/drain-cleaning"
}
]
}
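Breadcrumb schema follows a mechanical pattern, so it is a good candidate for a helper. A sketch, assuming your crumbs are already ordered from Home outward (`breadcrumbSchema` is a made-up name):

```typescript
// Build BreadcrumbList JSON-LD from an ordered list of crumbs.
// breadcrumbSchema is a hypothetical helper, not a framework API.
interface Crumb {
  name: string;
  path: string; // e.g. "/services/drain-cleaning"; "" for the homepage
}

function breadcrumbSchema(baseUrl: string, crumbs: Crumb[]): object {
  return {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: crumbs.map((crumb, i) => ({
      "@type": "ListItem",
      position: i + 1, // schema.org positions are 1-based
      name: crumb.name,
      item: `${baseUrl}${crumb.path}`,
    })),
  };
}

const breadcrumbs = breadcrumbSchema("https://www.yoursite.com", [
  { name: "Home", path: "" },
  { name: "Services", path: "/services" },
  { name: "Drain Cleaning", path: "/services/drain-cleaning" },
]);
```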
This reinforces your brand identity across Google's Knowledge Graph.
{
"@context": "https://schema.org",
"@type": "Organization",
"name": "ABC Plumbing",
"url": "https://www.yoursite.com",
"logo": "https://www.yoursite.com/logo.png",
"contactPoint": {
"@type": "ContactPoint",
"telephone": "+1-555-123-4567",
"contactType": "customer service",
"areaServed": "US",
"availableLanguage": "English"
}
}
claude "Implement comprehensive JSON-LD schema markup across my entire Next.js website.
Add these schema types:
1. LOCALBUSINESS SCHEMA (homepage)
- Use the most specific @type for my business: [YOUR BUSINESS TYPE]
- Business name: [YOUR NAME]
- Address: [YOUR ADDRESS]
- Phone: [YOUR PHONE]
- Hours: [YOUR HOURS]
- Service areas: [YOUR CITIES]
- Include geo coordinates, price range, and social profiles
2. SERVICE SCHEMA (each service page)
- Service name and detailed description
- Provider information linked to LocalBusiness
- Area served
- Service type
3. FAQPAGE SCHEMA (every page that has a FAQ section)
- Extract all existing Q&A content
- Format as proper FAQPage schema
4. BREADCRUMBLIST SCHEMA (all pages)
- Home > Category > Page hierarchy
- Proper position numbering and URLs
5. ORGANIZATION SCHEMA (homepage or layout)
- Company name, logo, URL
- Contact point with phone and type
Implementation requirements:
- Use JSON-LD format (script tags in the head)
- Create a reusable SchemaMarkup component
- Validate output matches schema.org specifications
- Test with Google Rich Results Test after implementation"
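The reusable component the prompt asks for mostly boils down to serializing a schema object into a script tag. A minimal sketch of the serialization half (`schemaJson` is an illustrative name; the commented JSX shows one common way a `SchemaMarkup` wrapper might render it in a server component):

```typescript
// Serialize any schema.org object for embedding in a JSON-LD script tag.
// schemaJson is a hypothetical helper; SchemaMarkup is the component name
// used in the prompt above, not a Next.js built-in.
function schemaJson(schema: Record<string, unknown>): string {
  return JSON.stringify({ "@context": "https://schema.org", ...schema });
}

// In a Next.js server component, a SchemaMarkup wrapper could render it as:
// <script
//   type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: schemaJson(props.schema) }}
// />

const html = schemaJson({
  "@type": "Plumber",
  name: "ABC Plumbing",
  url: "https://www.yoursite.com",
});
```

Centralizing the serialization in one helper means every page emits consistent, valid JSON-LD, and there is a single place to add validation later.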
Warning: Do not add schema markup for information that does not actually exist on the visible page. Google's guidelines require that structured data reflects actual page content. Adding fake reviews, fabricated hours, or services you do not offer can result in a manual penalty.
Page speed is a direct ranking factor. Google has been explicit about this since 2021 when Core Web Vitals became part of the ranking algorithm. Slow sites lose visitors too. 53% of mobile users abandon a page that takes longer than 3 seconds to load.
You need to hit these thresholds to pass Google's assessment:
| Metric | Good | Needs Improvement | Poor |
|---|---|---|---|
| LCP (Largest Contentful Paint) | < 2.5s | 2.5s - 4.0s | > 4.0s |
| INP (Interaction to Next Paint) | < 200ms | 200ms - 500ms | > 500ms |
| CLS (Cumulative Layout Shift) | < 0.1 | 0.1 - 0.25 | > 0.25 |
LCP measures how long it takes for the biggest visible element (usually a hero image or heading) to load. INP measures how responsive the page feels during user interactions. CLS measures how much the layout shifts around as elements load in.
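For automated checks, the thresholds in the table can be encoded directly. A sketch (cutoffs taken from the table above, treating the boundary values as passing):

```typescript
// Classify a Core Web Vitals measurement against Google's thresholds.
// LCP and INP are in milliseconds; CLS is a unitless score.
type Rating = "good" | "needs improvement" | "poor";

const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // ms
  INP: { good: 200, poor: 500 },   // ms
  CLS: { good: 0.1, poor: 0.25 },  // layout-shift score
} as const;

function rate(metric: keyof typeof THRESHOLDS, value: number): Rating {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return "good";
  if (value <= t.poor) return "needs improvement";
  return "poor";
}
```

A helper like this is useful in a CI step that fails the build when lab measurements regress below "good".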
The single biggest performance win for most sites is proper image handling. The Next.js Image component handles optimization automatically.
import Image from 'next/image'
// Instead of <img src="/hero.jpg" />
<Image
src="/hero.jpg"
alt="Professional plumbing services in San Diego"
width={1200}
height={600}
priority // Only use for above-the-fold images
placeholder="blur"
blurDataURL="data:image/jpeg;base64,/9j/4AAQ..."
/>
The priority prop should only go on your largest above-the-fold image (typically the hero image). Every other image below the fold should lazy load by default, which the Next.js Image component does automatically.
Next.js splits code automatically by route, but you can go further with dynamic imports for heavy components.
import dynamic from 'next/dynamic'
// Load the map component only when needed
const GoogleMap = dynamic(() => import('@/components/GoogleMap'), {
loading: () => <div className="h-64 bg-gray-100 animate-pulse" />,
ssr: false,
})
This keeps your initial bundle small and loads heavy components only when the user actually needs them.
Fonts are a common source of layout shift and slow loads. Next.js has built-in font optimization.
import { Inter } from 'next/font/google'
const inter = Inter({
subsets: ['latin'],
display: 'swap',
preload: true,
})
Using next/font eliminates the external network request to Google Fonts and prevents the flash of unstyled text that causes CLS issues.
claude "Optimize my Next.js website for maximum page speed. Target 90+ on PageSpeed Insights.
Current setup: Next.js 15, Tailwind CSS, deployed on Vercel.
Implement these optimizations:
1. IMAGE OPTIMIZATION
- Replace all <img> tags with Next.js Image component
- Add priority prop to above-the-fold hero images only
- Add blur placeholders for all images
- Verify all images have width and height set
2. CODE SPLITTING
- Dynamic import any components over 50KB
- Lazy load the Google Maps embed
- Lazy load any contact form modals
- Defer loading of analytics scripts
3. FONT OPTIMIZATION
- Use next/font for all Google Fonts
- Set display: swap to prevent layout shift
- Preload only the font weights actually used
4. CSS OPTIMIZATION
- Purge unused Tailwind classes in production
- Verify tailwind.config.ts content paths are correct
- Inline critical CSS for above-the-fold content
5. RENDERING STRATEGY
- Use Static Site Generation (SSG) for all pages that do not change often
- Verify pages are not accidentally using client-side rendering
- Check for unnecessary 'use client' directives
6. THIRD-PARTY SCRIPTS
- Load Google Analytics with next/script strategy='afterInteractive'
- Defer all non-critical third-party scripts
- Remove any unused scripts or dependencies
Run a before-and-after comparison of bundle sizes."
Images typically account for 50-70% of a web page's total size. Optimizing them is often the single fastest way to improve your page speed score. Your content enhancement strategy will add more images later, so getting the optimization pipeline right now saves time.
WebP images are 25-35% smaller than JPEG at equivalent quality and support transparency like PNG. Every modern browser supports WebP. There is no reason to serve JPEG or PNG to users in 2026.
If you are using the Next.js Image component, WebP conversion happens automatically in production. But if you are placing images in the /public folder for direct use, convert them yourself.
| Image Type | Recommended Quality | Typical File Size |
|---|---|---|
| Hero/Banner images | 80-85% quality | 80-150 KB |
| Service page images | 75-80% quality | 40-80 KB |
| Thumbnails | 70-75% quality | 15-30 KB |
| Icons and logos | PNG or SVG | 5-15 KB |
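The conversion itself can be handled by the `sharp` npm package or Google's `cwebp` CLI when you preprocess files in /public. To enforce the budgets from the table during an audit, a helper like this works (the cutoffs are the upper ends of the recommended ranges above; `oversized` is an illustrative name):

```typescript
// Flag images that exceed the size budgets from the table above (in KB).
const SIZE_BUDGET_KB = {
  hero: 150,      // hero/banner images
  service: 80,    // service page images
  thumbnail: 30,  // thumbnails
  icon: 15,       // icons and logos
} as const;

function oversized(type: keyof typeof SIZE_BUDGET_KB, sizeKb: number): boolean {
  return sizeKb > SIZE_BUDGET_KB[type];
}
```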
Do not serve a 2000px-wide image to a mobile phone with a 375px-wide screen. The Next.js Image component handles this with the sizes prop.
<Image
src="/service-hero.jpg"
alt="Drain cleaning service in San Diego"
width={1200}
height={600}
sizes="(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 1200px"
/>
This tells the browser to request an appropriately sized image based on the viewport width, which can reduce image download sizes by 50-70% on mobile devices.
Alt text serves two purposes: accessibility for screen reader users and SEO signals for search engines.
Good alt text:
- alt="Licensed plumber repairing kitchen sink drain in San Diego home" — Descriptive, includes keyword naturally
- alt="Before and after drain cleaning results" — Describes the image content

Bad alt text:

- alt="image1" — Tells nobody anything
- alt="plumber plumbing San Diego plumber San Diego plumbing services" — Keyword stuffing
- alt="" — Empty alt text (acceptable only for decorative images)

claude "Audit and optimize every image on my Next.js website.
For each image, ensure:
1. FORMAT
- Next.js Image component used (not raw <img> tags)
- WebP conversion enabled for production
- SVG used for icons and logos where appropriate
2. SIZING
- Width and height attributes set on every image
- Responsive sizes prop configured for different viewports
- No images larger than necessary for their display size
3. LOADING
- Priority prop on above-the-fold hero images only
- All other images lazy loaded (default behavior)
- Blur placeholders added for better perceived performance
4. ALT TEXT
- Every image has descriptive, keyword-relevant alt text
- Alt text describes the actual image content
- No keyword stuffing in alt attributes
- Decorative images use alt='' with role='presentation'
5. FILE SIZES
- Flag any image over 200KB
- Suggest compression for oversized images
- Recommend removing any unused images from /public
List every image found, its current status, and what needs to change."
Clean URLs help both search engines and users understand what a page is about before they even visit it. They are also a minor ranking factor. The URL structure you defined during Phase 3: website development should already follow these patterns, but it is worth verifying.
Your URLs should follow these rules:

- Lowercase only: /Services/Drain-Cleaning should be /services/drain-cleaning
- Hyphens, not underscores: /drain-cleaning is correct. /drain_cleaning is not.
- Include the target keyword: /services/drain-cleaning is better than /services/service-3
- Keep them short: /services/residential-drain-cleaning-and-repair-service is too long.
- Use a consistent hierarchy: /services/drain-cleaning, /locations/la-jolla

A well-structured set of URLs looks like this:

https://www.yoursite.com/services/drain-cleaning
https://www.yoursite.com/services/water-heater-repair
https://www.yoursite.com/locations/la-jolla
https://www.yoursite.com/locations/san-diego
https://www.yoursite.com/about
https://www.yoursite.com/contact
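A small normalizer can enforce these rules whenever you generate route slugs from page or service names. A sketch (`toSlug` is an illustrative name):

```typescript
// Normalize a page name into a URL slug following the rules above:
// lowercase, hyphens (never underscores or spaces), no stray characters.
function toSlug(name: string): string {
  return name
    .toLowerCase()
    .replace(/[_\s]+/g, "-")    // underscores and whitespace -> hyphens
    .replace(/[^a-z0-9-]/g, "") // drop anything else
    .replace(/-+/g, "-")        // collapse repeated hyphens
    .replace(/^-|-$/g, "");     // trim leading/trailing hyphens
}
```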
The robots.txt file lives at the root of your site and tells search engine crawlers which pages to crawl and which to skip.
# robots.txt for a Next.js local business website
User-agent: *
Allow: /
# Block admin and API routes
Disallow: /api/
Disallow: /_next/
Disallow: /admin/
# Reference your sitemap
Sitemap: https://www.yoursite.com/sitemap.xml
In Next.js 15 with the App Router, create this at app/robots.ts:
import { MetadataRoute } from 'next'
export default function robots(): MetadataRoute.Robots {
return {
rules: {
userAgent: '*',
allow: '/',
disallow: ['/api/', '/_next/', '/admin/'],
},
sitemap: 'https://www.yoursite.com/sitemap.xml',
}
}
Your sitemap tells Google about every page on your site and how important each one is. In Next.js 15, you can generate it programmatically:
// app/sitemap.ts
import { MetadataRoute } from 'next'
export default function sitemap(): MetadataRoute.Sitemap {
const baseUrl = 'https://www.yoursite.com'
return [
{ url: baseUrl, lastModified: new Date(), changeFrequency: 'weekly', priority: 1.0 },
{ url: `${baseUrl}/about`, lastModified: new Date(), changeFrequency: 'monthly', priority: 0.8 },
{ url: `${baseUrl}/contact`, lastModified: new Date(), changeFrequency: 'monthly', priority: 0.8 },
{ url: `${baseUrl}/services`, lastModified: new Date(), changeFrequency: 'weekly', priority: 0.9 },
{ url: `${baseUrl}/services/drain-cleaning`, lastModified: new Date(), changeFrequency: 'monthly', priority: 0.8 },
// Add all your service pages
{ url: `${baseUrl}/locations/san-diego`, lastModified: new Date(), changeFrequency: 'monthly', priority: 0.8 },
// Add all your location pages
]
}
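Hand-maintaining that array gets error-prone as the site grows. The same sitemap can be generated from your slug lists instead; a sketch with placeholder slugs (in a real app/sitemap.ts the return type would be MetadataRoute.Sitemap imported from next; a local Entry type keeps this self-contained):

```typescript
// Generate sitemap entries from slug lists instead of hand-maintaining them.
interface Entry {
  url: string;
  lastModified: Date;
  changeFrequency: "weekly" | "monthly";
  priority: number;
}

const baseUrl = "https://www.yoursite.com";
const serviceSlugs = ["drain-cleaning", "water-heater-repair"]; // your services
const locationSlugs = ["san-diego", "la-jolla"];                // your cities

function buildSitemap(): Entry[] {
  return [
    { url: baseUrl, lastModified: new Date(), changeFrequency: "weekly", priority: 1.0 },
    ...serviceSlugs.map((slug) => ({
      url: `${baseUrl}/services/${slug}`,
      lastModified: new Date(),
      changeFrequency: "monthly" as const,
      priority: 0.8,
    })),
    ...locationSlugs.map((slug) => ({
      url: `${baseUrl}/locations/${slug}`,
      lastModified: new Date(),
      changeFrequency: "monthly" as const,
      priority: 0.8,
    })),
  ];
}
```

With this pattern, adding a new service page only requires adding its slug to one array, and the sitemap can never fall out of sync with your routes if both are driven by the same list.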
claude "Review and fix the URL structure, robots.txt, and sitemap for my Next.js website.
1. URL AUDIT
- Check all route folders use lowercase with hyphens
- Verify URLs include target keywords
- Flag any URLs that are too long or use underscores
- Ensure consistent hierarchy (services/[slug], locations/[slug])
2. ROBOTS.TXT
- Create app/robots.ts using Next.js MetadataRoute
- Allow all public pages
- Block API routes, _next, and any admin routes
- Include sitemap reference with full production URL
3. XML SITEMAP
- Create app/sitemap.ts using Next.js MetadataRoute
- Include every public page with correct priority:
- Homepage: 1.0
- Service overview: 0.9
- Individual services: 0.8
- Location pages: 0.8
- About/Contact: 0.7
- Set appropriate changeFrequency for each page type
- Use the production domain: [YOUR DOMAIN]
4. CANONICAL URLS
- Verify every page has a canonical URL in metadata
- Ensure canonical URLs use the production domain (not localhost)
- Check for any duplicate content issues
Implement all changes and verify the sitemap is accessible at /sitemap.xml."
Tip: After deploying these changes, submit your sitemap to Google Search Console. Go to Search Console, click "Sitemaps" in the left menu, enter your sitemap URL, and click "Submit." Google will start crawling your pages within 24-48 hours.
Technical SEO refers to optimizations that help search engines crawl, index, and understand your website. It includes things like page speed, schema markup, meta tags, sitemaps, and URL structure. Without proper technical SEO, even excellent content may not rank because search engines struggle to process it.
Visit pagespeed.web.dev and enter your website URL. The tool will analyze both mobile and desktop versions and give you a score from 0 to 100, along with specific recommendations. Aim for 90 or above on both mobile and desktop.
Schema markup is code that helps search engines understand your content. For local businesses, it tells Google your business name, address, phone number, hours, services, and more. Without it, you are leaving rich snippet opportunities on the table. Sites with proper schema markup are 2-3 times more likely to appear in enhanced search results.
Run a full audit at least once per quarter. After any major site changes — adding new pages, redesigning sections, or updating dependencies — run a focused audit on the affected areas. Google's algorithms and best practices evolve, so regular audits catch new issues before they impact rankings.
Core Web Vitals are three specific metrics Google uses to measure user experience: LCP (how fast the main content loads), INP (how responsive the page feels during interactions), and CLS (how much the layout shifts during loading). Google has confirmed these are ranking factors, and sites that pass all three thresholds have a measurable ranking advantage.
Yes. WebP provides 25-35% smaller file sizes at the same visual quality as JPEG. It also supports transparency (like PNG) and animation (like GIF). Every major browser has supported WebP since 2020. The Next.js Image component converts to WebP automatically in production, so you get the benefit without extra work.
For most local business websites, your robots.txt should allow all public pages, block API routes and internal framework files, and reference your XML sitemap location. Keep it simple. The most common mistake is accidentally blocking important pages, which prevents Google from indexing them.
Use Google's Rich Results Test at search.google.com/test/rich-results. Paste your page URL and the tool will show you exactly which schema types it detects and whether they are valid. Fix any errors or warnings before moving on. You can also use the Schema Markup Validator at validator.schema.org for more detailed validation.
Google sometimes rewrites title tags in search results if it thinks its version better matches the search query. This happens most often when title tags are too long, do not match page content, or are stuffed with keywords. The fix is to write concise, accurate title tags under 60 characters that genuinely describe the page content.
Meta descriptions do not directly affect rankings — Google has confirmed this. However, they strongly influence click-through rates. A compelling meta description with a clear call-to-action can increase clicks by 20-35% compared to a generic or missing description. Higher click-through rates send positive signals to Google, which can indirectly help rankings.
Crawl budget is the number of pages Google will crawl on your site within a given timeframe. For most local business websites with under 500 pages, crawl budget is not a concern — Google will crawl everything without issue. It becomes relevant for large sites with tens of thousands of pages. That said, keeping your site free of unnecessary URLs (duplicate pages, parameter-based URLs, old drafts) ensures Google spends its crawl budget on your important pages rather than wasting time on low-value ones.
A canonical tag is an HTML element that tells search engines which version of a page is the "official" one. You need them when the same content is accessible at multiple URLs — for example, if yoursite.com/services and yoursite.com/services/ both resolve to the same page, or if URL parameters create duplicate versions like yoursite.com/services?ref=email. In Next.js, set canonical URLs in your metadata export for every page to point to the clean, preferred URL.
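In the App Router, the canonical URL lives in the page's metadata export under alternates. A sketch with an assumed URL (in a real page.tsx you would export this constant and could type it with Metadata from next):

```typescript
// Per-page canonical URL via the App Router metadata export.
// The URL and title below are placeholder values.
const metadata = {
  title: "Drain Cleaning in San Diego | ABC Plumbing",
  alternates: {
    canonical: "https://www.yoursite.com/services/drain-cleaning",
  },
};
```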
Hreflang is an HTML attribute that tells Google which language and regional version of a page to serve to users in different locations. Most local businesses serving a single language in a single region do not need hreflang. You would need it if your site serves content in multiple languages (for example, English and Spanish versions of every page) or if you have region-specific versions targeting different countries. If you do serve a bilingual community, implementing hreflang correctly prevents Google from treating your Spanish and English pages as duplicate content.
A hard 404 is a page that returns a proper 404 HTTP status code, correctly telling search engines the page does not exist. A soft 404 is a page that returns a 200 OK status code but displays "page not found" content — effectively lying to search engines by saying the page is fine when it is not. Soft 404s are worse because Google wastes crawl budget on them and may keep them indexed. Check Google Search Console under the "Pages" report to find soft 404 issues, and fix them by returning a proper 404 status code or redirecting to a relevant page.
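In the Next.js App Router, calling notFound() from next/navigation renders your not-found page with a genuine 404 status, which avoids soft 404s for missing dynamic routes. The classification itself is simple once you have a page's status code and body; a sketch (`classify404` is an illustrative name):

```typescript
// Distinguish hard vs soft 404s from a fetched page's status and body.
type NotFoundKind = "hard 404" | "soft 404" | "ok";

function classify404(status: number, bodyLooksLikeNotFound: boolean): NotFoundKind {
  if (status === 404) return "hard 404"; // correct signal to search engines
  if (status === 200 && bodyLooksLikeNotFound) return "soft 404"; // 200 + "not found" content
  return "ok";
}
```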
A redirect chain occurs when a URL redirects to another URL, which then redirects to yet another URL. For example, Page A redirects to Page B, which redirects to Page C. Each redirect in the chain adds latency (typically 100-300ms per hop) and loses a small amount of link equity. Google will follow up to about 10 redirects, but the more hops in the chain, the less authority reaches the final destination. Fix redirect chains by updating each redirect to point directly to the final destination URL.
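If you manage redirects as a simple source-to-target map (for example, feeding next.config redirects), flattening chains is a small mapping exercise. A sketch that resolves each source straight to its final destination (`flatten` is an illustrative name; the paths are examples):

```typescript
// Flatten redirect chains so every source points at its final destination.
// Input maps each redirected path to its immediate target.
function flatten(redirects: Record<string, string>): Record<string, string> {
  const out: Record<string, string> = {};
  for (const src of Object.keys(redirects)) {
    let dest = redirects[src];
    let hops = 1;
    // Follow the chain, guarding against loops (Google follows ~10 hops).
    while (redirects[dest] !== undefined && hops < 10) {
      dest = redirects[dest];
      hops++;
    }
    out[src] = dest;
  }
  return out;
}

// /old-services -> /services-2 -> /services becomes two single hops:
const flat = flatten({ "/old-services": "/services-2", "/services-2": "/services" });
```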
Render-blocking resources are CSS stylesheets and JavaScript files that prevent the browser from displaying content until they are fully downloaded and processed. When PageSpeed Insights flags render-blocking resources, it means your page cannot paint anything until those files load. Fix this by deferring non-critical JavaScript with async or defer attributes, inlining critical CSS for above-the-fold content, and using next/dynamic for heavy components. In Next.js, the framework handles much of this automatically, but third-party scripts and custom CSS files are common offenders.
Lazy loading defers the loading of off-screen content until the user scrolls near it. For images, the Next.js Image component lazy loads by default — only add the priority prop to your above-the-fold hero image. For other content, use next/dynamic to lazy load heavy components like maps, video players, and contact form modals. Avoid lazy loading anything visible in the initial viewport, as this actually hurts LCP scores. Also avoid lazy loading critical navigation elements or content that search engines need to see immediately.
Use three tools in sequence. First, Google's Rich Results Test (search.google.com/test/rich-results) checks whether your structured data qualifies for rich results and flags errors. Second, the Schema Markup Validator (validator.schema.org) provides more granular validation against the full schema.org specification. Third, after deployment, monitor the "Enhancements" section in Google Search Console to see which rich result types Google has detected and any ongoing issues. Test after every schema change, not just after the initial implementation.
Mobile-first indexing means Google primarily uses the mobile version of your site for indexing and ranking, not the desktop version. This has been the default for all websites since 2023. In practical terms, it means your mobile site must have the same content, structured data, and meta tags as your desktop version. If content is hidden behind "read more" toggles or tabs on mobile, Google still indexes it. But if your mobile version serves completely different or reduced content compared to desktop, you could lose rankings. Test your site using Chrome DevTools mobile emulation to verify content parity.
AMP (Accelerated Mobile Pages) is no longer a requirement for appearing in Google's Top Stories carousel or any other search feature. Google removed the AMP requirement in 2021. While AMP pages still work and load quickly, they come with significant development overhead and restrictions on JavaScript and CSS. For most local businesses, a well-optimized Next.js site with good Core Web Vitals scores will perform just as well or better in search results without the limitations of AMP. Invest your time in optimizing your regular pages instead.
Google can render JavaScript, but it is a two-phase process. In the first phase, Google crawls and indexes the raw HTML. In the second phase (which can be delayed by hours or days), Google renders the JavaScript and indexes the dynamically generated content. For local business sites using Next.js with Static Site Generation or Server-Side Rendering, this is not a concern because the HTML is fully rendered before it reaches Google. However, if you are using client-side rendering (pages that load as empty shells and fill in with JavaScript), critical content may not be indexed promptly. Use View Source in your browser to check — if your content is visible in the raw HTML, Google can index it immediately.
Google has publicly stated that it ignores the priority and changeFrequency values in XML sitemaps. However, setting them does not hurt, and some other search engines (like Bing) may still reference them. The most useful field in your sitemap is lastModified, which tells search engines when a page was last updated. Set this to the actual date of the last meaningful content change — not the current date on every build — so Google knows which pages have genuinely new content worth recrawling.
Beyond Allow and Disallow, there are a few useful directives. Crawl-delay asks bots to wait a specified number of seconds between requests (Google ignores this, but Bing respects it). Sitemap points crawlers to your XML sitemap. You can also target specific bots with User-agent directives — for example, blocking AI training crawlers like GPTBot or CCBot while allowing Googlebot. Never block CSS or JavaScript files in robots.txt, as Google needs them to render your pages correctly for mobile-first indexing.
You should now have:

- A completed technical SEO audit with a prioritized fix list
- JSON-LD schema markup in place (LocalBusiness, Service, FAQPage, BreadcrumbList, Organization)
- Passing Core Web Vitals scores
- Optimized, properly sized images with descriptive alt text
- A clean URL structure, robots.txt, and an XML sitemap submitted to Search Console
In the next phase, you will enhance your content depth, analyze competitors, implement semantic SEO with topically related keywords, and build E-E-A-T signals that establish your authority. After that, advanced optimization strategies and the monitoring and analytics guide will help you maintain and grow your rankings. If you need professional help with any of these phases, our web development services team handles technical SEO implementation daily.
Continue to Phase 5 when ready.