ARDURA Lab
11 min read

Technical SEO — Complete 2026 Checklist

Tags: technical SEO, SEO, Core Web Vitals, indexing, checklist
Marcin Godula

CEO & Founder, ARDURA Lab

SEO, GEO, and web development specialist with over 15 years of experience. Helps B2B companies build visibility in both classic and AI search engines.

Technical SEO is the optimization of a website's infrastructure — speed, indexability, architecture, and code — so that search engines can effectively crawl, render, and index it.

What Is Technical SEO and Why Is It the Foundation?

Technical SEO is the foundation upon which everything else rests — content, linking, conversion optimization. You can have the best content in the world, but if Google can't index it or the page takes 8 seconds to load, no one will ever see it.

The analogy is simple: Technical SEO is like electrical wiring and plumbing in a house. Nobody sees them, but without them the house doesn't function. You can paint walls and buy furniture (content marketing, link building), but if there's no electricity and water, the house is useless.

In 2026, technical SEO is more important than ever. Websites are more complex (JavaScript frameworks, SPAs, PWAs), Google is more demanding (Core Web Vitals as a ranking factor), and the competition isn't sleeping.

This checklist contains 50+ points divided into 7 categories. You can use it for a self-audit or as a starting point for a professional SEO audit.


1. Crawlability — Can Google Reach Your Pages?

Crawlability is the ability of a search engine to discover and visit your site's pages. If Google can't reach a page, it won't index it — regardless of how good the content is.

Robots.txt

  • The robots.txt file exists and is accessible at /robots.txt
  • It doesn't block access to important resources (CSS, JS, images)
  • It doesn't block pages that should be indexed
  • It contains a link to the sitemap: Sitemap: https://yourdomain.com/sitemap.xml
  • It blocks pages that should NOT be indexed (admin panel, test versions, pages with parameters)

Common mistake: Developers add Disallow: / on staging and forget to remove it after deploying to production. One line blocks the entire site.
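A minimal robots.txt along these lines covers the points above (the domain and blocked paths are placeholders; adjust them to your own site):

```
# Allow crawling of everything except the admin panel and internal search
User-agent: *
Disallow: /admin/
Disallow: /search?

Sitemap: https://yourdomain.com/sitemap.xml
```

Note what is absent: no Disallow rules for /css/, /js/, or image directories, and no stray Disallow: / left over from staging.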

XML Sitemap

  • Sitemap XML exists and is valid (validation in Google Search Console)
  • Contains ONLY pages that should be indexed (status 200, no noindex)
  • <lastmod> dates reflect actual content changes (not the sitemap generation date)
  • Contains no more than 50,000 URLs (Google's limit — split into smaller files)
  • Is registered in Google Search Console
  • Is generated automatically when pages are added/removed
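For reference, a valid sitemap entry looks like this (the URL and date are placeholders; <lastmod> should change only when the page content actually changes, not on every regeneration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/blog/technical-seo-checklist/</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
</urlset>
```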

Crawl Budget

  • The site doesn't generate an excessive number of URLs (filters, sorting, pagination)
  • URL parameters that don't create unique content have noindex or canonical
  • Chain redirects (A → B → C) are shortened to direct ones (A → C)
  • No redirect loops (A → B → A)
  • Pages with 404 status are not internally linked
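The chain-shortening rule (A → B → C becomes A → C) can be sketched as a small script. This illustrates the logic only, not a crawler; the `redirects` map stands in for a hypothetical export from a crawl tool such as Screaming Frog:

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Resolve each source URL to its final destination, collapsing chains.

    Raises ValueError on a redirect loop (A -> B -> A), which the
    checklist says must not exist at all.
    """
    flat = {}
    for src in redirects:
        seen = {src}
        target = redirects[src]
        while target in redirects:  # follow the chain to its end
            if target in seen:
                raise ValueError(f"redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flat[src] = target
    return flat

# A -> B -> C is collapsed so both A and B point straight at C
chains = {"/old-a": "/old-b", "/old-b": "/final-c"}
print(flatten_redirects(chains))  # {'/old-a': '/final-c', '/old-b': '/final-c'}
```

The flattened map is what you would then deploy as direct 301 rules on the server.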

2. Indexing — Has Google Indexed the Right Pages?

Indexing is the process where Google analyzes a page's content and adds it to its index. A page that isn't in the index won't appear in search results.

Indexing Status

  • Coverage/Indexing report checked in Google Search Console
  • All important pages have "Valid" status (indexed)
  • Pages with "Excluded" status have a justified reason for exclusion
  • No unexpected pages with noindex
  • A site:yourdomain.com test returns the expected number of results

If you have indexing issues, check our guide on how to speed up indexing in Google.

Canonical Tags

  • Every page has a <link rel="canonical"> tag
  • Canonical URL points to itself (self-referencing) or to the canonical page
  • Canonical is consistent with HTTP/HTTPS version and with/without www
  • Pages with URL parameters have canonical to the version without parameters
  • Canonical is in <head>, not in <body>
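A canonical on a filtered URL, per the points above (the URLs are placeholders), looks like this and belongs in <head>:

```html
<!-- On https://yourdomain.com/category/?sort=price the canonical
     points at the clean, parameter-free version of the URL -->
<link rel="canonical" href="https://yourdomain.com/category/">
```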

Meta Robots

  • Pages to be indexed don't have the noindex tag
  • Private/technical pages have noindex, nofollow
  • No conflicts between robots.txt and meta robots (if robots.txt blocks a page, Google can't see its noindex tag and may still index the URL based on external links)
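For private or technical pages, the tag from the checklist looks like this. Note that Google must be able to crawl the page to see it, so don't also block the same URL in robots.txt:

```html
<!-- On pages that must stay out of the index: admin, internal search, test pages -->
<meta name="robots" content="noindex, nofollow">
```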

3. URL Architecture and Site Structure

URL structure affects both SEO and user experience. Well-designed URLs communicate the topic of a page before the user clicks.

URLs

  • URLs are short, descriptive, and contain keywords
  • Use hyphens as separators (not underscores)
  • Don't contain session parameters, identifiers, or unnecessary directories
  • Consistent trailing slash (either all with / or all without)
  • No uppercase letters in URLs
  • UTF-8 encoding for special characters (or transliteration)
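The URL rules above (lowercase, hyphens as separators, transliterated special characters) can be sketched as a slug function. This is an illustrative helper under those assumptions, not a drop-in library:

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a URL slug following the checklist rules."""
    # Transliterate accented characters to plain ASCII where possible;
    # characters with no ASCII decomposition are simply dropped
    text = unicodedata.normalize("NFKD", title)
    text = text.encode("ascii", "ignore").decode("ascii")
    text = text.lower()
    # Replace any run of non-alphanumeric characters with a single hyphen
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

print(slugify("Technical SEO — Complete 2026 Checklist"))
# technical-seo-complete-2026-checklist
```

The result is short, lowercase, hyphen-separated, and free of session parameters, exactly as the checklist requires.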

Navigation and Internal Linking

  • Every page is reachable within max 3 clicks from the homepage
  • Breadcrumbs with BreadcrumbList structured data
  • Main navigation is crawlable (not rendered solely by JavaScript)
  • No orphan pages (pages that no internal links point to)
  • Logical hierarchy: homepage → categories → subpages
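A BreadcrumbList for the hierarchy above (homepage → category → subpage) might look like this in JSON-LD; the names and URLs are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://yourdomain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://yourdomain.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Checklist" }
  ]
}
```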

Redirects

  • Old URLs have 301 redirects to new ones
  • No redirect chains (max 1 hop)
  • No 302 redirects where 301 (permanent) should be used
  • Pages after redesign/migration have complete URL mapping old → new
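As a sketch, a direct 301 mapping on nginx might look like this (the paths are placeholders; on Apache the equivalent is a Redirect 301 line in .htaccess):

```nginx
# One hop, permanent: /old-page goes straight to the new URL
location = /old-page {
    return 301 https://yourdomain.com/new-page;
}
```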

4. Core Web Vitals and Performance

Since 2021, Core Web Vitals have been a ranking factor. In 2026, requirements are even more stringent — Google rewards sites that offer excellent user experience. Learn more about the impact of CWV on rankings in the article on Core Web Vitals and positioning.

LCP (Largest Contentful Paint)

Target: under 2.5 seconds

  • The largest element (hero image, heading) loads in under 2.5s
  • Hero image has preload in <head>
  • Images are in modern formats (WebP, AVIF) with fallback
  • Critical CSS is inlined in <head>
  • Fonts have font-display: swap or optional
  • Server responds in under 600ms (TTFB)
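Two of the LCP items above in HTML form (file names are placeholders):

```html
<head>
  <!-- Fetch the hero image early, before the parser discovers it -->
  <link rel="preload" as="image" href="/images/hero.webp">
  <!-- Show fallback text immediately instead of waiting for the web font -->
  <style>
    @font-face {
      font-family: "BodyFont";
      src: url("/fonts/body.woff2") format("woff2");
      font-display: swap;
    }
  </style>
</head>
```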

INP (Interaction to Next Paint)

Target: under 200ms

  • Clicks and interactions respond in under 200ms
  • No long tasks (blocking JavaScript above 50ms) during loading
  • Event handlers don't perform heavy operations on the main thread
  • Third-party scripts load asynchronously
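Loading third-party scripts without blocking the main thread, per the last item (the script URL is a placeholder):

```html
<!-- async: download in parallel, execute as soon as the script is ready -->
<script src="https://example-analytics.com/tracker.js" async></script>
<!-- defer: download in parallel, execute only after HTML parsing finishes -->
<script src="/js/non-critical.js" defer></script>
```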

CLS (Cumulative Layout Shift)

Target: under 0.1

  • Images and video have defined dimensions (width and height)
  • Ads and embeds have reserved space
  • Fonts don't cause layout shifts
  • Dynamically loaded content doesn't push existing elements
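Reserving space up front, per the CLS checklist (the dimensions are examples):

```html
<!-- width/height let the browser reserve the box before the image loads -->
<img src="/images/chart.webp" width="800" height="450" alt="Core Web Vitals trend chart">

<!-- a fixed-height wrapper stops a late-loading ad from pushing content down -->
<div style="min-height: 250px">
  <!-- ad slot renders here -->
</div>
```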

General Performance

  • Lighthouse Performance score above 90 (mobile)
  • GZIP or Brotli compression enabled on the server
  • Browser caching configured (Cache-Control headers)
  • CDN serves static assets
  • Lazy loading for images below the fold
  • CSS and JavaScript minification
  • No unused CSS/JS (tree shaking, code splitting)
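A sketch of the compression and caching items on nginx (the values are examples, not universal recommendations; long max-age assumes fingerprinted file names):

```nginx
# Compress text-based assets before sending them
gzip on;
gzip_types text/css application/javascript image/svg+xml;

# Let browsers cache static assets for a year
location /static/ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
```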

5. HTTPS and Security

HTTPS has been a ranking factor since 2014. In 2026, a site without HTTPS is a site that shouldn't exist in search results — Chrome marks it as "Not Secure."

  • The entire site is on HTTPS (not just login/cart)
  • SSL certificate is valid and hasn't expired
  • No mixed content (HTTP resources on an HTTPS page)
  • 301 redirect from HTTP to HTTPS (not 302)
  • Redirect from non-www to www version (or vice versa — consistently)
  • HSTS header enabled (Strict-Transport-Security)
  • Certificate covers all subdomains (wildcard or separate certs)
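The HTTP-to-HTTPS redirect and the HSTS header from the checklist, sketched for nginx (domain is a placeholder):

```nginx
# Permanent 301 redirect of all plain-HTTP traffic to HTTPS
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;
    return 301 https://yourdomain.com$request_uri;
}

# Inside the HTTPS server block: tell browsers to use HTTPS for a year
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
```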

6. Structured Data (Schema.org)

Structured data helps Google understand a page's content and can generate rich snippets in search results — stars, FAQ, breadcrumbs, logo.

Required Schema

  • Organization — company name, logo, contact, social media
  • WebSite — with SearchAction for sitelinks searchbox
  • BreadcrumbList — on every subpage
  • Article / BlogPosting — on blog articles
  • LocalBusiness — if you have a physical location
  • FAQPage — on pages with a FAQ section
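An Organization block in JSON-LD, the format Google prefers (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Company",
  "url": "https://yourdomain.com/",
  "logo": "https://yourdomain.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/your-company"
  ]
}
```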

Optional Schema (but valuable)

  • HowTo — on step-by-step guides
  • Product — on product pages (e-commerce)
  • Review / AggregateRating — reviews and ratings
  • VideoObject — on pages with embedded video
  • SoftwareApplication — for SaaS products

Validation

  • Schema is in JSON-LD format (preferred by Google)
  • Validation in Google Rich Results Test — no errors
  • Validation in Schema.org Validator — no warnings
  • Schema doesn't contain content invisible to users (cloaking)

7. Mobile-First and Responsiveness

Google primarily indexes the mobile version of a site. If your site looks great on desktop but poor on mobile — Google sees the poor version.

  • Site is fully responsive (no separate m.yourdomain.com version)
  • Viewport meta tag: <meta name="viewport" content="width=device-width, initial-scale=1">
  • Text is readable without zooming (min 16px font-size)
  • Buttons and links have sufficient touch target (min 48x48px)
  • No horizontal scrolling on mobile
  • Mobile menu is crawlable (not blocked by JS)
  • Content is identical on mobile and desktop (don't hide content on mobile)
  • Pop-ups don't block content on mobile (interstitial penalty)

8. JavaScript Rendering

Modern websites often rely on JavaScript — React, Next.js, Vue, Angular. Google renders JavaScript, but with delays and not always perfectly.

  • Critical content is available without JavaScript (SSR or SSG)
  • Internal links use <a href> tags (not JavaScript onClick navigation)
  • Meta tags (title, description, canonical) are rendered server-side
  • Structured data is in HTML (not generated client-side)
  • "View Page Source" test shows full content (not an empty <div id="root">)
  • Google Search Console → URL Inspection → "View Crawled Page" shows correct content

Tools for a Technical SEO Audit

A self-conducted technical SEO audit requires the right tools. Here's a summary:

Tool                      | Free?          | What it's for
Google Search Console     | Yes            | Indexing, crawl errors, Core Web Vitals, performance
Google PageSpeed Insights | Yes            | Core Web Vitals, Lighthouse score
Chrome DevTools           | Yes            | Rendering, performance, network, Lighthouse
Screaming Frog            | Up to 500 URLs | Site crawl, meta tag analysis, redirects, hreflang
Ahrefs Site Audit         | No             | Comprehensive technical audit with prioritization
Schema Validator          | Yes            | Structured data validation
Mobile-Friendly Test      | Yes            | Responsiveness test
GTmetrix                  | Yes            | Performance, waterfall, Core Web Vitals

Audit Workflow

  1. Crawl the site — Screaming Frog or Ahrefs Site Audit
  2. Check indexing — Google Search Console → Coverage
  3. Test performance — PageSpeed Insights on 5 key pages
  4. Validate schema — Rich Results Test on each page type
  5. Check mobile — Chrome DevTools → Device Mode at 3 resolutions
  6. Verify redirects — map of old URLs + test in Screaming Frog

Priorities — Where to Start?

You don't need to fix everything at once. Prioritize by impact on traffic:

Priority 1 — Critical (do immediately):

  • Pages with noindex that should be indexed
  • Blocks in robots.txt
  • 5xx and 4xx errors on important pages
  • No HTTPS
  • Chain redirects

Priority 2 — High (within a week):

  • Core Web Vitals below thresholds
  • No structured data
  • Duplicate canonicals
  • Broken internal links

Priority 3 — Medium (within a month):

  • Sitemap optimization
  • Orphan pages
  • Image optimization
  • Hreflang (if multilingual)

Priority 4 — Ongoing (monitoring):

  • Regular Coverage checks in GSC
  • Core Web Vitals monitoring
  • Schema validation after changes
  • Testing new pages before publishing

Summary

Technical SEO is not a one-time project — it's a continuous process. Algorithms change, the site evolves, new pages appear. We recommend a full technical audit every 6 months and continuous monitoring in Google Search Console.

This checklist is a starting point. If you want a professional SEO audit with prioritization and a remediation plan, contact us — we'll go through your site point by point and show you what to fix so Google starts loving you.
