Core Web Vitals — How Site Speed Affects Google Rankings
Core Web Vitals is a set of three Google metrics (LCP, INP, CLS) measuring loading speed, interactivity, and visual stability of a page.
Google Said It Outright: Speed = Rankings
In 2021, Google officially rolled out Core Web Vitals as a ranking factor in the Page Experience update. Not a suggestion. Not "one of many signals." An official ranking factor built into the algorithm.
What does this mean in practice? Two pages with identical content and links — the faster one wins.
And in 2026, when AI Overviews (the successor to SGE) are reshaping the SERP, speed matters even more. Google increasingly selects fast pages for its snippets because users expect instant access.
The Three Core Web Vitals Metrics
1. LCP — Largest Contentful Paint
What it measures: The time from page entry to rendering the largest visible element on screen (hero image, headline, video).
Thresholds:
- 🟢 Good: <2.5 seconds
- 🟡 Needs improvement: 2.5-4 seconds
- 🔴 Poor: >4 seconds
Why it matters: LCP is the moment when the user sees the "real" page. Before LCP, they see a blank page or fragments. Google's own research found that 53% of mobile visits are abandoned when a page takes longer than 3 seconds to load.
Common problems:
- Large hero images without optimization (5 MB PNG instead of 100 KB WebP)
- Slow server (TTFB >600ms)
- Render-blocking CSS/JS — the browser can't display the page because it's loading scripts
- Third-party scripts (chat, analytics, ads) blocking rendering
- No CDN — server in the US, user in Europe
How to fix:
- Images: WebP/AVIF, lazy loading (outside first viewport), preload hero image, srcset with responsive sizes
- Server: TTFB <200ms — consider CDN, edge rendering, SSG (Next.js generates static files)
- CSS: Inline critical CSS, async load the rest
- JS: Defer/async, code splitting, remove unused libraries
- Fonts: `font-display: swap`, preload for critical fonts
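Most of the LCP fixes above live in the document head. A minimal sketch (file paths, sizes, and the font name are placeholders):

```html
<head>
  <!-- Preload the hero image so the browser fetches it early -->
  <link rel="preload" as="image" href="/hero.webp" fetchpriority="high">
  <!-- Preload a critical font; crossorigin is required for font preloads -->
  <link rel="preload" as="font" href="/fonts/body.woff2" type="font/woff2" crossorigin>
  <!-- Inline only the CSS needed for the first viewport -->
  <style>/* critical CSS here */</style>
  <!-- Everything else loads without blocking rendering -->
  <script src="/app.js" defer></script>
</head>
<body>
  <!-- srcset lets the browser pick the smallest sufficient file -->
  <img src="/hero-800.webp"
       srcset="/hero-400.webp 400w, /hero-800.webp 800w, /hero-1600.webp 1600w"
       sizes="100vw" width="1600" height="900" alt="Hero">
</body>
```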
2. INP — Interaction to Next Paint
What it measures: The time from a user interaction (click, tap, key press) to a visible page reaction. Replaced FID (First Input Delay) in March 2024.
Thresholds:
- 🟢 Good: <200 milliseconds
- 🟡 Needs improvement: 200-500 milliseconds
- 🔴 Poor: >500 milliseconds
Why it matters: INP measures responsiveness. You click a button — how long until something changes on screen? 200ms is the threshold below which an interaction feels "instant." Above that — you feel the delay.
Common problems:
- Heavy JavaScript blocking the main thread
- Event handlers with heavy computations
- Too many re-renders (React without memoization)
- Third-party scripts (chat widgets, analytics)
- Overly large or deep DOM tree (>1,500 nodes)
How to fix:
- Less JS: Code splitting, tree shaking, remove unused libraries
- Web Workers: Move heavy computations off the main thread
- React: useMemo, useCallback, React.memo, lazy loading components
- Debounce/throttle: event handlers, scroll listeners
- Third-party audit: Measure the impact of each external script
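To illustrate the debounce/throttle point, here is a minimal throttle that caps how often an expensive handler can run. The 100 ms window is an arbitrary example value:

```javascript
// Minimal throttle: invoke fn at most once per `limit` milliseconds.
// Calls arriving inside the window are dropped, keeping the main
// thread free to respond to input.
function throttle(fn, limit) {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= limit) {
      last = now;
      fn(...args);
    }
  };
}

// Example: a handler that would otherwise fire on every scroll event
// now runs at most ~10 times per second.
let calls = 0;
const onScroll = throttle(() => { calls += 1; }, 100);
onScroll(); // executes immediately
onScroll(); // dropped: still inside the 100 ms window
```

Debounce is the complementary pattern: instead of running at most once per window, it waits until the events stop before running once.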
3. CLS — Cumulative Layout Shift
What it measures: The visual instability of a page. It captures how much visible content shifts ("jumps") during loading.
Thresholds:
- 🟢 Good: <0.1
- 🟡 Needs improvement: 0.1-0.25
- 🔴 Poor: >0.25
Why it matters: The worst UX is trying to click a button that suddenly shifts 200px down because an ad banner loaded. CLS measures exactly this problem.
Common problems:
- Images and video without defined dimensions (width/height)
- Dynamically loaded ads and embeds
- Web fonts causing FOUT/FOIT (flash of unstyled/invisible text)
- Dynamically injected content (banners, cookie consent)
- CSS animations changing element dimensions
How to fix:
- Always define dimensions: `<img width="800" height="600">` or `aspect-ratio` in CSS
- Reserve space: Placeholder for dynamic content (ads, embeds)
- Font-display: `optional` or `swap` with a good fallback font
- Avoid insertBefore: Don't inject elements above existing content
- Transform animations: Use `transform` instead of `top`/`left`/`width`/`height`
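The dimension and animation rules above can be sketched in CSS (the class names and the 250px ad height are illustrative):

```css
/* Reserve space for media before it loads: nothing shifts on arrival */
img, video {
  aspect-ratio: 16 / 9;
  width: 100%;
  height: auto;
}

/* Give ads/embeds a fixed slot so they can't push content down */
.ad-slot {
  min-height: 250px;
}

/* Animate with transform (compositor-only). Animating top/left/width/
   height forces layout and can shift surrounding content. */
.slide-in {
  transition: transform 0.3s ease;
  transform: translateY(0);
}
```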
How to Measure Core Web Vitals
Tools with Real Data (Field Data)
Real User Monitoring — data from real users.
| Tool | Cost | What It Provides |
|---|---|---|
| Google Search Console | Free | CWV report, problematic URLs |
| PageSpeed Insights | Free | CrUX data + Lighthouse lab data |
| CrUX Dashboard | Free | Historical 28-day data |
| Chrome UX Report | Free | BigQuery, raw data |
Lab Tools (Lab Data)
Simulated tests — repeatable, controlled conditions.
| Tool | Cost | What It Provides |
|---|---|---|
| Lighthouse (Chrome DevTools) | Free | Full performance report |
| WebPageTest | Free | Waterfall, filmstrip, multi-location |
| Chrome DevTools Performance | Free | Flame chart, main thread analysis |
Continuous Monitoring
| Tool | Cost | What It Provides |
|---|---|---|
| Vercel Analytics | Free/Pro | Real-time CWV for Next.js |
| Sentry Performance | Free/Pro | Error tracking + performance |
| SpeedCurve | Paid | Continuous monitoring, alerts |
Pro tip: Field data > Lab data. Lighthouse in DevTools gives quick answers, but ranking is based on field data from CrUX.
CWV Impact on Rankings — Data
SearchPilot study (2024) on 500,000 pages:
- Pages with good CWV (all 3 green) have on average 12% higher CTR than pages with poor CWV
- Improving LCP from 4s to 2s correlates with a ranking improvement of 2-5 positions for medium-competition keywords
- Pages with CLS >0.25 have 15% higher bounce rate than pages with CLS <0.1
Note: CWV is a tiebreaker, not a game changer. If your content is 10x better than the competition, you'll win despite a slow site. But if content is comparable — CWV decides.
How to Quickly Improve CWV (Quick Wins)
5 minutes — immediate results:
- Add `width` and `height` to all `<img>` (CLS)
- Add `loading="lazy"` to images outside the first viewport (LCP)
- Add `font-display: swap` in CSS (CLS)
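All three 5-minute fixes in one place (file names are placeholders):

```html
<!-- width/height let the browser reserve space before loading (CLS);
     loading="lazy" defers off-screen images (LCP) -->
<img src="/team.webp" width="800" height="600" loading="lazy" alt="Team">

<style>
  @font-face {
    font-family: "Body";
    src: url("/fonts/body.woff2") format("woff2");
    /* Show fallback text immediately instead of invisible text */
    font-display: swap;
  }
</style>
```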
30 minutes — significant improvement:
- Convert images to WebP (LCP — 25-35% size reduction)
- Move third-party scripts to `defer` or `async` (INP)
- Inline critical CSS (LCP)
- Preload hero image and critical fonts (LCP)
A few hours — foundation:
- Implement a CDN (LCP — TTFB)
- Enable HTTP/2 or HTTP/3 (LCP)
- Audit and remove unused JS libraries (INP)
Major change — architectural:
- Migrate to SSG/SSR (Next.js) — LCP <1s as standard
- Image optimization pipeline (responsive images, AVIF)
- Edge rendering (Vercel, Cloudflare Workers)
CWV and Next.js
Why does Next.js win against most WordPress/Wix/Squarespace sites on CWV?
| Mechanism | CWV Effect |
|---|---|
| Static Site Generation (SSG) | LCP: HTML prebuilt at build time, zero render work per request |
| Automatic code splitting | INP: Only load JS needed for the current page |
| Next/Image | LCP + CLS: Automatic optimization, WebP, lazy loading, dimensions |
| Next/Font | CLS: Zero layout shift with fonts |
| Edge Runtime | LCP: Rendering close to user, TTFB <50ms |
| Streaming SSR | LCP: Progressive rendering — user sees content sooner |
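A sketch of how two of these mechanisms look in code, assuming a standard Next.js App Router project (the file path, image, and font choice are illustrative):

```tsx
// app/page.tsx
import Image from "next/image";
import { Inter } from "next/font/google";

// next/font self-hosts the font and sizes its fallback: no layout shift
const inter = Inter({ subsets: ["latin"] });

export default function Home() {
  return (
    <main className={inter.className}>
      {/* next/image emits width/height and a srcset automatically;
          priority preloads the hero image for LCP */}
      <Image src="/hero.jpg" alt="Hero" width={1600} height={900} priority />
    </main>
  );
}
```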
A typical Next.js site with our optimizations: Lighthouse 95-100 on mobile. No tricks, no cache, no premium CDN.
CWV Checklist
LCP (<2.5s)
- Hero image in WebP/AVIF, <200 KB
- Preload hero image
- TTFB <200ms
- Critical CSS inline
- No render-blocking JS
INP (<200ms)
- Main thread not blocked >200ms
- Third-party scripts defer/async
- Event handlers optimized
- Code splitting active
CLS (<0.1)
- All images have width/height
- Fonts with font-display: swap/optional
- No dynamically injected content above the fold
- Ads/embeds with reserved space
Summary
Core Web Vitals is not a buzzword — it's a measurable ranking factor with concrete thresholds. A site with good CWV ranks higher, has a lower bounce rate, and a higher CTR.
The fastest path to green CWV? Modern technology (Next.js), optimized images, minimal JavaScript, and a CDN.
Want to check your site's CWV? Request a free performance audit — we'll measure and show you what to improve.