ARDURA Lab
10 min read

Google Indexing API — complete tutorial with ready-to-use script [2026]

Tags: Indexing API, Google Search Console, indexing, technical SEO, Python
Marcin Godula

CEO & Founder, ARDURA Lab

SEO, GEO, and web development specialist with over 15 years of experience. He helps B2B companies build visibility in both classic and AI search engines.

Indexing API is Google's official tool for force-crawling URLs — in practice it works for any type of site (despite documentation suggesting "JobPosting only"). This tutorial covers a from-scratch setup in 10 minutes, a ready-to-use Python script (~30 lines), and a case study of 109 URLs from a parenting blog in which 100% of submissions succeeded.

TL;DR — Google Indexing API in 5 points

# | What | Details
1. Setup | GCP project + enable Indexing API + Service Account | 5 min
2. GSC | Service Account email as Owner (NOT Full) in the property | 1 min
3. Quota | 200 requests/day per project | workaround: spread across days
4. Endpoint | urlNotifications.publish with {url, type: URL_UPDATED} | scope: indexing
5. Reality | Supports every URL despite "officially JobPosting only" | verified on 109 URLs

Get an SEO audit → if your site has >100 unindexed URLs and you need a force-crawl strategy.


What Indexing API is (and why official documentation is misleading)

Indexing API is Google's programmatic interface for notifying Googlebot that a URL has been updated and should be crawled. It works like the URL Inspection feature in Google Search Console, but in bulk: you can submit 200 URLs daily instead of inspecting individual URLs manually.

Official scope vs real scope

Google's documentation declares:

"The Indexing API allows any site owner to directly notify Google when pages are added or removed. Currently, the Indexing API can only be used to crawl pages with either job posting or broadcast event markup."

This quote discourages most teams from using the API for blogs, e-commerce, or service pages. In practice, the urlNotifications.publish endpoint accepts every URL and responds 200 OK. Google does not enforce at runtime that the submitted URL must have JobPosting/BroadcastEvent schema.

Empirical data (case study dzieckologia.pl, May 2026)

In our engagement for the parenting blog dzieckologia.pl (Astro 4.16, 53 PL + 53 EN articles, a fresh domain with no authority), we submitted 109 URLs (parenting articles plus pillar pages) via the Indexing API:

  • Success: 109/109 (100%)
  • Content type: parenting blog posts (Article schema, NOT JobPosting)
  • Endpoint responses: all 200 OK with urlNotificationMetadata
  • Outcome after 14 days: baseline indexation was 16% (20/128); the post-submission measurement is scheduled for 2026-05-17

Empirical conclusion: Indexing API is a real force-crawl tool for any site — not just job postings. The official disclaimer in the documentation is likely a legal safeguard, not a technical limitation.

When does Indexing API make sense?

Use case | Worth it? | Why
Fresh domain (<6 months) | ✅ YES | No crawl budget, sitemap insufficient
After migration / rebrand | ✅ YES | 100+ new URLs at once
After publishing new blog posts | ✅ YES | Same-day force-crawl
Single updates (1-3 URLs) | ❌ NO | URL Inspection in GSC is faster
Already indexed URLs (refresh) | ⚠️ Carefully | API is for new content, not multi-week refreshing

Setup in 10 minutes — step by step

Step 1: Google Cloud Console — project + API

  1. Open console.cloud.google.com
  2. Create a new project (e.g., seo-indexing-tools)
  3. From the menu choose APIs & Services → Library
  4. Search for Indexing API and click Enable
  5. You'll be redirected to the API panel — that confirms enablement

Time: ~3 min.

Step 2: Service Account + JSON key

  1. Now go to APIs & Services → Credentials
  2. Click Create Credentials → Service Account
  3. Name it e.g., seo-indexing-bot, description optional
  4. Skip role assignment (we don't need a GCP-level role)
  5. After creation, click on the Service Account → Keys → Add Key → JSON
  6. The JSON file downloads automatically. Store it as gsc-service-account.json in a safe location (DO NOT commit to git)

Time: ~2 min. Note: the Service Account email has the format name@project-id.iam.gserviceaccount.com — save it, you'll use it in step 3.
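
If you're not sure which address to copy, you can read it straight from the downloaded key file. A minimal sketch, assuming the key was saved as gsc-service-account.json in the working directory:

import json
from pathlib import Path

# Read the Service Account email from the JSON key downloaded above
key = json.loads(Path("gsc-service-account.json").read_text())
print(key["client_email"])  # add this address in GSC in step 3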

Step 3: Google Search Console — Service Account as Owner

This is the most common setup mistake — most tutorials skip this step or incorrectly suggest "Full" instead of "Owner".

  1. Open search.google.com/search-console
  2. Select the property you want to use the API for (e.g., arduralab.com)
  3. Go to Settings (gear icon) → Users and permissions
  4. Click Add user
  5. Enter the Service Account email from step 2
  6. IMPORTANT: select Owner, not Full (Full is insufficient for Indexing API)
  7. Confirm

Time: ~1 min. After confirmation, the Service Account has permission to submit URLs from this property.

Step 4: Test setup — first request

from google.oauth2 import service_account
from googleapiclient.discovery import build

CREDENTIALS_PATH = "/path/to/gsc-service-account.json"
SCOPES = ["https://www.googleapis.com/auth/indexing"]  # Indexing API scope

# Authenticate as the Service Account added as Owner in GSC (step 3)
credentials = service_account.Credentials.from_service_account_file(
    CREDENTIALS_PATH, scopes=SCOPES
)
service = build("indexing", "v3", credentials=credentials)

# URL_UPDATED tells Google the page is new or has changed content
response = service.urlNotifications().publish(body={
    "url": "https://your-domain.com/blog/test-url",
    "type": "URL_UPDATED"
}).execute()

print(response)

Expected response:

{
  "urlNotificationMetadata": {
    "url": "https://your-domain.com/blog/test-url",
    "latestUpdate": {
      "url": "https://your-domain.com/blog/test-url",
      "type": "URL_UPDATED",
      "notifyTime": "2026-05-03T20:14:32.123456Z"
    }
  }
}

Most common errors:

  • 403 Permission denied → Service Account isn't Owner in GSC (step 3)
  • 404 Indexing API not enabled → API not enabled in GCP (step 1)
  • 400 Invalid argument → Invalid URL (must be full https://, no fragment #)
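
If you'd rather have the test script diagnose these for you, a small try/except around the publish call can map the status codes to the fixes above. A sketch, reusing the service object from the test script:

from googleapiclient.errors import HttpError

try:
    response = service.urlNotifications().publish(body={
        "url": "https://your-domain.com/blog/test-url",
        "type": "URL_UPDATED"
    }).execute()
except HttpError as e:
    if e.resp.status == 403:
        print("403: Service Account is not an Owner of the GSC property (step 3)")
    elif e.resp.status == 404:
        print("404: Indexing API is not enabled in the GCP project (step 1)")
    elif e.resp.status == 400:
        print("400: invalid URL, use a full https:// address without a #fragment")
    else:
        raise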

Ready-to-use Python script (bulk submission with rate limiting)

A script for submitting N URLs from a text file, with rate limiting, retry logic, and JSON logging.

#!/usr/bin/env python3
"""Bulk Indexing API request — pattern for 200 URLs/day quota."""
import json
import sys
import time
from datetime import date
from pathlib import Path

from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

CREDENTIALS_PATH = Path("/path/to/gsc-service-account.json")
SCOPES = ["https://www.googleapis.com/auth/indexing"]
RATE_LIMIT_SLEEP = 0.5  # 0.5s between requests = ~120 req/min
DAILY_QUOTA = 200

def submit_url(service, url: str) -> dict:
    """Submit 1 URL with retry on 5xx."""
    body = {"url": url, "type": "URL_UPDATED"}
    for attempt in range(5):
        try:
            return service.urlNotifications().publish(body=body).execute()
        except HttpError as e:
            if e.resp.status >= 500:
                time.sleep(2 ** attempt)
                continue
            raise
    raise RuntimeError(f"Failed after 5 retries: {url}")

def main(urls_file: str) -> None:
    urls = Path(urls_file).read_text().splitlines()
    urls = [u.strip() for u in urls if u.strip() and not u.startswith("#")]

    if len(urls) > DAILY_QUOTA:
        print(f"WARN: {len(urls)} URLs > daily quota {DAILY_QUOTA}. Truncating.")
        urls = urls[:DAILY_QUOTA]

    creds = service_account.Credentials.from_service_account_file(
        CREDENTIALS_PATH, scopes=SCOPES
    )
    service = build("indexing", "v3", credentials=creds, cache_discovery=False)

    results = []
    for i, url in enumerate(urls, 1):
        print(f"[{i}/{len(urls)}] {url}")
        try:
            response = submit_url(service, url)
            results.append({"url": url, "status": "OK", "response": response})
        except Exception as e:
            results.append({"url": url, "status": "ERROR", "error": str(e)})
        time.sleep(RATE_LIMIT_SLEEP)

    log_path = Path(f"indexing-log-{date.today().isoformat()}.json")
    log_path.write_text(json.dumps(results, indent=2))
    ok_count = sum(1 for r in results if r["status"] == "OK")
    print(f"\nDone: {ok_count}/{len(urls)} OK. Log: {log_path}")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "urls.txt")

urls.txt file format

https://your-domain.com/blog/post-1
https://your-domain.com/blog/post-2
# comments are ignored
https://your-domain.com/services/seo

Run

python3 indexing_bulk.py urls.txt

Output: JSON log with all requests and responses. You can parse the log into a dashboard or simply check which URLs returned errors.
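
For example, a few lines can pull the failed URLs out of the log and queue them for a retry the next day. A sketch, with a hypothetical log file name (adjust the date):

import json
from pathlib import Path

# Hypothetical log name produced by the script above
results = json.loads(Path("indexing-log-2026-05-03.json").read_text())
failed = [r["url"] for r in results if r["status"] != "OK"]
Path("urls-retry.txt").write_text("\n".join(failed))  # feed back into the script
print(f"{len(failed)} URLs to retry")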


When Indexing API isn't enough

The API is not a magic bullet: it opens the door to crawling, but it doesn't guarantee indexation or ranking.

Situations where Indexing API won't help

  1. Page has noindex in meta robots — Google crawls but won't index. First fix <meta name="robots">.
  2. Robots.txt blocks the URL — Googlebot doesn't crawl at all. Check robots.txt.
  3. Thin content — Google crawls, decides it's not worth indexing. Expand content to >800 substantive words.
  4. Duplicate content — Google ignores canonical conflicts. Configure <link rel="canonical">.
  5. Domain without authority — fresh domains without backlinks often have long crawl delays. Indexing API shortens delay but doesn't eliminate it. See our backlink strategy guide for details.
  6. Submitting all URLs in one day — Google rate-limits crawl even after the API signal. Spread 1000+ URLs across 7-14 days for effective crawling (a batching sketch follows after this list).
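
A simple way to enforce such a schedule is to pre-split the master URL list into daily files and run the bulk script on one file per day. A sketch, with hypothetical file names:

from pathlib import Path

DAILY_QUOTA = 200

# all-urls.txt is a hypothetical master list, one URL per line
urls = [u.strip() for u in Path("all-urls.txt").read_text().splitlines() if u.strip()]
for day, start in enumerate(range(0, len(urls), DAILY_QUOTA), 1):
    batch = urls[start:start + DAILY_QUOTA]
    Path(f"urls-day-{day}.txt").write_text("\n".join(batch))
    # then run one file per day: python3 indexing_bulk.py urls-day-1.txt, etc.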

Tools complementary to Indexing API

Tool | Goal | Complement to Indexing API
Sitemap.xml | Global URL map | 100% — sitemap says GO, API says NOW
GSC URL Inspection | Single URLs + diagnostics | 100% — different use cases
Internal linking | Crawl path for new URLs | 100% — internal links facilitate natural crawl
External backlinks | Authority + crawl trigger | 100% — backlinks accelerate crawl independently
IndexNow (Bing/Yandex) | Notification for Bing | 100% — complementary for Bing

FAQ — most common questions

Can I use Indexing API for a client's site?

Yes. The Service Account must be an Owner of the client's GSC property, which requires client access or authorization. One GCP project can handle 50+ clients (one Service Account, 50 properties, one shared 200/day quota).

Will Indexing API help after a negative SEO attack?

Partially. If URLs suddenly disappear from the index (e.g., after massive 4xx errors or a manual penalty), first fix the cause (4xx, content quality, manual action), then submit via Indexing API as URL_UPDATED. The API doesn't fix content issues; it only triggers a re-crawl.

What is type URL_DELETED?

urlNotifications.publish also accepts type: URL_DELETED, which informs Google that a URL has been removed. It's an alternative to a 410 Gone status code and <meta name="robots" content="noindex">. Practical use case: after deleting old blog posts in bulk (>50 URLs), you want to speed up their de-indexation.
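
The call is identical to the test in step 4 except for the type field. A minimal sketch with a placeholder URL, reusing the service object from step 4:

# Only the "type" changes compared to URL_UPDATED
response = service.urlNotifications().publish(body={
    "url": "https://your-domain.com/blog/removed-post",
    "type": "URL_DELETED"
}).execute()
print(response)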

Does Indexing API count in GSC as "Submitted via sitemap"?

No — in GSC Coverage Report, a URL submitted via Indexing API shows as "Submitted and indexed" if indexed, or "Discovered — currently not indexed" if Google crawled but didn't decide to index. Sitemap submission is a separate mechanism — best to use both complementarily.

How to monitor Indexing API effectiveness?

3 signals:

  1. Direct — JSON response from API contains latestUpdate.notifyTime (when Google accepted the signal)
  2. GSC URL Inspection — check URL status after 7-14 days (PASS = indexed, NEUTRAL = still not crawled)
  3. GSC Performance — whether URLs start collecting impressions (= ranking)

Fastest signal #1 → slower #2 → slowest #3. Full cycle: 1-21 days for most sites.
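
For signal #1 you don't even need the saved log: the API's urlNotifications.getMetadata endpoint returns the last notification Google recorded for a URL (note it reports notification history, not index status). A sketch, reusing the service object from step 4:

# Returns the most recent notification on file for this URL; this is NOT
# an index-status check, only confirmation that Google received the signal
meta = service.urlNotifications().getMetadata(
    url="https://your-domain.com/blog/test-url"
).execute()
print(meta)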

Can I submit the same URL weekly to "boost" ranking?

No. Submitting the same URL repeatedly in a short window doesn't boost ranking. The API makes sense for new or substantially changed content. Spam submissions waste quota and may lower trust signals.


What's next

  1. One-time setup — perform steps 1-3 once, even if you're not using the API yet. Setup takes 10 minutes and you'll have it ready for every future deploy.
  2. Integrate into the deploy pipeline — after git push, automatically call the Indexing API for new URLs: a CI/CD step in 5 lines of bash + the Python script above (a sitemap-diff sketch follows after this list).
  3. Re-check after 14 days — compare GSC Coverage before and after, measure indexed URL delta.
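One way to detect "new URLs" in CI is to diff the live sitemap against a state file kept between runs, then feed the difference to indexing_bulk.py. A sketch, assuming a standard sitemap.xml and hypothetical file names:

import xml.etree.ElementTree as ET
from pathlib import Path
from urllib.request import urlopen

SITEMAP_URL = "https://your-domain.com/sitemap.xml"  # adjust to your domain
STATE_FILE = Path("submitted-urls.txt")  # hypothetical state file kept between runs

# Collect all <loc> entries from the sitemap
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse(urlopen(SITEMAP_URL))
current = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

seen = set(STATE_FILE.read_text().splitlines()) if STATE_FILE.exists() else set()
new_urls = sorted(current - seen)

# Queue only the new URLs for indexing_bulk.py, then update the state file
Path("urls.txt").write_text("\n".join(new_urls))
STATE_FILE.write_text("\n".join(sorted(current)))
print(f"{len(new_urls)} new URLs queued for submission")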

Get an SEO audit

Your site has 100+ URLs in the sitemap but only 20% indexed? Get an SEO audit → we diagnose the causes (crawl budget, content quality, robots.txt, canonical) and implement a force-indexing strategy through the Indexing API plus complementary tools.
