
Google CSE Shutdown: Developer Impact and Migration

Google PSE removes web-wide search by January 2027. Migration paths from the free 100-queries/day tier to paid SERP APIs with richer data.

6 min read

Google Custom Search Engine (Programmable Search Engine) is removing web-wide search capability by January 2027, forcing developers who relied on it for general web queries to migrate to paid SERP APIs. The free 100 queries/day tier that powered thousands of side projects and internal tools is effectively dead for anything beyond site-restricted search.

What exactly is changing

Google PSE will continue to work for searching within specific domains you configure, but the "search the entire web" toggle is being deprecated. Any application using the cx parameter with whole-web search enabled will stop returning results. Google has not announced a replacement tier.

Who this breaks

  • Internal research tools using PSE for general web queries
  • AI agents using Google CSE as their web search backend
  • Side projects and MVPs built on the free 100/day tier
  • n8n and Zapier workflows calling PSE HTTP nodes
  • Open source projects shipping PSE as the default search provider
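If you are unsure whether a project in the list above is affected, scanning the source tree for the CSE endpoint usually surfaces it quickly. A minimal sketch (the marker string is the real CSE endpoint; the function name and file-walking strategy are illustrative):

```python
import os

# The endpoint string that marks a dependency on Google CSE.
CSE_MARKER = "googleapis.com/customsearch/v1"

def find_cse_usage(root="."):
    """Walk a source tree and list files that reference the CSE endpoint."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if CSE_MARKER in f.read():
                        hits.append(path)
            except OSError:
                # Unreadable file (permissions, special file): skip it.
                continue
    return hits
```

Run it at the repo root; any hit is a call site that needs a migration plan before the cutoff.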

Migration options compared

Pricing for the equivalent of PSE free tier (100 queries/day, ~3K/month):

  • Brave Search API: $5 free credit/mo covers ~1K queries, then $5/1K
  • Serper: 2,500 free/mo (Google only), $50/yr Dev plan for 50K
  • Tavily: 1K free/mo, $30/mo Researcher plan
  • Scavio: 250 free/mo, $30/mo for 7K credits ($0.005/credit)
  • Exa: 1K free/mo, $5/1K searches
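At PSE-replacement volume, the overage math matters more than the sticker price. A rough estimator using the numbers listed above (free quotas and per-1K rates are as of writing and may change):

```python
def monthly_cost(queries, free_quota, flat_fee=0.0, per_1k=0.0):
    """Estimate monthly cost: flat plan fee plus metered overage."""
    overage = max(0, queries - free_quota)
    return flat_fee + (overage / 1000) * per_1k

QUERIES = 3000  # ~100/day, the old PSE free tier

# Metered providers: pay only for queries beyond the free quota.
print("Brave:", monthly_cost(QUERIES, free_quota=1000, per_1k=5))  # 2K overage -> $10
print("Exa:  ", monthly_cost(QUERIES, free_quota=1000, per_1k=5))  # 2K overage -> $10
# Flat plans cost their fee as long as 3K fits inside the plan quota.
print("Scavio:", monthly_cost(QUERIES, free_quota=7000, flat_fee=30))
```

The takeaway: at ~3K/month, metered providers land around $10/month while flat plans charge their full fee, so low-volume projects should compare overage rates, not plan prices.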

Minimum-change migration

Replace the Google CSE HTTP call with a structured SERP API. The response shapes are similar -- both return an array of objects with title, link, and snippet fields. Here is a drop-in replacement:

Python
import os, requests

# Before: Google CSE
def google_cse_search(query):
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={
            "key": os.environ["GOOGLE_API_KEY"],
            "cx": os.environ["GOOGLE_CX"],
            "q": query,
        },
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    return [{"title": i["title"], "link": i["link"],
             "snippet": i["snippet"]} for i in items]

# After: Scavio SERP API (same output shape)
def scavio_search(query):
    resp = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers={"x-api-key": os.environ["SCAVIO_API_KEY"]},
        json={"query": query, "num_results": 10},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("organic_results", [])
    return [{"title": r["title"], "link": r["link"],
             "snippet": r["snippet"]} for r in results]

What you gain in migration

Google CSE returned basic organic results only. Structured SERP APIs also return AI Overviews, People Also Ask, knowledge panels, local packs, shopping results, and image results -- all parsed into JSON fields. If your application was working around CSE limitations, migration actually unlocks more data.

Python
import os, requests

# Get AI Overview + PAA + organic in one call
resp = requests.post(
    "https://api.scavio.dev/api/v1/search",
    headers={"x-api-key": os.environ["SCAVIO_API_KEY"]},
    json={
        "query": "best project management tools 2026",
        "num_results": 10,
        "include_ai_overview": True,
    },
    timeout=10,
)
data = resp.json()

organic = data.get("organic_results", [])
ai_overview = data.get("ai_overview", {})
paa = data.get("people_also_ask", [])

print(f"Organic results: {len(organic)}")
print(f"AI Overview present: {bool(ai_overview)}")
print(f"PAA questions: {len(paa)}")

Timeline and action items

Google has not given an exact cutoff date beyond "January 2027." Do not wait for the deadline. Migrate now while you can test both backends in parallel and verify output parity. Start with a feature flag that routes a percentage of traffic to the new API, then cut over once you have confirmed result quality.
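The feature-flag rollout described above can be sketched as a thin router in front of the two search functions. A minimal example (backend names and the 10% rollout fraction are illustrative, not prescriptive):

```python
import random

ROLLOUT_FRACTION = 0.10  # share of traffic sent to the new backend

def search(query, old_backend, new_backend, fraction=ROLLOUT_FRACTION):
    """Route a fraction of queries to the new API; fall back on errors."""
    if random.random() < fraction:
        try:
            return new_backend(query)
        except Exception:
            # New backend misbehaving: fall back so users never notice.
            return old_backend(query)
    return old_backend(query)
```

Log both backends' results for the routed slice, compare title/link overlap offline, then ratchet `fraction` toward 1.0 as confidence grows.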