Tutorial

How to Build an AI Overview Change Alert System

Monitor Google AI Overview changes daily. Detect new citations, removed citations, and text changes. Automated alerting pipeline with Python.

Build an AI Overview change alert system that establishes a baseline of AI Overview content for your target queries, runs daily searches with AI Overview parsing, diffs the current content against that baseline, and alerts when citations are added or removed, or when the summary text changes significantly. Google's AI Overview is now a primary traffic source for many queries, and citations can appear or disappear without warning. An automated monitoring system catches these changes within 24 hours instead of weeks.

Prerequisites

  • Python 3.8+ installed
  • requests library installed
  • A Scavio API key from scavio.dev
  • A list of target queries to monitor

Walkthrough

Step 1: Baseline queries

Run initial searches to establish a baseline of AI Overview content for each target query.

Python
import os, requests, json, datetime, hashlib

API_KEY = os.environ['SCAVIO_API_KEY']

QUERIES = [
    'best search api for developers',
    'how to add search to ai agent',
    'serp api comparison 2026',
]
BASELINE_FILE = 'ai_overview_baseline.json'

def fetch_ai_overview(query: str) -> dict:
    resp = requests.post('https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': API_KEY},
        json={'platform': 'google', 'query': query}, timeout=15)
    resp.raise_for_status()  # fail fast on auth, quota, or server errors
    data = resp.json()
    ai = data.get('ai_overview', {})
    if isinstance(ai, dict):
        return {
            'text': ai.get('text', ''),
            'citations': ai.get('citations', []),
        }
    return {'text': str(ai), 'citations': []}

def create_baseline(queries: list) -> dict:
    baseline = {}
    for q in queries:
        overview = fetch_ai_overview(q)
        baseline[q] = {
            'date': datetime.date.today().isoformat(),
            'text_hash': hashlib.md5(overview['text'].encode()).hexdigest(),
            'text': overview['text'][:500],
            'citations': overview['citations'],
        }
        print(f'{q}: {len(overview["text"])} chars, {len(overview["citations"])} citations')
    with open(BASELINE_FILE, 'w') as f:
        json.dump(baseline, f, indent=2)
    return baseline

baseline = create_baseline(QUERIES)
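Not every query triggers an AI Overview, and the field's shape can vary (the dict-vs-string handling in fetch_ai_overview reflects that). A defensive normalizer, sketched here as a standalone helper (the name normalize_overview is my own, not part of the API), keeps downstream diffing code from crashing on missing or oddly shaped payloads:

```python
def normalize_overview(data) -> dict:
    # Accepts the raw response body and always returns
    # {'text': str, 'citations': list}, whatever the payload looked like.
    ai = (data or {}).get('ai_overview')
    if ai is None:
        # The query did not trigger an AI Overview at all.
        return {'text': '', 'citations': []}
    if isinstance(ai, dict):
        return {
            'text': ai.get('text', '') or '',
            'citations': ai.get('citations', []) or [],
        }
    # Some payloads may return the overview as a bare string.
    return {'text': str(ai), 'citations': []}
```

An empty text with zero citations then simply means "no AI Overview for this query today", which the diff logic in Step 3 handles like any other change.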

Step 2: Daily search with AI Overview parsing

Run daily searches and parse the current AI Overview content for comparison.

Python
def daily_scan(queries: list) -> dict:
    current = {}
    for q in queries:
        overview = fetch_ai_overview(q)
        current[q] = {
            'date': datetime.date.today().isoformat(),
            'text_hash': hashlib.md5(overview['text'].encode()).hexdigest(),
            'text': overview['text'][:500],
            'citations': overview['citations'],
        }
    return current

current = daily_scan(QUERIES)
for q, data in current.items():
    print(f'{q}: {len(data["citations"])} citations')
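When monitoring many queries, it is safer against rate limits to pace the requests. A throttled scan loop, sketched with the fetch function injected so the pacing logic stays testable (scan_with_delay and the one-second default are illustrative choices, not API requirements):

```python
import time

def scan_with_delay(queries: list, fetch, delay: float = 1.0) -> dict:
    # 'fetch' is any callable mapping a query string to an overview dict;
    # sleeping between calls keeps bursts from tripping rate limits.
    current = {}
    for i, q in enumerate(queries):
        if i > 0:
            time.sleep(delay)
        current[q] = fetch(q)
    return current
```

In the real pipeline you would pass fetch_ai_overview as the fetch argument.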

Step 3: Diff citations

Compare current citations against baseline to find additions and removals.

Python
def extract_urls(citations: list) -> set:
    urls = set()
    for c in citations:
        if isinstance(c, dict):
            # Citation objects may use 'url' or 'link'; skip entries with neither
            url = c.get('url') or c.get('link')
            if url:
                urls.add(url)
        elif isinstance(c, str):
            urls.add(c)
    return urls

def diff_citations(baseline_entry: dict, current_entry: dict) -> dict:
    base_urls = extract_urls(baseline_entry.get('citations', []))
    curr_urls = extract_urls(current_entry.get('citations', []))
    added = curr_urls - base_urls
    removed = base_urls - curr_urls
    text_changed = baseline_entry.get('text_hash') != current_entry.get('text_hash')
    return {
        'added': list(added),
        'removed': list(removed),
        'text_changed': text_changed,
        'has_changes': bool(added or removed or text_changed),
    }

# Example:
base = {'citations': [{'url': 'https://a.com'}, {'url': 'https://b.com'}], 'text_hash': 'abc'}
curr = {'citations': [{'url': 'https://b.com'}, {'url': 'https://c.com'}], 'text_hash': 'def'}
diff = diff_citations(base, curr)
print(f'Added: {diff["added"]}, Removed: {diff["removed"]}, Text changed: {diff["text_changed"]}')
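The hash comparison above flags any byte-level edit, but the goal stated at the top is to alert when the text changes significantly. A similarity score via the standard-library difflib can separate cosmetic tweaks from real rewrites (the 0.9 threshold is an arbitrary starting point to tune; remember the baseline stores only the first 500 characters, so compare those truncated snapshots):

```python
import difflib

def text_similarity(old: str, new: str) -> float:
    # Ratio in [0, 1]; 1.0 means the strings are identical.
    return difflib.SequenceMatcher(None, old, new).ratio()

def significant_change(old: str, new: str, threshold: float = 0.9) -> bool:
    # Flag only substantial rewrites, not single-character tweaks.
    return text_similarity(old, new) < threshold
```

You could swap this check in for the text_hash comparison inside diff_citations if exact-match alerting proves too noisy.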

Step 4: Alert on changes

Generate alerts for any detected changes and print a summary report.

Python
def generate_alerts(baseline: dict, current: dict) -> list:
    alerts = []
    for query in current:
        if query not in baseline:
            continue
        diff = diff_citations(baseline[query], current[query])
        if diff['has_changes']:
            alert = {
                'query': query,
                'date': current[query]['date'],
                'added_citations': diff['added'],
                'removed_citations': diff['removed'],
                'text_changed': diff['text_changed'],
            }
            alerts.append(alert)
            print(f'CHANGE DETECTED: {query}')
            if diff['added']:
                print(f'  Added: {diff["added"]}')
            if diff['removed']:
                print(f'  Removed: {diff["removed"]}')
            if diff['text_changed']:
                print('  AI Overview text modified')
    if not alerts:
        print('No changes detected in AI Overviews')
    return alerts

alerts = generate_alerts(baseline, current)
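Printing to stdout works for local runs, but a scheduled job usually needs to push notifications somewhere. A sketch of a Slack-style incoming-webhook notifier (the SLACK_WEBHOOK_URL environment variable is a placeholder you would supply, and the message layout is just one option):

```python
import os

SLACK_WEBHOOK_URL = os.environ.get('SLACK_WEBHOOK_URL', '')  # your incoming-webhook URL

def format_alert(alert: dict) -> str:
    # Render one alert record as a readable notification message.
    lines = [f"AI Overview change for '{alert['query']}' ({alert['date']})"]
    if alert['added_citations']:
        lines.append('Added: ' + ', '.join(alert['added_citations']))
    if alert['removed_citations']:
        lines.append('Removed: ' + ', '.join(alert['removed_citations']))
    if alert['text_changed']:
        lines.append('Summary text modified')
    return '\n'.join(lines)

def send_alerts(alerts: list):
    if not SLACK_WEBHOOK_URL:
        return  # no webhook configured; the stdout report from Step 4 still applies
    import requests  # same HTTP client used throughout the tutorial
    for alert in alerts:
        requests.post(SLACK_WEBHOOK_URL, json={'text': format_alert(alert)}, timeout=10)
```

The same format_alert output works equally well for email or any other channel.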

Step 5: Log change history

Maintain a running log of all detected changes for trend analysis over time.

Python
HISTORY_FILE = 'ai_overview_changes.jsonl'

def log_changes(alerts: list):
    if not alerts:
        return
    with open(HISTORY_FILE, 'a') as f:
        for alert in alerts:
            f.write(json.dumps(alert) + '\n')
    print(f'Logged {len(alerts)} changes to {HISTORY_FILE}')

def read_history() -> list:
    entries = []
    try:
        with open(HISTORY_FILE) as f:
            for line in f:
                if line.strip():
                    entries.append(json.loads(line))
    except FileNotFoundError:
        pass
    return entries

# Update baseline after processing
def update_baseline(current: dict):
    with open(BASELINE_FILE, 'w') as f:
        json.dump(current, f, indent=2)
    print('Baseline updated')

log_changes(alerts)
history = read_history()
print(f'Total historical changes: {len(history)}')
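With a few days of history accumulated, the JSONL log supports simple trend analysis, for example counting which queries churn most often (a minimal sketch over the alert records logged above; change_frequency and volatility_report are names of my choosing):

```python
from collections import Counter

def change_frequency(entries: list) -> Counter:
    # Number of logged change events per query.
    return Counter(e['query'] for e in entries)

def volatility_report(entries: list, top: int = 5):
    # Print the queries whose AI Overviews churn most often.
    for query, count in change_frequency(entries).most_common(top):
        print(f'{query}: {count} changes')
```

Run volatility_report(read_history()) to see which of your target queries have the least stable AI Overviews.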

Python Example

Python
import requests, os, hashlib
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

def check_ai_overview(query):
    data = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'platform': 'google', 'query': query}, timeout=15).json()
    ai = data.get('ai_overview', {})
    text = ai.get('text', '') if isinstance(ai, dict) else str(ai)
    citations = ai.get('citations', []) if isinstance(ai, dict) else []
    return {'hash': hashlib.md5(text.encode()).hexdigest(), 'citations': len(citations)}

print(check_ai_overview('best search api 2026'))

JavaScript Example

JavaScript
const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
async function checkAiOverview(query) {
  const r = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST', headers: H, body: JSON.stringify({platform: 'google', query})
  });
  const ai = (await r.json()).ai_overview || {};
  const text = ai.text || '';
  return {length: text.length, citations: (ai.citations || []).length};
}
checkAiOverview('best search api 2026').then(console.log);

Expected Output

An automated daily system that monitors Google AI Overview changes, detects citation additions and removals, tracks text modifications, and maintains a change history log.

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What do I need before starting?

Python 3.8+, the requests library, a Scavio API key from scavio.dev, and a list of target queries to monitor. A Scavio API key gives you 250 free credits per month.

Can I build this on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with other frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.

Start Building

Grab a Scavio API key from scavio.dev, export it as SCAVIO_API_KEY, and schedule the pipeline above to run once a day.