Tutorial

How to Track Geo Metrics Daily with a Search API

Automate daily geo-targeted SERP tracking. Monitor local rankings, map pack positions, and regional SERP differences with a simple Python cron job.

Track geo-targeted SEO metrics daily by querying the search API with location parameters, storing position data over time, and comparing rankings across regions. Local SEO performance varies dramatically by geography, and national-level rank tracking misses these differences entirely. A daily geo-tracking pipeline captures map pack positions, organic rankings, and SERP feature presence for each target location, building the historical data needed to detect regional ranking changes early.

Prerequisites

  • Python 3.8+ installed
  • requests library installed
  • A Scavio API key from scavio.dev
  • Target keywords and locations defined

Walkthrough

Step 1: Define tracking targets

Configure the keywords and geographic locations to monitor daily.

Python
import os, requests, json, datetime

API_KEY = os.environ['SCAVIO_API_KEY']

TARGETS = [
    {'keyword': 'plumber near me', 'locations': ['New York', 'Los Angeles', 'Chicago']},
    {'keyword': 'best coffee shop', 'locations': ['San Francisco', 'Seattle', 'Portland']},
    {'keyword': 'emergency dentist', 'locations': ['Houston', 'Phoenix', 'Dallas']},
]

HISTORY_FILE = 'geo_metrics_history.json'

print(f'Tracking {len(TARGETS)} keywords across {sum(len(t["locations"]) for t in TARGETS)} locations')

Step 2: Query with geo parameters

Search each keyword with location-specific queries to capture regional rankings.

Python
def search_geo(keyword: str, location: str) -> dict:
    query = f'{keyword} in {location}'
    resp = requests.post('https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': API_KEY},
        json={'platform': 'google', 'query': query}, timeout=15)
    resp.raise_for_status()
    data = resp.json()
    return {
        'keyword': keyword,
        'location': location,
        'date': datetime.date.today().isoformat(),
        'organic_count': len(data.get('organic_results', [])),
        'top_3': [{'title': r.get('title', ''), 'url': r.get('link', ''), 'position': i+1}
                  for i, r in enumerate(data.get('organic_results', [])[:3])],
        'has_map_pack': 'local_results' in data or 'map_pack' in data,
        'has_featured_snippet': 'featured_snippet' in data or 'answer_box' in data,
    }

result = search_geo('plumber near me', 'New York')
print(f"{result['location']}: {result['organic_count']} results, map_pack={result['has_map_pack']}")
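A cron-driven scan will occasionally hit timeouts or transient server errors, and one failed request shouldn't kill the whole run. A small retry helper keeps the scan alive; this is a sketch, and the attempt count and backoff values are arbitrary choices, not part of the Scavio API:

```python
import time

def with_retries(fn, attempts=3, backoff=2.0):
    """Call fn(); on failure, wait and retry with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original error
            time.sleep(backoff * (2 ** attempt))

# Usage (assumes search_geo from Step 2 is defined):
# result = with_retries(lambda: search_geo('plumber near me', 'New York'))
```

Wrapping each `search_geo` call this way means a single flaky location is retried a few times, and only a persistent failure aborts the scan.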

Step 3: Run daily scan

Execute the scan across all keyword-location combinations and collect results.

Python
import time

def daily_scan(targets: list) -> list:
    results = []
    for target in targets:
        for location in target['locations']:
            result = search_geo(target['keyword'], location)
            results.append(result)
            print(f"  {result['keyword']} / {result['location']}: {result['organic_count']} results")
            time.sleep(0.3)
    return results

today_results = daily_scan(TARGETS)
print(f'\nScanned {len(today_results)} keyword-location pairs')

Step 4: Store and compare history

Save daily results and compare against previous days to detect ranking changes.

Python
def store_results(results: list):
    history = []
    try:
        with open(HISTORY_FILE) as f:
            history = json.load(f)
    except FileNotFoundError:
        pass
    history.extend(results)
    with open(HISTORY_FILE, 'w') as f:
        json.dump(history, f, indent=2)
    print(f'Stored {len(results)} results (total history: {len(history)})')

def compare_days(keyword: str, location: str) -> dict:
    try:
        with open(HISTORY_FILE) as f:
            history = json.load(f)
    except FileNotFoundError:
        return {}
    entries = [h for h in history if h['keyword'] == keyword and h['location'] == location]
    entries.sort(key=lambda x: x['date'])
    if len(entries) < 2:
        return {'change': 'insufficient data'}
    prev = entries[-2]
    curr = entries[-1]
    return {
        'keyword': keyword,
        'location': location,
        'prev_date': prev['date'],
        'curr_date': curr['date'],
        'organic_change': curr['organic_count'] - prev['organic_count'],
        'map_pack_change': curr['has_map_pack'] != prev['has_map_pack'],
    }

store_results(today_results)
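Beyond organic counts, the most actionable geo signal is a location gaining or losing the map pack. The helper below is a sketch that operates on the result dicts produced by `search_geo`, comparing two full scan days and flagging every flip:

```python
def map_pack_flips(prev_day: list, curr_day: list) -> list:
    """Return (keyword, location, status) for every map-pack gain or loss."""
    prev = {(r['keyword'], r['location']): r['has_map_pack'] for r in prev_day}
    flips = []
    for r in curr_day:
        before = prev.get((r['keyword'], r['location']))
        if before is not None and before != r['has_map_pack']:
            flips.append((r['keyword'], r['location'],
                          'gained map pack' if r['has_map_pack'] else 'lost map pack'))
    return flips
```

Feed it yesterday's and today's entries from the history file to get a short list of locations worth investigating first.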

Step 5: Generate geo report

Produce a summary report of ranking performance across all tracked locations.

Python
def geo_report(targets: list) -> str:
    lines = [f'Geo SEO Report - {datetime.date.today().isoformat()}', '']
    for target in targets:
        lines.append(f'Keyword: {target["keyword"]}')
        for location in target['locations']:
            diff = compare_days(target['keyword'], location)
            if not diff or diff.get('change') == 'insufficient data':
                status = 'NEW'
            else:
                change = diff.get('organic_change', 0)
                status = f'+{change}' if change > 0 else str(change) if change < 0 else 'stable'
            lines.append(f'  {location}: {status}')
        lines.append('')
    report = '\n'.join(lines)
    print(report)
    return report

geo_report(TARGETS)
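To actually run this daily, save Steps 1 through 5 as a single script and schedule it with cron. The paths below (`/opt/geo/geo_tracker.py`, the env file, the log location) are placeholders for illustration, not anything the tutorial prescribes:

```shell
# Edit the crontab with `crontab -e`, then add one line to run the scan
# every day at 06:15 local time. The env file exports SCAVIO_API_KEY.
15 6 * * * . /opt/geo/env && /usr/bin/python3 /opt/geo/geo_tracker.py >> /var/log/geo_tracker.log 2>&1
```

Redirecting stdout and stderr to a log file preserves the per-run output of `daily_scan` and `geo_report`, which is useful when a run fails silently.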

Python Example

Python
import requests, os
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

def geo_rank(keyword, location):
    data = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'platform': 'google', 'query': f'{keyword} in {location}'},
        timeout=15).json()
    top = data.get('organic_results', [])[:3]
    return {'location': location, 'top': [r.get('title', '')[:50] for r in top]}

print(geo_rank('plumber near me', 'New York'))

JavaScript Example

JavaScript
const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
async function geoRank(keyword, location) {
  const r = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST', headers: H,
    body: JSON.stringify({platform: 'google', query: `${keyword} in ${location}`})
  });
  const top = ((await r.json()).organic_results || []).slice(0, 3);
  return {location, top: top.map(res => (res.title || '').slice(0, 50))};
}
geoRank('plumber near me', 'New York').then(console.log);

Expected Output

A daily geo-targeted SERP tracking pipeline that monitors rankings across multiple locations, detects regional changes, and produces summary reports for local SEO optimization.


Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What are the prerequisites?

Python 3.8+, the requests library, a Scavio API key from scavio.dev, and your target keywords and locations defined. A Scavio API key gives you 250 free credits per month.

Can I complete this tutorial on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio work with frameworks like LangChain?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt to your framework of choice.
