How to Audit Your Site Against Google's GEO Guide

Audit your website against Google's GEO guide. Check AEO-first structure, verify no inauthentic mentions, and test AI Overview presence with SERP data.

Google's Generative Engine Optimization guide published in early 2026 changed the SEO playbook. Sites that follow AEO-first structure, avoid inauthentic authority signals, and provide concise entity-rich answers are favored in AI Overviews. This tutorial builds an automated audit script that checks your pages against the key GEO criteria: structured headings, FAQ schema, concise answer blocks, absence of fabricated statistics, and current AI Overview citation status. The audit uses the Scavio API to check whether your pages already appear in AI Overviews and identifies gaps.

Prerequisites

  • Python 3.9+ installed
  • requests and beautifulsoup4 libraries installed
  • A Scavio API key from scavio.dev
  • A list of target URLs or queries to audit
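The two libraries can be installed with pip, and the API key exported as an environment variable. The variable name `SCAVIO_API_KEY` matches what the code in this tutorial reads; replace the placeholder with your own key:

```shell
pip install requests beautifulsoup4
export SCAVIO_API_KEY="your-key-here"
```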

Walkthrough

Step 1: Define the GEO audit checklist

Create a checklist of GEO compliance signals to test against each page. These map directly to Google's published GEO guide recommendations.

Python
import os, requests
from bs4 import BeautifulSoup

SCAVIO_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json'}
URL = 'https://api.scavio.dev/api/v1/search'

GEO_CHECKS = [
    'has_faq_schema',
    'has_concise_answer_block',
    'uses_structured_headings',
    'no_fabricated_stats',
    'has_entity_mentions',
    'appears_in_ai_overview',
]

def fetch_page(url: str) -> BeautifulSoup:
    resp = requests.get(url, timeout=10)
    return BeautifulSoup(resp.text, 'html.parser')

Step 2: Check on-page GEO signals

Parse the page HTML and check for FAQ schema, structured headings, concise answer paragraphs, and entity-rich content.

Python
import re

def audit_page_structure(soup: BeautifulSoup) -> dict:
    results = {}
    # FAQ schema: look for an ld+json block declaring FAQPage
    scripts = soup.find_all('script', type='application/ld+json')
    results['has_faq_schema'] = any('FAQPage' in s.text for s in scripts)
    # Structured headings: at least two h2s and one h3
    h2s = soup.find_all('h2')
    h3s = soup.find_all('h3')
    results['uses_structured_headings'] = len(h2s) >= 2 and len(h3s) >= 1
    # Concise answer blocks: a paragraph under 300 characters following an h2
    concise_blocks = 0
    for h2 in h2s:
        next_p = h2.find_next_sibling('p')
        if next_p and len(next_p.get_text()) < 300:
            concise_blocks += 1
    results['has_concise_answer_block'] = concise_blocks >= 1
    # Fabricated stats: flag unsourced "N% of <group>" claims
    text = soup.get_text()
    suspicious = re.findall(r'\d{1,3}% of (?:users|companies|developers|people)', text)
    results['no_fabricated_stats'] = len(suspicious) == 0
    # Entity mentions: named products and tools ("Foo API", "Bar SDK", ...)
    entities = len(re.findall(r'[A-Z][a-z]+ (?:API|SDK|framework|platform|tool)', text))
    results['has_entity_mentions'] = entities >= 2
    return results
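To see what the fabricated-stats heuristic flags and what it lets through, here is a standalone sketch; the sample sentences are invented for illustration:

```python
import re

PATTERN = r'\d{1,3}% of (?:users|companies|developers|people)'

# An unsourced "N% of <group>" claim matches the pattern.
flagged = re.findall(PATTERN, '87% of developers prefer concise answers.')
# A spelled-out percentage does not, so it passes the check.
clean = re.findall(PATTERN, 'Response times improved by 40 percent in our tests.')

print(flagged)  # ['87% of developers']
print(clean)    # []
```

Note this is a heuristic: it catches a common pattern of fabricated statistics but will also flag legitimate, sourced claims in the same form, so treat a FAIL as a prompt to verify, not proof of fabrication.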

Step 3: Check AI Overview presence via SERP

Search for your target query and check if your domain appears in the AI Overview section of the SERP results.

Python
def check_ai_overview(query: str, target_domain: str) -> dict:
    resp = requests.post(URL, headers=H,
        json={'query': query, 'country_code': 'us', 'include_ai_overview': True},
        timeout=30)
    resp.raise_for_status()
    data = resp.json()
    ai_overview = data.get('ai_overview', {})
    ai_text = ai_overview.get('text', '')
    ai_sources = ai_overview.get('sources', [])
    cited = any(target_domain in s.get('link', '') for s in ai_sources)
    return {
        'appears_in_ai_overview': cited,
        'ai_overview_present': bool(ai_text),
        'ai_sources_count': len(ai_sources),
    }
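The parsing logic in `check_ai_overview` can be exercised without an API call by feeding it a hand-built payload. The field names (`ai_overview`, `text`, `sources`, `link`) mirror what the function reads from the response; the sample data below is invented:

```python
# Hand-built payload mimicking the fields check_ai_overview reads
sample = {
    'ai_overview': {
        'text': 'A short AI-generated summary...',
        'sources': [
            {'title': 'Scavio docs', 'link': 'https://scavio.dev/docs'},
            {'title': 'Other site', 'link': 'https://example.com/post'},
        ],
    },
}

ai = sample.get('ai_overview', {})
sources = ai.get('sources', [])
cited = any('scavio.dev' in s.get('link', '') for s in sources)
print(cited, len(sources))  # True 2
```

Using `.get()` with defaults at every level means the same code handles SERPs where no AI Overview was returned at all.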

Step 4: Run the full audit and generate a report

Combine all checks into a single audit function that scores each page and outputs a pass/fail report.

Python
def audit_url(page_url: str, query: str, domain: str) -> dict:
    soup = fetch_page(page_url)
    structure = audit_page_structure(soup)
    ai_check = check_ai_overview(query, domain)
    structure.update(ai_check)
    passed = sum(1 for v in structure.values() if v is True)
    total = len(GEO_CHECKS)
    structure['score'] = f'{passed}/{total}'
    structure['url'] = page_url
    print(f'GEO Audit: {page_url}')
    for check in GEO_CHECKS:
        status = 'PASS' if structure.get(check) else 'FAIL'
        print(f'  [{status}] {check}')
    print(f'  Score: {passed}/{total}')
    return structure

audit_url('https://scavio.dev/tutorials/how-to-fetch-google-search-results',
          'fetch google search results api', 'scavio.dev')
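When auditing many pages, the dicts returned by `audit_url` can be sorted by score to prioritize fixes. A minimal sketch with invented results (the `score` field is the 'passed/total' string the audit produces):

```python
# Hypothetical audit results, as returned by audit_url
results = [
    {'url': 'https://example.com/a', 'score': '5/6'},
    {'url': 'https://example.com/b', 'score': '2/6'},
    {'url': 'https://example.com/c', 'score': '4/6'},
]

def passed(result):
    # Extract the number of passing checks from a 'passed/total' string
    return int(result['score'].split('/')[0])

worst_first = sorted(results, key=passed)
for r in worst_first:
    print(r['score'], r['url'])
```

The lowest-scoring pages surface first, which is usually where GEO fixes pay off most.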

Python Example

Python
import os, requests
from bs4 import BeautifulSoup
import re

SCAVIO_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json'}

def geo_audit(page_url, query, domain):
    # Fetch page
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, 'html.parser')
    # On-page checks
    scripts = soup.find_all('script', type='application/ld+json')
    faq = any('FAQPage' in s.text for s in scripts)
    h2s = soup.find_all('h2')
    structured = len(h2s) >= 2
    text = soup.get_text()
    no_fake = len(re.findall(r'\d+% of (?:users|companies)', text)) == 0
    # AI Overview check
    resp = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'query': query, 'country_code': 'us', 'include_ai_overview': True},
        timeout=30)
    ai = resp.json().get('ai_overview', {})
    cited = any(domain in s.get('link', '') for s in ai.get('sources', []))
    checks = {'faq_schema': faq, 'structured_headings': structured,
              'no_fabricated_stats': no_fake, 'ai_overview_cited': cited}
    passed = sum(1 for v in checks.values() if v)
    print(f'GEO Audit: {page_url}')
    for k, v in checks.items():
        print(f'  [{"PASS" if v else "FAIL"}] {k}')
    print(f'Score: {passed}/{len(checks)}')

geo_audit('https://scavio.dev', 'scavio search api', 'scavio.dev')

JavaScript Example

JavaScript
const SCAVIO_KEY = process.env.SCAVIO_API_KEY;

async function geoAudit(query, domain) {
  const resp = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST',
    headers: { 'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, country_code: 'us', include_ai_overview: true })
  });
  const data = await resp.json();
  const ai = data.ai_overview || {};
  const sources = ai.sources || [];
  const cited = sources.some(s => (s.link || '').includes(domain));
  const organic = data.organic_results || [];
  const inTop10 = organic.some(r => (r.link || '').includes(domain));
  console.log(`GEO Audit for: ${query}`);
  console.log(`  AI Overview present: ${!!ai.text}`);
  console.log(`  Cited in AI Overview: ${cited}`);
  console.log(`  In organic top 10: ${inTop10}`);
  console.log(`  AI sources: ${sources.length}`);
}

geoAudit('search api for developers', 'scavio.dev');

Expected Output

Text
GEO Audit: https://scavio.dev/tutorials/how-to-fetch-google-search-results
  [PASS] has_faq_schema
  [PASS] has_concise_answer_block
  [PASS] uses_structured_headings
  [PASS] no_fabricated_stats
  [PASS] has_entity_mentions
  [FAIL] appears_in_ai_overview
  Score: 5/6

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What do I need before starting?

You need Python 3.9+ with the requests and beautifulsoup4 libraries installed, a Scavio API key from scavio.dev, and a list of target URLs or queries to audit. A Scavio API key gives you 250 free credits per month.

Can I complete this tutorial on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with frameworks like LangChain?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
