How to Combine Surfer SEO with a Search API

Enhance Surfer SEO content workflows with real-time SERP data. Pull competitor content, featured snippets, and People Also Ask for better optimization.

Combine Surfer SEO with a search API to pull real-time SERP data alongside Surfer's content recommendations. Surfer SEO provides content structure and keyword density guidance, but it does not show you the live SERP landscape as you write. By querying the search API for your target keyword, you can see current featured snippets, People Also Ask questions, and competitor titles in real time. This lets you align your content with both Surfer's NLP-based recommendations and the actual search results page.

Prerequisites

  • Python 3.8+ installed
  • requests library installed
  • A Scavio API key from scavio.dev
  • A Surfer SEO account (Essential $99/mo or higher)

Walkthrough

Step 1: Query SERP for target keyword

Pull the current Google SERP for the keyword you are optimizing in Surfer SEO.

Python
import os, requests

API_KEY = os.environ['SCAVIO_API_KEY']

def get_serp(keyword: str) -> dict:
    resp = requests.post('https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': API_KEY},
        json={'platform': 'google', 'query': keyword}, timeout=15)
    resp.raise_for_status()  # fail fast on auth or quota errors
    return resp.json()

serp = get_serp('best project management tools 2026')
print(f"Organic results: {len(serp.get('organic_results', []))}")
print(f"People Also Ask: {len(serp.get('people_also_ask', []))}")
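SERP endpoints can rate-limit or time out under load, so it helps to wrap the call in a small retry. The `with_retries` helper below is a generic sketch (not part of the Scavio API) that retries any callable with exponential backoff:

```python
import time

def with_retries(fn, attempts=3, backoff=1.0):
    """Call fn(), retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original error
            time.sleep(backoff * (2 ** attempt))

# Usage with get_serp from above:
# serp = with_retries(lambda: get_serp('best project management tools 2026'))
```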

Step 2: Extract competitor content signals

Analyze the top-ranking pages to understand their title patterns, word count signals, and content angles.

Python
def analyze_competitors(serp: dict) -> list:
    competitors = []
    for r in serp.get('organic_results', [])[:10]:
        competitors.append({
            'position': r.get('position', 0),
            'title': r.get('title', ''),
            'url': r.get('link', ''),
            'snippet_length': len(r.get('snippet', '')),
            'has_date': any(str(y) in r.get('title', '') for y in [2025, 2026]),
            'has_number': any(c.isdigit() for c in r.get('title', '')),
        })
    return competitors

comps = analyze_competitors(serp)
for c in comps[:5]:
    print(f"#{c['position']}: {c['title'][:55]} (date={c['has_date']})")
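To turn the per-result fields into an actionable summary, you can aggregate how often dates and numbers appear across the top titles. This helper is an illustration (not part of the tutorial's API) that works on the dicts returned by `analyze_competitors`:

```python
def title_patterns(competitors: list) -> dict:
    """Summarize how many top-ranking titles use a year or a number."""
    n = max(len(competitors), 1)  # avoid division by zero on an empty SERP
    return {
        'pct_with_date': round(100 * sum(c['has_date'] for c in competitors) / n),
        'pct_with_number': round(100 * sum(c['has_number'] for c in competitors) / n),
    }

# If most top titles carry a year, titling yours "... in 2026" matches the SERP.
```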

Step 3: Capture People Also Ask

Pull PAA questions to incorporate into your Surfer content as H2 or H3 subheadings.

Python
def get_paa(serp: dict) -> list:
    paa = serp.get('people_also_ask', [])
    questions = []
    for item in paa:
        if isinstance(item, dict):
            questions.append(item.get('question', ''))
        elif isinstance(item, str):
            questions.append(item)
    return questions

questions = get_paa(serp)
print('People Also Ask questions to address in your content:')
for q in questions:
    print(f'  - {q}')

Step 4: Extract featured snippet format

Determine the current featured snippet format so you can structure your content to compete.

Python
def analyze_snippet(serp: dict) -> dict:
    snippet = serp.get('featured_snippet', serp.get('answer_box', {}))
    if not snippet or not isinstance(snippet, dict):
        return {'type': 'none', 'exists': False}
    snippet_text = snippet.get('snippet', snippet.get('answer', ''))
    has_list = '<li>' in str(snippet) or isinstance(snippet.get('list'), list)
    has_table = '<table>' in str(snippet) or isinstance(snippet.get('table'), list)
    return {
        'exists': True,
        'type': 'list' if has_list else 'table' if has_table else 'paragraph',
        'text_length': len(str(snippet_text)),
        'source': snippet.get('link', snippet.get('source', '')),
    }

fs = analyze_snippet(serp)
print(f"Featured snippet: {fs['type']} ({fs.get('text_length', 0)} chars)")
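Once you know the snippet format, you can sanity-check a draft answer against it. The 40-to-60-word range below is a commonly cited heuristic for paragraph snippets, not a rule published by Google, and the list check is similarly rough:

```python
def check_snippet_fit(draft: str, snippet_type: str) -> dict:
    """Check a draft answer against rough featured-snippet conventions (heuristic)."""
    words = len(draft.split())
    if snippet_type == 'paragraph':
        fits = 40 <= words <= 60  # commonly cited paragraph-snippet length
    elif snippet_type == 'list':
        fits = draft.count('\n') >= 2  # a list answer needs several items
    else:
        fits = True  # no simple length rule for tables or 'none'
    return {'word_count': words, 'fits': fits}
```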

Step 5: Generate content brief

Combine SERP insights with Surfer SEO recommendations into a single content brief.

Python
def generate_brief(keyword: str) -> dict:
    serp = get_serp(keyword)
    comps = analyze_competitors(serp)
    paa = get_paa(serp)
    snippet = analyze_snippet(serp)
    titles_with_dates = sum(1 for c in comps if c['has_date'])
    brief = {
        'keyword': keyword,
        'competitor_count': len(comps),
        'titles_with_year': titles_with_dates,
        'paa_questions': paa,
        'featured_snippet': snippet,
        'top_3_titles': [c['title'] for c in comps[:3]],
        'recommendations': [],
    }
    if titles_with_dates > 3:
        brief['recommendations'].append('Include 2026 in your title')
    if snippet['exists']:
        brief['recommendations'].append(f"Target a {snippet['type']} featured snippet")
    if paa:
        brief['recommendations'].append(f'Address {len(paa)} PAA questions as subheadings')
    return brief

brief = generate_brief('best project management tools 2026')
for rec in brief['recommendations']:
    print(f'  > {rec}')
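To move the brief into Surfer (or any editor), you can render it as Markdown. The layout below is an arbitrary choice for manual pasting, not a Surfer import format:

```python
def brief_to_markdown(brief: dict) -> str:
    """Render the dict from generate_brief as a Markdown brief for manual pasting."""
    lines = ['# Brief: ' + brief['keyword'], '', '## Top competitor titles']
    lines += ['- ' + t for t in brief['top_3_titles']]
    lines += ['', '## Questions to answer']
    lines += ['- ' + q for q in brief['paa_questions']]
    lines += ['', '## Recommendations']
    lines += ['- ' + r for r in brief['recommendations']]
    return '\n'.join(lines)

# print(brief_to_markdown(brief))  # brief from the step above
```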

Python Example

Python
import requests, os
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

def serp_brief(keyword):
    data = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'platform': 'google', 'query': keyword}, timeout=15).json()
    titles = [r.get('title', '') for r in data.get('organic_results', [])[:5]]
    paa = [q.get('question', '') if isinstance(q, dict) else q for q in data.get('people_also_ask', [])]
    return {'titles': titles, 'paa': paa}

print(serp_brief('best project management tools 2026'))

JavaScript Example

JavaScript
const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
async function serpBrief(keyword) {
  const r = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST', headers: H,
    body: JSON.stringify({platform: 'google', query: keyword})
  });
  const data = await r.json();
  return {
    titles: (data.organic_results || []).slice(0, 5).map(r => r.title),
    paa: (data.people_also_ask || []).map(q => q.question || q)
  };
}
serpBrief('best project management tools 2026').then(console.log);

Expected Output

A content brief combining Surfer SEO recommendations with live SERP data, including competitor titles, PAA questions, and featured snippet analysis.

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What do I need before starting?

Python 3.8+ with the requests library (or a JavaScript runtime), a Scavio API key from scavio.dev (250 free credits per month), and a Surfer SEO account (Essential, $99/mo or higher).

Can I complete this on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio work with my framework?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
