Tutorial

How to Build an SEO Keyword Research Agent for Agencies

Build an AI agent that performs keyword research across Google, Reddit, and YouTube for agency clients using one search API.

SEO agencies run keyword research for dozens of clients. An AI agent can automate the repetitive parts: querying Google for SERP features, checking Reddit for topic demand, and scanning YouTube for content gaps. This tutorial builds an agent that produces a keyword brief from a single seed term.

Prerequisites

  • Python 3.10+
  • A Scavio API key
  • Basic understanding of SEO keyword research

Walkthrough

Step 1: Set up the keyword research function

Query Google for SERP data including People Also Ask questions.

Python
import requests, os
from urllib.parse import urlparse

H = {'x-api-key': os.environ['SCAVIO_API_KEY'], 'Content-Type': 'application/json'}
URL = 'https://api.scavio.dev/api/v1/search'

def keyword_serp(keyword: str) -> dict:
    resp = requests.post(URL, headers=H,
        json={'platform': 'google', 'query': keyword}, timeout=10)
    resp.raise_for_status()  # fail fast on auth, quota, or server errors
    data = resp.json()
    return {
        'organic_count': len(data.get('organic', [])),
        'paa': [q.get('question', '') for q in data.get('people_also_ask', [])],
        'has_ai_overview': bool(data.get('ai_overview')),
        # urlparse handles missing or malformed links; split('/')[2] would crash on them
        'top_domains': [urlparse(r.get('link', '')).netloc
                        for r in data.get('organic', [])[:5] if r.get('link')],
    }
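Search APIs can return transient timeouts under load. If that happens in production, a small retry wrapper around calls like keyword_serp keeps the agent resilient. This is an illustrative sketch, not part of any Scavio SDK:

```python
import time

def with_retries(call, attempts=3, base_delay=1.0):
    """Run a zero-argument callable, retrying on failure with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original error
            time.sleep(base_delay * 2 ** attempt)  # waits 1s, 2s, 4s, ...

# Usage: serp = with_retries(lambda: keyword_serp('best crm for startups 2026'))
```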

Step 2: Add Reddit demand signal

Check if the keyword has active Reddit discussions (indicates real user demand).

Python
def reddit_demand(keyword: str) -> dict:
    resp = requests.post(URL, headers=H,
        json={'platform': 'reddit', 'query': keyword}, timeout=10)
    resp.raise_for_status()
    threads = resp.json().get('organic', [])
    return {
        'thread_count': len(threads),
        'top_threads': [{'title': t.get('title', ''), 'score': t.get('score', 0)}
                        for t in threads[:5]],
        'has_demand': len(threads) >= 3,  # heuristic: 3+ active threads signals real interest
    }

Step 3: Check YouTube content gap

Search YouTube to see if video content exists for this keyword.

Python
def youtube_gap(keyword: str) -> dict:
    resp = requests.post(URL, headers=H,
        json={'platform': 'youtube', 'query': keyword}, timeout=10)
    resp.raise_for_status()
    videos = resp.json().get('organic', [])
    return {
        'video_count': len(videos),
        # sorted() makes the channel list deterministic (set order is not)
        'top_channels': sorted(set(v.get('channel', '') for v in videos[:5] if v.get('channel'))),
        'content_gap': len(videos) < 5,  # heuristic: fewer than 5 videos = room to compete
    }

Step 4: Combine into keyword brief

Run all three checks and produce a structured brief for the client.

Python
def keyword_brief(seed: str) -> dict:
    serp = keyword_serp(seed)
    reddit = reddit_demand(seed)
    youtube = youtube_gap(seed)
    
    if reddit['has_demand'] and youtube['content_gap']:
        priority = 'high'
    elif reddit['has_demand'] or serp['has_ai_overview']:
        priority = 'medium'
    else:
        priority = 'low'
    
    return {
        'keyword': seed,
        'priority': priority,
        'serp_analysis': serp,
        'reddit_demand': reddit,
        'youtube_gap': youtube,
        'paa_opportunities': serp['paa'],
        'recommendation': f"Priority: {priority}. {'Strong Reddit demand + YouTube gap = opportunity.' if priority == 'high' else 'Monitor for changes.'}"
    }

brief = keyword_brief('best crm for startups 2026')
print(f"Priority: {brief['priority']}")
print(f"PAA questions: {len(brief['paa_opportunities'])}")
print(f"Reddit threads: {brief['reddit_demand']['thread_count']}")
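The priority rule inside keyword_brief is worth isolating so you can unit-test it without spending API credits. This standalone version is an illustrative refactor that mirrors the same logic:

```python
def classify_priority(has_demand: bool, content_gap: bool, has_ai_overview: bool) -> str:
    """Same decision rule as keyword_brief, but testable without live API calls."""
    if has_demand and content_gap:
        return 'high'    # proven Reddit demand plus little YouTube competition
    if has_demand or has_ai_overview:
        return 'medium'  # one positive signal
    return 'low'
```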

Step 5: Scale for multiple keywords

Process a list of seed keywords for a client and output a ranked brief.

Python
def client_keyword_audit(keywords: list, client_name: str) -> dict:
    briefs = [keyword_brief(kw) for kw in keywords]
    briefs.sort(key=lambda b: {'high': 3, 'medium': 2, 'low': 1}[b['priority']], reverse=True)
    
    return {
        'client': client_name,
        'total_keywords': len(briefs),
        'high_priority': [b for b in briefs if b['priority'] == 'high'],
        'medium_priority': [b for b in briefs if b['priority'] == 'medium'],
        'low_priority': [b for b in briefs if b['priority'] == 'low'],
        'total_credits_used': len(keywords) * 3,  # 3 queries per keyword
    }

# Usage:
audit = client_keyword_audit(
    ['best crm 2026', 'crm for startups', 'hubspot alternatives'],
    'Acme Corp'
)
print(f"High priority: {len(audit['high_priority'])} keywords")
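For client deliverables you will usually want the audit as readable text rather than a raw dict. A minimal formatter, assuming the audit structure returned by client_keyword_audit above, might look like:

```python
def format_audit(audit: dict) -> str:
    """Render a client_keyword_audit result as a plain-text summary."""
    lines = [f"Keyword audit for {audit['client']} "
             f"({audit['total_keywords']} keywords, {audit['total_credits_used']} credits)"]
    for tier in ('high_priority', 'medium_priority', 'low_priority'):
        label = tier.split('_')[0].upper()
        for brief in audit[tier]:
            lines.append(f"  [{label}] {brief['keyword']}")
    return '\n'.join(lines)
```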

Python Example

Python
import requests, os
H = {'x-api-key': os.environ['SCAVIO_API_KEY'], 'Content-Type': 'application/json'}

def quick_brief(keyword):
    serp = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'platform': 'google', 'query': keyword}, timeout=10).json()
    reddit = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'platform': 'reddit', 'query': keyword}, timeout=10).json()
    return {'paa': len(serp.get('people_also_ask', [])),
            'reddit_threads': len(reddit.get('organic', []))}

JavaScript Example

JavaScript
async function quickBrief(keyword) {
  const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
  // One helper per platform call keeps the fetch boilerplate in a single place
  const search = (platform) =>
    fetch('https://api.scavio.dev/api/v1/search', {
      method: 'POST',
      headers: H,
      body: JSON.stringify({platform, query: keyword}),
    }).then(r => r.json());
  const [serp, reddit] = await Promise.all([search('google'), search('reddit')]);
  return {paa: (serp.people_also_ask || []).length, redditThreads: (reddit.organic || []).length};
}

Expected Output

A priority-ranked keyword brief combining Google SERP data, Reddit demand signals, and YouTube content gaps.
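For reference, a keyword_brief result has the shape below. The values are illustrative placeholders, not real API output; actual counts and strings depend on live search results:

```json
{
  "keyword": "best crm for startups 2026",
  "priority": "high",
  "serp_analysis": {
    "organic_count": 10,
    "paa": ["What is the best CRM for a small startup?"],
    "has_ai_overview": true,
    "top_domains": ["example.com"]
  },
  "reddit_demand": {"thread_count": 7, "top_threads": [], "has_demand": true},
  "youtube_gap": {"video_count": 3, "top_channels": [], "content_gap": true},
  "paa_opportunities": ["What is the best CRM for a small startup?"],
  "recommendation": "Priority: high. Strong Reddit demand + YouTube gap = opportunity."
}
```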

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (the free tier works) and a working Python or JavaScript environment.

What are the prerequisites?

Python 3.10+, a Scavio API key, and a basic understanding of SEO keyword research. A Scavio API key gives you 500 free credits per month.

Can I complete this tutorial on the free tier?

Yes. The free tier includes 500 credits per month, which is more than enough to complete this tutorial and prototype a working solution.
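Since each brief costs three queries (Google, Reddit, YouTube, per the 3-queries-per-keyword figure in Step 5, assuming one credit per query), you can budget the free tier up front:

```python
FREE_TIER_CREDITS = 500
QUERIES_PER_KEYWORD = 3  # Google + Reddit + YouTube

def keywords_per_month(credits: int = FREE_TIER_CREDITS) -> int:
    """How many full keyword briefs fit in a monthly credit budget."""
    return credits // QUERIES_PER_KEYWORD

print(keywords_per_month())  # 166
```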

Does Scavio integrate with agent frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
