A post on r/AiAutomations described building an AI agent that produces daily competitor reports. It's one of the highest-value automations to build because the output is immediately actionable: pricing changes, new features, and sentiment shifts land in your inbox every morning. This tutorial walks through the full pipeline.
Prerequisites
- Scavio API key
- LLM API key (Anthropic or OpenAI)
- Email sending (SMTP or SendGrid)
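All of the snippets below read credentials from environment variables. A setup sketch — the key values are placeholders, and the two SMTP variables are an assumption for the email step (SendGrid users would set an API key instead):

```shell
export SCAVIO_API_KEY="..."        # used by every Scavio request below
export ANTHROPIC_API_KEY="..."     # picked up automatically by the Anthropic SDK
export SMTP_USER="reports@example.com"  # assumption: SMTP credentials for Step 5
export SMTP_PASS="..."
```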
Walkthrough
Step 1: Define competitor queries
List the competitors and keywords to monitor.
competitors = [
    {'name': 'CompetitorA', 'queries': ['CompetitorA pricing', 'CompetitorA reviews']},
    {'name': 'CompetitorB', 'queries': ['CompetitorB vs alternatives', 'CompetitorB launch']},
]
Step 2: Fetch SERP data for each competitor
Run Scavio searches for each query.
import requests, os
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}
def search(query):
    return requests.post('https://api.scavio.dev/api/v1/search',
                         headers=H,
                         json={'platform': 'google', 'query': query}).json()
Step 3: Check Reddit mentions
Search Reddit for competitor discussions.
def reddit_mentions(competitor_name):
    return requests.post('https://api.scavio.dev/api/v1/search',
                         headers=H,
                         json={'platform': 'reddit', 'query': competitor_name, 'sort': 'new'}).json()
Step 4: Generate digest with LLM
Summarize changes and new mentions.
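Reporting "changes" and "shifts" implies comparing today's results against the previous run, which the original post doesn't show. A minimal sketch that persists a local JSON snapshot between runs — the filename and the name-to-URL-list structure are illustrative assumptions:

```python
import json
import os

SNAPSHOT = 'last_run.json'  # assumption: local file holding the previous day's results

def diff_against_last_run(current):
    """Return items that are new since the previous run, then save today's snapshot.

    `current` maps competitor name -> list of result URLs from today's searches.
    """
    previous = {}
    if os.path.exists(SNAPSHOT):
        with open(SNAPSHOT) as f:
            previous = json.load(f)
    # Anything in today's results that wasn't in yesterday's is "new"
    new_items = {name: sorted(set(urls) - set(previous.get(name, [])))
                 for name, urls in current.items()}
    with open(SNAPSHOT, 'w') as f:
        json.dump(current, f)
    return new_items
```

Feeding only `new_items` (plus anything the LLM should re-examine) into the digest prompt keeps the token count down and makes day-over-day deltas explicit.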
from anthropic import Anthropic

client = Anthropic()

def generate_digest(serp_data, reddit_data):
    prompt = (f'Summarize competitor intelligence. Focus on pricing changes, '
              f'new features, and sentiment shifts.\n\nSERP: {serp_data}\nReddit: {reddit_data}')
    return client.messages.create(model='claude-sonnet-4-6', max_tokens=500,
                                  messages=[{'role': 'user', 'content': prompt}]).content[0].text
Step 5: Send email and schedule
Email the digest and set up a daily cron.
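The sending code isn't shown in the original post. A minimal sketch using Python's standard smtplib — the SMTP host, port, and credential environment variables are assumptions; SendGrid's API works just as well here:

```python
import os
import smtplib
from email.message import EmailMessage

def send_digest(digest_text, recipient):
    # Assumption: SMTP credentials are provided via SMTP_USER / SMTP_PASS env vars.
    msg = EmailMessage()
    msg['Subject'] = 'Daily competitor digest'
    msg['From'] = os.environ['SMTP_USER']
    msg['To'] = recipient
    msg.set_content(digest_text)
    # Assumption: replace with your provider's SMTP host; 587 is the usual STARTTLS port.
    with smtplib.SMTP('smtp.example.com', 587) as server:
        server.starttls()
        server.login(os.environ['SMTP_USER'], os.environ['SMTP_PASS'])
        server.send_message(msg)
```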
# crontab: 0 8 * * * python competitor_digest.py
# Sends one curated daily report, not real-time alerts
# Quality over quantity — no Slack spam
Python Example
import requests, os
from anthropic import Anthropic
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}
client = Anthropic()
def daily_digest(competitors):
    all_data = []
    for comp in competitors:
        # One SERP search per query
        serp_results = [requests.post('https://api.scavio.dev/api/v1/search', headers=H,
                                      json={'platform': 'google', 'query': q}).json()
                        for q in comp['queries']]
        # One Reddit search per competitor (not per query, to avoid duplicate fetches)
        reddit = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
                               json={'platform': 'reddit', 'query': comp['name']}).json()
        all_data.append({'competitor': comp['name'], 'serp': serp_results, 'reddit': reddit})
    return client.messages.create(model='claude-sonnet-4-6', max_tokens=800,
                                  messages=[{'role': 'user', 'content': f'Daily competitor digest:\n{all_data}'}]).content[0].text
JavaScript Example
const resp = await fetch('https://api.scavio.dev/api/v1/search', {
  method: 'POST',
  headers: {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'},
  body: JSON.stringify({platform: 'google', query: 'CompetitorA pricing 2026'})
});
Expected Output
A daily email digest containing competitor SERP changes, new Reddit mentions, and an LLM-generated analysis, delivered by the cron job each morning.