An r/MarketingandAI thread complained that AI marketing agents 'reorganize the same limitations.' The fix is to separate the data layer from the orchestration layer. This tutorial wires Scavio in as the data layer for a marketing research agent.
Prerequisites
- Python 3.10+
- Scavio API key
- Anthropic or OpenAI key
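Before running anything, it helps to fail fast on missing credentials. A minimal sketch, assuming the environment variable names used later in this tutorial (`SCAVIO_API_KEY` for Scavio, `ANTHROPIC_API_KEY` for the Anthropic SDK's default):

```python
import os

REQUIRED_KEYS = ('SCAVIO_API_KEY', 'ANTHROPIC_API_KEY')

def missing_keys(env, required=REQUIRED_KEYS):
    # Return the names of required credentials that are absent or empty.
    return [name for name in required if not env.get(name)]

# Example: missing_keys(os.environ) is empty once both keys are exported.
```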
Walkthrough
Step 1: Define competitor + topic set
Pick 5-10 competitors and 5-15 topics to track.
```python
COMPETITORS = ['firecrawl', 'tavily', 'serper']
TOPICS = ['mcp server', 'ai agent search', 'web scraping 2026']
```

Step 2: Daily competitor digest
Pull SERP, Reddit, and YouTube results for each competitor.
```python
import os

import requests

API_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': API_KEY}

def competitor_digest(name):
    return {
        'serp': requests.post('https://api.scavio.dev/api/v1/search', headers=H, json={'query': name}).json(),
        'reddit': requests.post('https://api.scavio.dev/api/v1/reddit/search', headers=H, json={'query': name}).json(),
        'youtube': requests.post('https://api.scavio.dev/api/v1/youtube/search', headers=H, json={'query': name}).json(),
    }
```

Step 3: Topic-level visibility tracking
Where does each competitor show up?
```python
def topic_share(topic):
    r = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
                      json={'query': topic, 'include_ai_overview': True}).json()
    return r
```

Step 4: LLM brief composition
Claude turns the raw data into a 200-word brief.
```python
import anthropic

client = anthropic.Anthropic()

def brief(competitor, data):
    msg = client.messages.create(
        model='claude-sonnet-4-6',
        max_tokens=400,
        messages=[{'role': 'user', 'content': f'200-word competitor brief on {competitor}: {str(data)[:6000]}'}],
    )
    return msg.content[0].text
```

Step 5: Schedule weekly
Use an n8n cron node or a simple crontab entry.
```
0 8 * * 1 /usr/bin/python /path/to/marketing.py
```

Python Example
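Putting the steps together: the fetch functions return raw JSON, so the weekly job reduces to aggregation. A hedged sketch of the pure aggregation side — `visibility_share` and the report shape are illustrative, not part of the Scavio API, and the code assumes a hypothetical SERP response shape of `{'results': [{'url': ...}, ...]}`:

```python
def visibility_share(serp_json, competitors):
    # Fraction of organic results mentioning each competitor.
    # Assumes the (hypothetical) response shape {'results': [{'url': ...}, ...]}.
    results = serp_json.get('results', [])
    if not results:
        return {name: 0.0 for name in competitors}
    return {
        name: sum(name in r.get('url', '') for r in results) / len(results)
        for name in competitors
    }

def weekly_report(briefs, shares):
    # briefs: {competitor: brief text}; shares: {topic: {competitor: fraction}}.
    lines = []
    for competitor, text in briefs.items():
        lines.append(f'## {competitor}\n{text}')
    for topic, share in shares.items():
        ranked = sorted(share.items(), key=lambda kv: kv[1], reverse=True)
        lines.append(f'### {topic}: ' + ', '.join(f'{n} {s:.0%}' for n, s in ranked))
    return '\n\n'.join(lines)
```

In the cron-scheduled script, `briefs` comes from `brief()` over `competitor_digest()` output and `shares` from `visibility_share(topic_share(t), COMPETITORS)` per topic.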
A weekly run of 10 competitors × 3 surfaces + 15 topics = 45 credits = $0.19. Negligible at the Project tier.

JavaScript Example
The TypeScript version uses the Anthropic SDK.

Expected Output
A weekly digest with a 200-word brief per competitor and topic-level visibility share, ready to drop into Slack or email.
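For the Slack drop, a minimal sketch using a Slack incoming webhook. The webhook URL is yours to supply, `format_for_slack` is an illustrative helper (the 3,500-character cap is a conservative assumption for Slack message limits), and `urllib` is used here to keep the snippet dependency-free:

```python
import json
import urllib.request

def format_for_slack(digest_text, max_len=3500):
    # Slack rejects or truncates very long messages; keep the payload small.
    return {'text': digest_text[:max_len]}

def post_to_slack(webhook_url, digest_text):
    # POST the weekly digest to a Slack incoming webhook; returns HTTP status.
    payload = json.dumps(format_for_slack(digest_text)).encode('utf-8')
    req = urllib.request.Request(
        webhook_url,
        data=payload,
        headers={'Content-Type': 'application/json'},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```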