Independent traders drown in data. The right agent runs SERP, Reddit, and YouTube searches in parallel for each ticker and returns a typed brief. This tutorial wires Scavio and Claude into a daily-brief agent for a configurable watchlist.
Prerequisites
- Python 3.10+
- Scavio API key
- Anthropic API key
Walkthrough
Step 1: Define watchlist
List of tickers to brief daily.
```python
# Tickers to brief every morning; edit to taste.
WATCHLIST = ['AAPL', 'NVDA', 'TSLA', 'AMZN', 'GOOG']
```
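To make the watchlist configurable without code edits, one option is to read it from an environment variable and fall back to the default above. This is a minimal sketch; the `WATCHLIST` environment variable name is an assumption for illustration, not a Scavio convention:

```python
import os

DEFAULT_WATCHLIST = ['AAPL', 'NVDA', 'TSLA', 'AMZN', 'GOOG']

# Sketch: override the default via a comma-separated env var, e.g. WATCHLIST="AAPL,MSFT".
WATCHLIST = [t.strip().upper()
             for t in os.environ.get('WATCHLIST', '').split(',')
             if t.strip()] or DEFAULT_WATCHLIST
```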
Step 2: Parallel multi-surface fetch
Query SERP, Reddit, and YouTube concurrently for each ticker.
```python
import os, asyncio, aiohttp

API_KEY = os.environ['SCAVIO_API_KEY']

async def brief(session, ticker):
    headers = {'x-api-key': API_KEY}
    base = 'https://api.scavio.dev/api/v1/'
    # Fire all three surface queries at once; aiohttp request objects are awaitable.
    tasks = [
        session.post(base + 'google', headers=headers, json={'query': f'{ticker} earnings 2026'}),
        session.post(base + 'reddit/search', headers=headers, json={'query': ticker}),
        session.post(base + 'youtube/search', headers=headers, json={'query': f'{ticker} earnings call'})
    ]
    serp, rdt, yt = await asyncio.gather(*tasks)
    return {'serp': await serp.json(), 'reddit': await rdt.json(), 'youtube': await yt.json()}
```
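The coroutine above takes a shared session but nothing calls it yet. A minimal driver, assuming `brief` and the `WATCHLIST` from Step 1, fans out across the whole watchlist with one `ClientSession`:

```python
async def fetch_all(tickers):
    # One session pools connections across every ticker's requests.
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(brief(session, t) for t in tickers))
        return dict(zip(tickers, results))

# data = asyncio.run(fetch_all(WATCHLIST))
```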
Step 3: Compose markdown brief
Claude turns the raw data into a 200-word brief.
```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

def compose(ticker, data):
    msg = client.messages.create(
        model='claude-sonnet-4-6', max_tokens=400,
        # Truncate the raw payload so the prompt stays within budget.
        messages=[{'role': 'user', 'content': f'Write 200-word trading brief on {ticker} from: {str(data)[:6000]}'}])
    return msg.content[0].text
```
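Composing the full set is then one comprehension over the fetched data, assuming the `fetch_all` sketch from Step 2:

```python
# data maps ticker -> {'serp': ..., 'reddit': ..., 'youtube': ...}
briefs = {ticker: compose(ticker, payload) for ticker, payload in data.items()}
```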
Step 4: Email digest
Aggregate all tickers into a single morning email.
```python
def daily_email(briefs):
    # Join per-ticker briefs under markdown headers, separated by horizontal rules.
    body = '\n\n---\n\n'.join(f'## {t}\n{b}' for t, b in briefs.items())
    # SMTP wiring left to reader
    return body
```
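If you do want the wiring, here is a minimal sketch using the standard-library `smtplib`; the host, port, addresses, and `SMTP_PASSWORD` variable are placeholders you would supply yourself:

```python
import os, smtplib
from email.mime.text import MIMEText

def send_digest(body):
    # Sketch only: host, addresses, and credentials below are placeholders.
    msg = MIMEText(body)
    msg['Subject'] = 'Daily trading brief'
    msg['From'] = 'agent@example.com'
    msg['To'] = 'you@example.com'
    with smtplib.SMTP('smtp.example.com', 587) as server:
        server.starttls()
        server.login('user', os.environ['SMTP_PASSWORD'])
        server.send_message(msg)
```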
Step 5: Schedule
Cron at 7 AM market time, weekdays only.
```bash
# crontab -e
# 0 7 * * 1-5 /usr/bin/python /path/to/trading_brief.py
```
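For the cron entry to do anything, trading_brief.py needs an entrypoint that ties the steps together. A sketch, assuming the `fetch_all`, `compose`, `daily_email`, and `send_digest` pieces above:

```python
def main():
    # Fetch -> compose -> email, in that order.
    data = asyncio.run(fetch_all(WATCHLIST))
    briefs = {t: compose(t, payload) for t, payload in data.items()}
    send_digest(daily_email(briefs))

if __name__ == '__main__':
    main()
```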
Python Example
```python
import os, requests

API_KEY = os.environ['SCAVIO_API_KEY']

def brief(ticker):
    # Three sequential requests: simpler than the async version, slower per ticker.
    serp = requests.post('https://api.scavio.dev/api/v1/google',
        headers={'x-api-key': API_KEY}, json={'query': f'{ticker} earnings 2026'}).json()
    rdt = requests.post('https://api.scavio.dev/api/v1/reddit/search',
        headers={'x-api-key': API_KEY}, json={'query': ticker}).json()
    yt = requests.post('https://api.scavio.dev/api/v1/youtube/search',
        headers={'x-api-key': API_KEY}, json={'query': f'{ticker} earnings call'}).json()
    # Keep only the top five results per surface.
    return {'serp': serp.get('organic_results', [])[:5],
            'reddit': rdt.get('posts', [])[:5],
            'youtube': yt.get('videos', [])[:5]}

print(brief('NVDA'))
```
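This synchronous version trades the concurrency of Step 2 for simplicity: for a handful of tickers, three sequential requests each are fine, but on a long watchlist the aiohttp variant finishes in roughly the time of the slowest request rather than the sum of all of them.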
JavaScript Example
```javascript
const API_KEY = process.env.SCAVIO_API_KEY;

export async function brief(ticker) {
  const headers = { 'x-api-key': API_KEY, 'Content-Type': 'application/json' };
  const base = 'https://api.scavio.dev/api/v1/';
  // Fire all three surface queries concurrently.
  const [serp, rdt, yt] = await Promise.all([
    fetch(base + 'google', { method: 'POST', headers, body: JSON.stringify({ query: `${ticker} earnings 2026` }) }).then(r => r.json()),
    fetch(base + 'reddit/search', { method: 'POST', headers, body: JSON.stringify({ query: ticker }) }).then(r => r.json()),
    fetch(base + 'youtube/search', { method: 'POST', headers, body: JSON.stringify({ query: `${ticker} earnings call` }) }).then(r => r.json())
  ]);
  return { serp, rdt, yt };
}
```
Expected Output
A daily 7 AM email with a 200-word brief per ticker on the watchlist. Each brief weaves SERP, Reddit, and YouTube context into a coherent narrative.