Tutorial

How to Build a Multi-Source Trading Brief Agent

Build an AI agent that aggregates SERP, Reddit, and YouTube results into a daily ticker brief, replacing 12 open tabs with one typed JSON brief per ticker.

Independent traders drown in data. The right agent runs SERP, Reddit, and YouTube searches in parallel for each ticker and returns a typed brief. This tutorial wires Scavio and Claude into a daily brief agent for a configurable watchlist.

Prerequisites

  • Python 3.10+
  • Scavio API key
  • Anthropic API key

Walkthrough

Step 1: Define watchlist

List of tickers to brief daily.

Python
WATCHLIST = ['AAPL', 'NVDA', 'TSLA', 'AMZN', 'GOOG']
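If you'd rather not edit the script to change tickers, an optional environment override keeps the default list as a fallback. A minimal sketch; the `WATCHLIST` variable name and `load_watchlist` helper are illustrative, not part of the Scavio API:

Python

```python
import os

DEFAULT_WATCHLIST = ['AAPL', 'NVDA', 'TSLA', 'AMZN', 'GOOG']

def load_watchlist(env_var='WATCHLIST'):
    """Read a comma-separated ticker list from the environment,
    falling back to the hard-coded default. Tickers are upper-cased
    and blank entries dropped."""
    raw = os.environ.get(env_var, '')
    tickers = [t.strip().upper() for t in raw.split(',') if t.strip()]
    return tickers or DEFAULT_WATCHLIST
```

Then `WATCHLIST=msft,amd python trading_brief.py` briefs those two tickers without touching the code.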

Step 2: Parallel multi-surface fetch

SERP, Reddit, YouTube concurrently per ticker.

Python
import os, asyncio, aiohttp
API_KEY = os.environ['SCAVIO_API_KEY']

async def brief(session, ticker):
    headers = {'x-api-key': API_KEY}
    base = 'https://api.scavio.dev/api/v1/'
    tasks = [
        session.post(base+'google', headers=headers, json={'query': f'{ticker} earnings 2026'}),
        session.post(base+'reddit/search', headers=headers, json={'query': ticker}),
        session.post(base+'youtube/search', headers=headers, json={'query': f'{ticker} earnings call'})
    ]
    serp, rdt, yt = await asyncio.gather(*tasks)
    return {'serp': await serp.json(), 'reddit': await rdt.json(), 'youtube': await yt.json()}
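Scraping endpoints occasionally return transient errors, and one flaky surface shouldn't sink the whole brief. A small retry wrapper with exponential backoff can guard each fetch; this is a sketch, and the attempt count and delays are arbitrary:

Python

```python
import asyncio

async def with_retry(coro_factory, attempts=3, base_delay=1.0):
    """Await coro_factory() up to `attempts` times, doubling the
    delay after each failure. Re-raises the last error. A factory
    (not a coroutine) is required because a coroutine can only be
    awaited once."""
    for attempt in range(attempts):
        try:
            return await coro_factory()
        except Exception:
            if attempt == attempts - 1:
                raise
            await asyncio.sleep(base_delay * 2 ** attempt)
```

Inside `brief`, each task would become e.g. `with_retry(lambda: session.post(base + 'google', headers=headers, json=...))`.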

Step 3: Compose markdown brief

Claude turns the raw data into a 200-word brief.

Python
import anthropic
client = anthropic.Anthropic()

def compose(ticker, data):
    msg = client.messages.create(
        model='claude-sonnet-4-6', max_tokens=400,
        messages=[{'role':'user','content':f'Write 200-word trading brief on {ticker} from: {str(data)[:6000]}'}])
    return msg.content[0].text
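The `str(data)[:6000]` truncation above works, but it can cut mid-record and wastes tokens on Python repr noise. A sketch of a gentler alternative that trims each surface to its top items before serializing to compact JSON; the per-surface count and character cap are arbitrary, and it assumes each surface value is a list of results as in the synchronous example below:

Python

```python
import json

def trim_payload(data, per_source=5, max_chars=6000):
    """Keep only the first few items from each surface, then
    serialize to compact JSON, hard-capped at max_chars."""
    trimmed = {
        source: items[:per_source] if isinstance(items, list) else items
        for source, items in data.items()
    }
    return json.dumps(trimmed, default=str)[:max_chars]
```

In `compose`, the prompt would then interpolate `trim_payload(data)` instead of `str(data)[:6000]`.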

Step 4: Email digest

Aggregate all tickers into morning email.

Python
def daily_email(briefs):
    body = '\n\n---\n\n'.join(f'## {t}\n{b}' for t,b in briefs.items())
    # SMTP wiring left to reader
    return body
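The SMTP wiring left to the reader can be sketched with the standard library. Host, port, and addresses below are placeholders, and delivery is split into a separate function so the message-building step stays side-effect free and testable:

Python

```python
import smtplib
from email.message import EmailMessage

def build_digest(briefs, sender='briefs@example.com', to='me@example.com'):
    """Assemble the morning digest as a plain-text email message."""
    msg = EmailMessage()
    msg['Subject'] = 'Daily trading brief'
    msg['From'] = sender
    msg['To'] = to
    msg.set_content('\n\n---\n\n'.join(f'## {t}\n{b}' for t, b in briefs.items()))
    return msg

def send_digest(msg, host='localhost', port=587):
    """Deliver via SMTP with STARTTLS. Authentication omitted;
    add smtp.login(...) for a real provider."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        smtp.send_message(msg)
```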

Step 5: Schedule

Run via cron at 7 AM local time, weekdays only.

Bash
# Add via `crontab -e` — runs weekdays at 7 AM
0 7 * * 1-5 /usr/bin/python /path/to/trading_brief.py
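The steps above compose into a single entry point for cron to invoke. A sketch in which the fetch and compose steps are passed in as callables, so the pipeline can be exercised with stubs and without network access; the `run_daily` name is an assumption:

Python

```python
def run_daily(watchlist, fetch, compose):
    """Build a {ticker: brief} dict: fetch raw data for each ticker,
    then compose it into prose. `fetch` and `compose` are the
    functions from steps 2-3 (or stubs, in tests)."""
    briefs = {}
    for ticker in watchlist:
        data = fetch(ticker)
        briefs[ticker] = compose(ticker, data)
    return briefs
```

The script cron runs would call `run_daily(WATCHLIST, brief, compose)` and hand the result to the email step.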

Python Example

Python
import os, requests
API_KEY = os.environ['SCAVIO_API_KEY']

def brief(ticker):
    serp = requests.post('https://api.scavio.dev/api/v1/google',
        headers={'x-api-key': API_KEY}, json={'query': f'{ticker} earnings 2026'}).json()
    rdt = requests.post('https://api.scavio.dev/api/v1/reddit/search',
        headers={'x-api-key': API_KEY}, json={'query': ticker}).json()
    yt = requests.post('https://api.scavio.dev/api/v1/youtube/search',
        headers={'x-api-key': API_KEY}, json={'query': f'{ticker} earnings call'}).json()
    return {'serp': serp.get('organic_results',[])[:5], 'reddit': rdt.get('posts',[])[:5], 'youtube': yt.get('videos',[])[:5]}

print(brief('NVDA'))
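The "typed JSON brief" the intro promises can be pinned down with a TypedDict, so downstream code knows exactly which keys to expect. The field names mirror the dict returned by `brief()` above; the `as_brief` coercion helper is illustrative:

Python

```python
from typing import TypedDict

class TickerBrief(TypedDict):
    """Shape of the per-ticker payload returned by brief()."""
    serp: list     # top organic search results
    reddit: list   # top Reddit posts
    youtube: list  # top YouTube videos

def as_brief(raw: dict) -> TickerBrief:
    """Coerce a raw response dict into the typed shape,
    defaulting missing surfaces to empty lists."""
    return TickerBrief(
        serp=raw.get('serp', []),
        reddit=raw.get('reddit', []),
        youtube=raw.get('youtube', []),
    )
```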

JavaScript Example

JavaScript
const API_KEY = process.env.SCAVIO_API_KEY;
export async function brief(ticker) {
  const headers = { 'x-api-key': API_KEY, 'Content-Type': 'application/json' };
  const base = 'https://api.scavio.dev/api/v1/';
  const [serp, rdt, yt] = await Promise.all([
    fetch(base+'google', { method:'POST', headers, body: JSON.stringify({ query: `${ticker} earnings 2026` }) }).then(r => r.json()),
    fetch(base+'reddit/search', { method:'POST', headers, body: JSON.stringify({ query: ticker }) }).then(r => r.json()),
    fetch(base+'youtube/search', { method:'POST', headers, body: JSON.stringify({ query: `${ticker} earnings call` }) }).then(r => r.json())
  ]);
  return { serp, rdt, yt };
}

Expected Output

A daily 7 AM email with a 200-word brief per ticker on the watchlist. Each brief weaves SERP, Reddit, and YouTube context into a coherent narrative.

Frequently Asked Questions

How long does this tutorial take?

Most developers complete it in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What are the prerequisites?

Python 3.10+, a Scavio API key, and an Anthropic API key. The Scavio free tier includes 500 credits per month.

Can I complete it on the free tier?

Yes. The free tier's 500 monthly credits are more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with agent frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.

Start Building

Grab your Scavio and Anthropic API keys, drop in your watchlist, and schedule the agent; your first daily brief lands tomorrow at 7 AM.