How to Build a Daily Content Brief Pipeline

Automate daily content research briefs using the Scavio API. Search trending topics, extract key angles, and deliver briefs to your content team each morning.

Content teams waste hours every morning researching what to write about. This tutorial builds an automated pipeline that runs daily, searches for trending topics and competitor content in your niche, identifies content gaps, and produces structured briefs your writers can execute immediately. The pipeline uses the Scavio API for search at $0.005 per query; the configuration below runs five searches (three trend queries plus two competitor checks), so a full daily brief with 5 topics costs about $0.025.

Prerequisites

  • Python 3.9+ installed
  • requests library installed
  • A Scavio API key from scavio.dev
  • A defined content niche and competitor list
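Before wiring up the pipeline, it helps to fail fast with a clear message if the API key isn't exported. A minimal sketch; the `require_env` helper is our own convenience function, not part of any Scavio SDK:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or exit with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise SystemExit(f'Missing required environment variable: {name}')
    return value

# The pipeline below reads its key once at startup, e.g.:
# SCAVIO_KEY = require_env('SCAVIO_API_KEY')
```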

Walkthrough

Step 1: Define your content niche and seed queries

Set up the topics and competitors to monitor daily. The pipeline will search these to find fresh content opportunities.

Python
import os, requests, json, time
from datetime import datetime

SCAVIO_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json'}
URL = 'https://api.scavio.dev/api/v1/search'

NICHE = 'developer tools and APIs'
SEED_QUERIES = [
    'developer tools trending 2026',
    'new API launches this week',
    'developer productivity tools',
]
COMPETITOR_DOMAINS = ['competitor1.com', 'competitor2.com']
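At $0.005 per query, you can estimate the daily cost of this configuration before running anything. A quick sketch using the seed queries and competitor domains defined above (each one triggers a single search):

```python
PRICE_PER_QUERY = 0.005  # Scavio per-query price quoted in the intro

def estimate_daily_cost(seed_queries: list, competitor_domains: list) -> float:
    """Each seed query and each competitor domain costs one search."""
    return (len(seed_queries) + len(competitor_domains)) * PRICE_PER_QUERY

# Three seed queries plus two competitor domains = 5 searches per day
print(f"${estimate_daily_cost(['a', 'b', 'c'], ['x.com', 'y.com']):.3f}")  # → $0.025
```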

Step 2: Search for trending content and competitor angles

Run the seed queries and competitor content searches. Extract titles, angles, and People Also Ask data as content opportunities.

Python
def search_trends(queries: list) -> list:
    all_results = []
    for q in queries:
        resp = requests.post(URL, headers=H,
            json={'query': q, 'country_code': 'us', 'num_results': 5})
        resp.raise_for_status()  # fail fast on auth or quota errors
        data = resp.json()
        results = data.get('organic_results', [])
        paa = data.get('people_also_ask', [])
        all_results.append({
            'query': q,
            'results': [{'title': r['title'], 'url': r['link'], 'snippet': r.get('snippet', '')} for r in results],
            'people_also_ask': [p.get('question', '') for p in paa],
        })
        time.sleep(0.3)
    return all_results

def check_competitors(domains: list) -> list:
    competitor_content = []
    for domain in domains:
        resp = requests.post(URL, headers=H,
            json={'query': f'site:{domain}', 'country_code': 'us', 'num_results': 5})
        resp.raise_for_status()
        results = resp.json().get('organic_results', [])
        competitor_content.extend([{'domain': domain, 'title': r['title'], 'url': r['link']} for r in results])
        time.sleep(0.3)
    return competitor_content

trends = search_trends(SEED_QUERIES)
competitor_content = check_competitors(COMPETITOR_DOMAINS)
print(f'Trend data: {sum(len(t["results"]) for t in trends)} results from {len(SEED_QUERIES)} queries')
print(f'Competitor content: {len(competitor_content)} pages from {len(COMPETITOR_DOMAINS)} domains')
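Search APIs occasionally return transient errors, and a daily pipeline should survive them rather than fail for the day. A generic retry wrapper you could put around the POST calls above; this is our own sketch, not part of the Scavio API, and the attempt count and backoff schedule are arbitrary choices:

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.5):
    """Call fn(), retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the original error
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage with the search call from Step 2:
# data = with_retries(lambda: requests.post(URL, headers=H, json=payload).json())
```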

Step 3: Generate structured content briefs

Analyze the search data to identify the top content opportunities and produce structured briefs with title, angle, keywords, and outline.

Python
def generate_briefs(trends: list, num_briefs: int = 5) -> list:
    # Collect all titles and PAA questions as potential angles
    angles = []
    for t in trends:
        for r in t['results']:
            angles.append({'title': r['title'], 'source': t['query'], 'snippet': r['snippet']})
        for q in t['people_also_ask']:
            angles.append({'title': q, 'source': f'PAA: {t["query"]}', 'snippet': ''})
    # Deduplicate by title similarity (simple approach)
    seen_words = set()
    unique_angles = []
    for a in angles:
        words = frozenset(a['title'].lower().split()[:5])
        if words not in seen_words:
            seen_words.add(words)
            unique_angles.append(a)
    briefs = []
    for a in unique_angles[:num_briefs]:
        # Rough format heuristic based on title keywords
        title_lower = a['title'].lower()
        if 'how' in title_lower:
            fmt = 'tutorial'
        elif any(w in title_lower for w in ('top', 'best')):
            fmt = 'listicle'
        else:
            fmt = 'analysis'
        briefs.append({
            'title': a['title'],
            'angle': a['snippet'][:150] if a['snippet'] else 'Based on trending search data',
            'source_query': a['source'],
            'suggested_format': fmt,
        })
    print(f'Daily Content Brief - {datetime.now().strftime("%Y-%m-%d")}')
    print('=' * 60)
    for i, b in enumerate(briefs, 1):
        print(f'\n{i}. {b["title"]}')
        print(f'   Format: {b["suggested_format"]}')
        print(f'   Angle: {b["angle"][:80]}')
        print(f'   Source: {b["source_query"]}')
    return briefs

briefs = generate_briefs(trends)
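`generate_briefs` only prints to stdout; to actually deliver briefs to your team each morning, write them somewhere persistent that a scheduled job (cron, for example) can hand off. A minimal sketch that saves a dated Markdown file; the filename pattern and layout are our own choices:

```python
from datetime import datetime
from pathlib import Path

def save_briefs(briefs: list, out_dir: str = '.') -> Path:
    """Write briefs to briefs-YYYY-MM-DD.md in out_dir and return the path."""
    today = datetime.now().strftime('%Y-%m-%d')
    path = Path(out_dir) / f'briefs-{today}.md'
    lines = [f'# Daily Content Brief - {today}', '']
    for i, b in enumerate(briefs, 1):
        lines.append(f'## {i}. {b["title"]}')
        lines.append(f'- Format: {b["suggested_format"]}')
        lines.append(f'- Angle: {b["angle"]}')
        lines.append(f'- Source: {b["source_query"]}')
        lines.append('')
    path.write_text('\n'.join(lines))
    return path

# path = save_briefs(briefs)  # schedule the whole script daily, e.g. via cron
```

The same formatted text could just as easily be posted to a Slack webhook or sent by email instead of written to disk.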

Python Example

Python
import os, requests, time
from datetime import datetime

SCAVIO_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json'}

def daily_brief(niche, num_topics=5):
    queries = [f'{niche} trending 2026', f'{niche} news this week', f'best {niche} tools']
    all_titles = []
    for q in queries:
        resp = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
            json={'query': q, 'country_code': 'us', 'num_results': 5})
        for r in resp.json().get('organic_results', []):
            all_titles.append(r['title'])
        time.sleep(0.3)
    print(f'Content Brief - {datetime.now().strftime("%Y-%m-%d")}')
    for i, title in enumerate(all_titles[:num_topics], 1):
        print(f'  {i}. {title[:60]}')
    print(f'Cost: ${len(queries) * 0.005:.3f}')

daily_brief('developer tools')

JavaScript Example

JavaScript
const SCAVIO_KEY = process.env.SCAVIO_API_KEY;

async function dailyBrief(niche) {
  const queries = [`${niche} trending 2026`, `${niche} news`, `best ${niche}`];
  const titles = [];
  for (const q of queries) {
    const resp = await fetch('https://api.scavio.dev/api/v1/search', {
      method: 'POST',
      headers: { 'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json' },
      body: JSON.stringify({ query: q, country_code: 'us', num_results: 5 })
    });
    const results = (await resp.json()).organic_results || [];
    titles.push(...results.map(r => r.title));
    await new Promise(res => setTimeout(res, 300)); // brief pause between queries, as in the Python example
  }
  console.log(`Content Brief - ${new Date().toISOString().slice(0, 10)}`);
  titles.slice(0, 5).forEach((t, i) => console.log(`  ${i + 1}. ${t.slice(0, 60)}`));
}

dailyBrief('developer tools');

Expected Output

Text
Daily Content Brief - 2026-05-16
============================================================

1. Top 10 Developer Productivity Tools for 2026
   Format: listicle
   Angle: A roundup of the most impactful developer tools released this year
   Source: developer tools trending 2026

2. How to Build AI-Powered API Testing Pipelines
   Format: tutorial
   Angle: Emerging pattern of using LLMs to generate and run API tests
   Source: new API launches this week

3. MCP Ecosystem: The New Standard for Tool Integration
   Format: analysis
   Angle: Model Context Protocol adoption is accelerating across AI tools
   Source: developer tools trending 2026

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What are the prerequisites?

Python 3.9+ with the requests library (or a JavaScript runtime with fetch), a Scavio API key from scavio.dev, and a defined content niche and competitor list. A Scavio API key gives you 250 free credits per month.

Can I complete this on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with other frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
