Daily News Digest Agent

Build an automated multi-source news aggregation agent with Scavio. Search Google News for topics daily and compile a curated digest.

Overview

This workflow acts as a personal news agent. Every morning it searches Google News for a set of topics you care about, deduplicates articles across topics, ranks them by relevance, and delivers a curated digest. It replaces scattered RSS feeds and manual news scanning with a single automated pipeline.

Trigger

Cron schedule (daily at 6 AM UTC)

Schedule

Runs daily at 6 AM UTC
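On a Unix host, this trigger maps to a standard crontab entry. The script path and log path below are placeholders, and the entry assumes the cron daemon's clock is set to UTC:

```
# m h dom mon dow  command  (fires daily at 06:00)
0 6 * * * /usr/bin/python3 /opt/agents/news_digest.py >> /var/log/news_digest.log 2>&1
```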

Workflow Steps

1. Define topics

Load the list of news topics and associated queries from configuration (e.g., AI regulation, SaaS funding, search API industry).
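Topics can live in a config file instead of being hardcoded. A minimal loader sketch, assuming a JSON file named topics.json that maps labels to queries (both the filename and the fallback defaults are illustrative):

```python
import json
from pathlib import Path

# Fallback topics so the agent still runs before a config file exists.
DEFAULT_TOPICS = {
    "AI Regulation": "AI regulation policy 2026",
    "SaaS Funding": "SaaS startup funding rounds",
    "Search Industry": "search API industry news",
}

def load_topics(path: str = "topics.json") -> dict[str, str]:
    """Load {label: query} pairs from a JSON file, falling back to defaults."""
    p = Path(path)
    if p.exists():
        return json.loads(p.read_text())
    return DEFAULT_TOPICS
```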

2. Search Google News

Call the Scavio API with platform google-news for each topic to retrieve the latest articles.

3. Deduplicate articles

Remove duplicate articles that appear across multiple topic searches by matching on URL.
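Matching on the raw URL misses near-duplicates that differ only in tracking parameters or a trailing slash. A normalization helper can make the URL match stricter; this is a sketch, and the tracking-parameter list is illustrative:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Common tracking parameters that vary between copies of the same article link.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "fbclid", "gclid"}

def normalize_url(url: str) -> str:
    """Lowercase the host, drop the fragment, trailing slash, and tracking params."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc.lower(),
                       parts.path.rstrip("/"), urlencode(query), ""))
```

Dedup then keys the `seen_urls` set on `normalize_url(url)` instead of the raw link.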

4. Rank and filter

Score articles by recency and relevance. Keep the top N articles for the digest.

5. Format digest

Generate a structured digest with sections per topic, each containing headline, source, snippet, and link.
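The digest dict built in the implementations below ({topic: [articles]}) renders naturally to Markdown, which works for email, Slack, and Notion alike. A minimal formatter sketch:

```python
def format_digest(digest: dict[str, list[dict]], date_str: str) -> str:
    """Render {topic: [{title, source, snippet, link}, ...]} as Markdown."""
    lines = [f"# News Digest for {date_str}", ""]
    for topic, articles in digest.items():
        lines.append(f"## {topic}")
        if not articles:
            lines.append("_No new articles today._")
        for a in articles:
            lines.append(f"- [{a['title']}]({a['link']}) ({a['source']})")
            if a.get("snippet"):
                lines.append(f"  {a['snippet']}")
        lines.append("")
    return "\n".join(lines)
```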

6. Deliver

Send the digest via email, Slack, or write to a Notion page.
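For Slack, an incoming-webhook POST is the simplest delivery route. A sketch, assuming you have created an incoming webhook in your Slack workspace (the URL below is a placeholder):

```python
import requests

def deliver_to_slack(text: str, webhook_url: str) -> None:
    """Post digest text to a Slack incoming webhook; raises on HTTP errors."""
    res = requests.post(webhook_url, json={"text": text}, timeout=15)
    res.raise_for_status()

# Example (placeholder URL):
# deliver_to_slack(digest_text, "https://hooks.slack.com/services/T000/B000/XXXX")
```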

Python Implementation

Python
import requests
from datetime import datetime

API_KEY = "your_scavio_api_key"
MAX_PER_TOPIC = 5

def search_news(query: str) -> list[dict]:
    res = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers={"x-api-key": API_KEY},
        json={"platform": "google-news", "query": query},
        timeout=15,
    )
    res.raise_for_status()
    return res.json().get("organic", [])

def run():
    topics = {
        "AI Regulation": "AI regulation policy 2026",
        "SaaS Funding": "SaaS startup funding rounds",
        "Search Industry": "search API industry news",
    }

    seen_urls = set()
    digest = {}

    for label, query in topics.items():
        articles = search_news(query)
        unique = []
        for a in articles:
            url = a.get("link", "")
            if url and url not in seen_urls:
                seen_urls.add(url)
                unique.append({
                    "title": a.get("title", ""),
                    "source": a.get("source", ""),
                    "snippet": a.get("snippet", ""),
                    "link": url,
                })
        digest[label] = unique[:MAX_PER_TOPIC]

    date_str = datetime.utcnow().strftime("%Y-%m-%d")
    print(f"News Digest for {date_str}")
    for topic, articles in digest.items():
        print(f"\n--- {topic} ---")
        for a in articles:
            print(f"  {a['title']} ({a['source']})")
            print(f"  {a['link']}")

if __name__ == "__main__":
    run()

JavaScript Implementation

JavaScript
const API_KEY = "your_scavio_api_key";
const MAX_PER_TOPIC = 5;

async function searchNews(query) {
  const res = await fetch("https://api.scavio.dev/api/v1/search", {
    method: "POST",
    headers: {
      "x-api-key": API_KEY,
      "content-type": "application/json",
    },
    body: JSON.stringify({ platform: "google-news", query }),
  });
  if (!res.ok) throw new Error(`scavio ${res.status}`);
  const data = await res.json();
  return data.organic ?? [];
}

async function run() {
  const topics = {
    "AI Regulation": "AI regulation policy 2026",
    "SaaS Funding": "SaaS startup funding rounds",
    "Search Industry": "search API industry news",
  };

  const seenUrls = new Set();
  const digest = {};

  for (const [label, query] of Object.entries(topics)) {
    const articles = await searchNews(query);
    const unique = [];
    for (const a of articles) {
      if (a.link && !seenUrls.has(a.link)) {
        seenUrls.add(a.link);
        unique.push({
          title: a.title ?? "",
          source: a.source ?? "",
          snippet: a.snippet ?? "",
          link: a.link,
        });
      }
    }
    digest[label] = unique.slice(0, MAX_PER_TOPIC);
  }

  const date = new Date().toISOString().slice(0, 10);
  console.log(`News Digest for ${date}`);
  for (const [topic, articles] of Object.entries(digest)) {
    console.log(`\n--- ${topic} ---`);
    for (const a of articles) {
      console.log(`  ${a.title} (${a.source})`);
      console.log(`  ${a.link}`);
    }
  }
}

run();

Platforms Used

Google News

News search with headlines and sources

Frequently Asked Questions

What does this workflow do?

It searches Google News every morning for a set of topics you care about, deduplicates articles across topics, ranks them by relevance, and delivers a curated digest, replacing scattered RSS feeds and manual news scanning with a single automated pipeline.

When does it run?

On a cron schedule, daily at 6 AM UTC.

Which Scavio platforms does it use?

google-news. Each platform is called via the same unified API endpoint.

Can I try it for free?

Yes. Scavio's free tier includes 500 credits per month with no credit card required. That is enough to test and validate this workflow before scaling it.
