
AI-Native Cybersecurity News Pipeline Workflow

6-burst-per-day cybersecurity news pipeline. 9 sources, similarity-filter dedup, AI editor, daily 11:30 PM recap.

Overview

Multi-source AI-native cybersecurity news pipeline: 6 cron triggers per day pull from 9 sources via SERP-scoped site: queries plus Reddit, a similarity filter dedupes the results, an LLM edits each item with an editorial angle, publication is throttled, and a daily recap runs at 11:30 PM.

Trigger

6× daily (6 AM, 10 AM, 12 PM, 3 PM, 6 PM, 9 PM)


Workflow Steps

1. Cron triggers per burst: 6 evenly spread firings per day.
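The six firings stated in the Trigger section map to a single crontab entry. A minimal sketch, assuming server local time and a hypothetical burst.py entry point:

```
# 6 AM, 10 AM, 12 PM, 3 PM, 6 PM, 9 PM: one crontab line
0 6,10,12,15,18,21 * * * /usr/bin/python3 /opt/pipeline/burst.py
```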

2. LLM generates 6 fresh queries per burst: avoids topics covered earlier today.
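One way to steer the LLM away from already-covered topics is to fold a running topic log into the prompt. A minimal sketch; the prompt wording and the covered-topics log are assumptions, and the actual LLM call is elided:

```python
# Sketch of per-burst query-prompt construction. The pipeline's real
# prompt template is not documented here; this wording is illustrative.
def build_query_prompt(covered_topics, n_queries=6):
    """Build a prompt asking the LLM for fresh cybersecurity news queries,
    excluding topics already covered today."""
    avoid = "; ".join(covered_topics) if covered_topics else "none yet"
    return (
        f"Generate {n_queries} distinct cybersecurity news search queries "
        f"for today. Avoid these topics already covered today: {avoid}."
    )

# Example: the second burst of the day knows what the first one published.
prompt = build_query_prompt(["MOVEit exploitation", "Chrome zero-day"])
```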

3. Parallel SERP across 9 sources: site: queries to TheHackerNews, BleepingComputer, KrebsOnSecurity, etc.

4. Reddit endpoint for community-flagged stories: r/cybersecurity, r/netsec.
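The Reddit pull goes through the same unified search endpoint as the news queries. A sketch of the request payload only; the 'reddit' search_type value and the 'subreddit' field name are assumptions, not confirmed API parameters:

```python
# Hypothetical payload builder for the Reddit step. Field names are
# illustrative; consult the Scavio API reference for the real schema.
def reddit_payload(subreddit, query="security incident"):
    return {"query": query, "search_type": "reddit", "subreddit": subreddit}

payloads = [reddit_payload(s) for s in ("cybersecurity", "netsec")]
```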

5. Similarity-filter dedup: embed titles, drop duplicates above 0.85 cosine similarity.
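The dedup step keeps an item only if its title embedding stays below the 0.85 cosine threshold against every already-kept item. A minimal sketch; the embedding model is not specified in this workflow, so `embed` is a placeholder for whatever title-embedding function the pipeline uses:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def dedup(items, embed, threshold=0.85):
    """Drop any item whose title embedding is >= threshold cosine
    similarity to a previously kept item."""
    kept, vecs = [], []
    for item in items:
        v = embed(item["title"])
        if all(cosine(v, kv) < threshold for kv in vecs):
            kept.append(item)
            vecs.append(v)
    return kept
```

Greedy first-wins dedup like this keeps the earliest-seen version of a story, which matches a pipeline that pulls higher-priority sources first.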

6. LLM edit with editorial angle: per item, a ~300-word article with internal links.

7. Throttled publication cadence: 5-minute spacing between published items.
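The 5-minute spacing can be enforced with a simple pacing loop. A minimal sketch, assuming `publish` is whatever CMS call the pipeline uses; the `sleep` parameter is injectable so the cadence can be tested without waiting:

```python
import time

def publish_throttled(items, publish, spacing_s=300, sleep=time.sleep):
    """Publish items in order, pausing spacing_s seconds between them."""
    for i, item in enumerate(items):
        if i:  # no pause before the first item
            sleep(spacing_s)
        publish(item)
```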

8. Daily 11:30 PM recap: aggregates the day's published items.
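The recap step only needs the day's published items. A minimal sketch, assuming each published item carries 'title' and 'url' fields (the actual item schema is not documented here):

```python
from datetime import date

def build_recap(published, today=None):
    """Assemble the 11:30 PM recap from the day's published items."""
    today = today or date.today().isoformat()
    lines = [f"Cybersecurity recap for {today}"]
    lines += [f"- {p['title']}: {p['url']}" for p in published]
    return "\n".join(lines)
```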

Python Implementation

Python
import os
import requests

API_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': API_KEY}
# site:-scoped queries; extend this list to cover all 9 sources
SOURCES = ['site:thehackernews.com 2026', 'site:bleepingcomputer.com 2026', 'site:krebsonsecurity.com']

def pull_burst():
    """Pull one burst of news items across all configured sources."""
    items = []
    for q in SOURCES:
        r = requests.post(
            'https://api.scavio.dev/api/v1/search',
            headers=H,
            json={'query': q, 'search_type': 'news'},
            timeout=30,
        )
        r.raise_for_status()
        items += r.json().get('news_results', [])
    return items

JavaScript Implementation

JavaScript
const H = { 'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json' };
// site:-scoped queries; extend this list to cover all 9 sources
const SOURCES = ['site:thehackernews.com 2026', 'site:bleepingcomputer.com 2026'];

async function burst() {
  const items = [];
  for (const q of SOURCES) {
    const res = await fetch('https://api.scavio.dev/api/v1/search', {
      method: 'POST',
      headers: H,
      body: JSON.stringify({ query: q, search_type: 'news' }),
    });
    if (!res.ok) throw new Error(`search failed: ${res.status}`);
    const r = await res.json();
    items.push(...(r.news_results || []));
  }
  return items;
}

Platforms Used

Google

Web search with knowledge graph, People Also Ask (PAA), and AI overviews

Reddit

Community, posts & threaded comments from any subreddit

Frequently Asked Questions

What does this workflow do?

Multi-source AI-native cybersecurity news pipeline: 6 cron triggers per day pull from 9 sources via SERP-scoped queries plus Reddit, a similarity filter dedupes, an LLM edits with an editorial angle, publication is throttled, and a daily recap runs at 11:30 PM.

How often does it run?

6× daily: 6 AM, 10 AM, 12 PM, 3 PM, 6 PM, and 9 PM.

Which platforms does it use?

This workflow uses the following Scavio platforms: google, reddit. Each platform is called via the same unified API endpoint.

Can I try it for free?

Yes. Scavio's free tier includes 500 credits per month with no credit card required. That is enough to test and validate this workflow before scaling it.
