Overview
This workflow configures DeerFlow research agents to use Scavio as their search backend. Each research step queries live SERP data, verifies claims against current sources, and produces reports with timestamped citations.
Trigger
On-demand when a new research topic is assigned, or on a recurring schedule for standing research briefs.
Schedule
Daily at 8:00 AM
Workflow Steps
Configure DeerFlow Search Tool
Create a search tool function that DeerFlow agents can invoke at any research step. The tool wraps the Scavio API with retry logic and result normalization.
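One way to sketch this wrapper, assuming the endpoint and `organic_results` field shown in the implementation sections below; the `title`/`link`/`snippet` result fields and the `with_retries` helper are illustrative assumptions, not a documented Scavio schema:

```python
import os
import time

import requests

SCAVIO_URL = "https://api.scavio.dev/api/v1/search"

def with_retries(fn, attempts=3, backoff=1.0):
    """Call fn(), retrying on any exception with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(backoff * 2 ** i)

def normalize_results(payload):
    """Flatten a raw response into {title, url, snippet} dicts.
    The 'title'/'link'/'snippet' field names are assumptions."""
    return [
        {
            "title": r.get("title", ""),
            "url": r.get("link", ""),
            "snippet": r.get("snippet", ""),
        }
        for r in payload.get("organic_results", [])
    ]

def scavio_search(query, country_code="us"):
    """The tool function a DeerFlow agent invokes at any research step."""
    def call():
        resp = requests.post(
            SCAVIO_URL,
            headers={
                "x-api-key": os.environ["SCAVIO_API_KEY"],
                "Content-Type": "application/json",
            },
            json={"query": query, "country_code": country_code},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()
    return normalize_results(with_retries(call))
```

Registering `scavio_search` as a DeerFlow tool follows DeerFlow's own tool-binding conventions; the function itself only needs to accept a query string and return a uniform list of results.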
Define Research Steps with Search Queries
Break the research topic into discrete steps, each with a set of search queries to execute. This ensures comprehensive coverage of the topic.
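A plan can be as simple as a list of step records, each pairing a label with its queries. A minimal sketch (the step names and queries are purely illustrative):

```python
def make_step(name, queries):
    """One research step: a label, the queries to run for it,
    and an empty source list to be filled during execution."""
    return {"step": name, "queries": list(queries), "sources": []}

# Hypothetical plan for a topic; real plans come from the assigned topic.
research_plan = [
    make_step("Landscape", ["deerflow research agent overview"]),
    make_step("Integration", ["deerflow custom search tool",
                              "scavio search api endpoint"]),
]
```

Keeping queries explicit per step makes coverage auditable: every step's queries can be reviewed before any search runs.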
Execute Research with Source Collection
Run each research step, collect all sources, and build a citation index that maps claims to their source URLs.
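A sketch of the execution loop, with the search function injected so any backend (or a stub in tests) can be used; the step/result shapes are the assumed ones from the sketches above:

```python
def run_research(plan, search_fn):
    """Execute each step's queries and collect sources.

    plan: list of {"step", "queries", ...} dicts.
    search_fn: callable(query) -> list of {"title", "url", "snippet"} dicts.
    Returns a citation index mapping each unique URL to a citation number.
    """
    citation_index = {}
    for step in plan:
        for query in step["queries"]:
            for result in search_fn(query):
                url = result["url"]
                if url not in citation_index:
                    citation_index[url] = len(citation_index) + 1
                step.setdefault("sources", []).append(result)
    return citation_index
```

Because URLs are numbered in first-seen order, the same source cited by multiple steps keeps one citation number across the whole report.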
Generate Cited Research Report
Compile the research findings into a structured report with inline citations linking back to source URLs.
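A minimal renderer under the same assumptions: findings are (claim, source-URL) pairs, and the citation index maps URLs to the numbers used in inline [n] markers:

```python
import datetime

def render_report(title, findings, citation_index):
    """Build a plain-text report with inline [n] citations and a
    numbered source list. findings: list of (claim, url) pairs."""
    lines = [title, "Generated: " + datetime.date.today().isoformat(), ""]
    for claim, url in findings:
        lines.append("%s [%d]" % (claim, citation_index[url]))
    lines += ["", "Sources"]
    for url, n in sorted(citation_index.items(), key=lambda kv: kv[1]):
        lines.append("[%d] %s" % (n, url))
    return "\n".join(lines)
```

The "Generated" line gives each brief the timestamp mentioned in the Overview; swap in `datetime.datetime.now()` if time-of-day matters for recurring runs.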
Python Implementation
import os
import requests
headers = {'x-api-key': os.environ['SCAVIO_API_KEY'], 'Content-Type': 'application/json'}
payload = {'query': 'example', 'country_code': 'us'}
resp = requests.post('https://api.scavio.dev/api/v1/search', headers=headers, json=payload, timeout=30)
resp.raise_for_status()
data = resp.json()
print(len(data.get('organic_results', [])))
JavaScript Implementation
const headers = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
const body = JSON.stringify({query: 'example', country_code: 'us'});
fetch('https://api.scavio.dev/api/v1/search', {method: 'POST', headers, body})
  .then(r => r.json())
  .then(d => console.log(d.organic_results?.length ?? 0));
Platforms Used
Google
Web search with knowledge graph, People Also Ask (PAA), and AI overviews
YouTube
Video search with transcripts and metadata
Reddit
Community posts & threaded comments from any subreddit