Overview
This workflow runs every Sunday evening and scans Reddit for demand signals relevant to side project ideas. It searches for threads where people describe unmet needs, request tools, or complain about existing solutions. Results are ranked by engagement and grouped by problem domain. The output is a weekly report of the top 10 demand signals with engagement scores and source URLs.
Trigger
Cron schedule (weekly, Sunday at 8 PM UTC)
Schedule
Weekly (Sunday at 8 PM UTC)
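If the trigger is a standard crontab rather than a hosted scheduler, the equivalent entry would be the following (the script path is illustrative):

```
0 20 * * 0 /usr/bin/python3 /opt/workflows/demand_scan.py
```

The five fields are minute, hour, day-of-month, month, and day-of-week; `0 20 * * 0` fires at 20:00 every Sunday.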
Workflow Steps
Define search queries
Maintain a list of demand-signal queries: 'I wish there was', 'anyone know a tool for', 'built a tool that', 'looking for alternative to'. Each captures a different demand pattern.
Search Reddit via Scavio
For each query, call Scavio Reddit search with sort by new. Collect threads with titles, scores, comment counts, and subreddit names.
Filter and rank
Filter to threads from the last 7 days. Rank by engagement score: upvotes + 2 * comments. Group by subreddit to identify which communities have the strongest demand.
Deduplicate similar threads
Merge threads about the same topic (fuzzy title matching) into a single demand signal with combined engagement scores.
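The merge step above can be sketched with the standard-library difflib for fuzzy title matching; the 0.8 similarity threshold is an arbitrary starting point to tune:

```python
from difflib import SequenceMatcher


def dedupe_signals(signals: list[dict], threshold: float = 0.8) -> list[dict]:
    """Merge threads with near-identical titles into one signal,
    summing engagement scores and collecting source URLs."""
    merged: list[dict] = []
    for sig in signals:
        for existing in merged:
            ratio = SequenceMatcher(
                None, sig["title"].lower(), existing["title"].lower()
            ).ratio()
            if ratio >= threshold:
                existing["engagement"] += sig["engagement"]
                existing["urls"].append(sig.get("url", ""))
                break
        else:
            merged.append({**sig, "urls": [sig.get("url", "")]})
    return merged
```

This is O(n²) in the number of threads, which is fine at weekly-report scale; for larger volumes, a token-based similarity index would be a better fit.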
Generate weekly report
Output the top 10 demand signals as a markdown report with problem description, engagement score, top thread URLs, and subreddit distribution.
Python Implementation
import os

import requests

SCAVIO_KEY = os.environ["SCAVIO_API_KEY"]
H = {"x-api-key": SCAVIO_KEY}

DEMAND_QUERIES = [
    "I wish there was a tool for",
    "anyone know a tool for",
    "looking for alternative to",
    "need a better way to",
    "built a tool that",
]

def search_reddit(query: str) -> list:
    """Search Reddit via Scavio and normalize each thread into a signal dict."""
    resp = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers=H,
        json={"platform": "reddit", "query": query, "sort": "new"},
        timeout=10,
    )
    resp.raise_for_status()
    threads = resp.json().get("organic", [])
    return [
        {
            "title": t["title"],
            "score": t.get("score", 0),
            "comments": t.get("comments", 0),
            # Engagement score: upvotes + 2 * comments (comments weighted double)
            "engagement": t.get("score", 0) + 2 * t.get("comments", 0),
            "url": t.get("link", ""),
            "query": query,
        }
        for t in threads
    ]

all_signals = []
for q in DEMAND_QUERIES:
    all_signals.extend(search_reddit(q))

# Rank by engagement, highest first
all_signals.sort(key=lambda x: x["engagement"], reverse=True)

print("=== Weekly Demand Report ===")
for i, s in enumerate(all_signals[:10], 1):
    print(f"{i}. [{s['engagement']}] {s['title']}")
    print(f"   Source: {s['url']}")
    print(f"   Query: {s['query']}\n")
JavaScript Implementation
const DEMAND_QUERIES = [
  "I wish there was a tool for",
  "anyone know a tool for",
  "looking for alternative to",
  "need a better way to",
  "built a tool that",
];

async function searchReddit(query) {
  // Search Reddit via Scavio and normalize each thread into a signal object
  const resp = await fetch("https://api.scavio.dev/api/v1/search", {
    method: "POST",
    headers: { "x-api-key": process.env.SCAVIO_API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ platform: "reddit", query, sort: "new" }),
  });
  if (!resp.ok) throw new Error(`Scavio search failed: ${resp.status}`);
  const threads = (await resp.json()).organic || [];
  return threads.map((t) => ({
    title: t.title,
    score: t.score || 0,
    comments: t.comments || 0,
    // Engagement score: upvotes + 2 * comments (comments weighted double)
    engagement: (t.score || 0) + 2 * (t.comments || 0),
    url: t.link || "",
    query,
  }));
}

// Top-level await requires an ES module (e.g. a .mjs file);
// otherwise wrap this in an async main function.
const allSignals = [];
for (const q of DEMAND_QUERIES) {
  allSignals.push(...(await searchReddit(q)));
}

// Rank by engagement, highest first
allSignals.sort((a, b) => b.engagement - a.engagement);

console.log("=== Weekly Demand Report ===");
allSignals.slice(0, 10).forEach((s, i) => {
  console.log(`${i + 1}. [${s.engagement}] ${s.title}`);
  console.log(`   Source: ${s.url}`);
  console.log(`   Query: ${s.query}\n`);
});
Platforms Used
Reddit: community posts & threaded comments from any subreddit