Tutorial

How to Build a Reddit Monitoring Agent

Build a scheduled Reddit monitoring agent in Python that diffs search results over time and alerts on new brand mentions. Uses the Scavio Reddit API.

A brand monitoring agent watches a keyword across Reddit and tells you when something new shows up. This tutorial shows how to build one in about 40 lines of Python: search Reddit on a schedule, persist the post ids you have seen, and alert on new ones. You can swap the alert channel for Slack, email, or any webhook.

Prerequisites

  • Python 3.8 or higher
  • requests library installed
  • A Scavio API key
  • A keyword to monitor (brand name, product, competitor)

Walkthrough

Step 1: Define the keyword and state file

Keep a tiny JSON file of post ids you have already seen so repeat runs do not re-alert.

Python
KEYWORD = "your brand name"
STATE_FILE = "reddit_seen.json"

Step 2: Fetch fresh posts

Sort by "new" to bias toward recent posts. You can also pass a subreddit-scoped query like 'r/startups yourbrand'.

Python
import os, requests

resp = requests.post(
    "https://api.scavio.dev/api/v1/reddit/search",
    headers={"Authorization": f"Bearer {os.environ['SCAVIO_API_KEY']}"},
    json={"query": KEYWORD, "sort": "new"},
    timeout=30,
)
resp.raise_for_status()  # fail fast on auth or rate-limit errors
posts = resp.json()["data"]["posts"]
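The subreddit-scoped query mentioned above can be assembled with a small helper. This is an illustrative sketch: scoped_query is a hypothetical name, and the "r/<subreddit> <keyword>" format simply mirrors the example in this step.

```python
from typing import Optional

def scoped_query(keyword: str, subreddit: Optional[str] = None) -> str:
    # Prefix the keyword with "r/<subreddit>" to restrict the search,
    # matching the query format shown above.
    return f"r/{subreddit} {keyword}" if subreddit else keyword
```

You would then pass scoped_query(KEYWORD, "startups") as the "query" value instead of the bare keyword.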

Step 3: Diff against the state file

Load the previously seen post ids, compute the delta, and treat new ids as alerts.

Python
import json, pathlib

state = pathlib.Path(STATE_FILE)
seen = set(json.loads(state.read_text())) if state.exists() else set()

new_posts = [p for p in posts if p["id"] not in seen]
seen.update(p["id"] for p in posts)
state.write_text(json.dumps(list(seen)))
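The state file above grows without bound over months of scheduled runs. If you track seen ids as an ordered list instead of a set, you can cap the file size; save_seen and MAX_SEEN below are illustrative names, and 5000 is an arbitrary limit to tune to your search volume.

```python
import json, pathlib

MAX_SEEN = 5000  # hypothetical cap; tune to your search volume

def save_seen(ids):
    # Persist only the most recent MAX_SEEN ids so the state file
    # stays small no matter how long the agent runs.
    pathlib.Path("reddit_seen.json").write_text(json.dumps(list(ids)[-MAX_SEEN:]))
```

Dropping the oldest ids is safe here because Reddit's "new" sort rarely resurfaces old posts, so the risk of re-alerting on a pruned id is low.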

Step 4: Emit an alert

Here we just print, but this is where you would POST to a Slack webhook or enqueue into your job system.

Python
for p in new_posts:
    print(f"NEW mention: r/{p['subreddit']} -- {p['title']}")
    print(f"  {p['url']}")
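A Slack version of this step might look like the sketch below. SLACK_WEBHOOK_URL is a hypothetical environment variable holding a Slack incoming-webhook URL; those webhooks accept a JSON body with a "text" field.

```python
import os, requests

SLACK_WEBHOOK_URL = os.environ.get("SLACK_WEBHOOK_URL")  # hypothetical env var

def slack_payload(post):
    # Slack incoming webhooks accept a JSON body with a "text" field.
    return {
        "text": f"New mention in r/{post['subreddit']}: {post['title']}\n{post['url']}"
    }

def alert(post):
    if SLACK_WEBHOOK_URL:
        requests.post(SLACK_WEBHOOK_URL, json=slack_payload(post), timeout=10)
    else:
        # Fall back to stdout when no webhook is configured.
        print(f"NEW mention: r/{post['subreddit']} -- {post['title']}")
```

Keeping the payload builder separate from the HTTP call makes the formatting easy to unit-test without a live webhook.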

Python Example

Python
import os, json, pathlib, requests

API_KEY = os.environ["SCAVIO_API_KEY"]
KEYWORD = os.environ.get("MONITOR_KEYWORD", "scavio")
STATE = pathlib.Path("reddit_seen.json")

def fetch(query: str):
    r = requests.post(
        "https://api.scavio.dev/api/v1/reddit/search",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"query": query, "sort": "new"},
        timeout=30,
    )
    r.raise_for_status()
    return r.json()["data"]["posts"]

def load_state():
    if STATE.exists():
        return set(json.loads(STATE.read_text()))
    return set()

def save_state(ids):
    STATE.write_text(json.dumps(list(ids)))

def main():
    seen = load_state()
    posts = fetch(KEYWORD)
    new = [p for p in posts if p["id"] not in seen]
    for p in new:
        print(f"NEW r/{p['subreddit']}: {p['title']} ({p['url']})")
    save_state(seen | {p["id"] for p in posts})

if __name__ == "__main__":
    main()
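To make this a scheduled agent, run main() on an interval. A crontab entry such as */15 * * * * python monitor.py works without any extra code; the sketch below is an in-process alternative, with run_forever and POLL_SECONDS as hypothetical names.

```python
import time

POLL_SECONDS = 900  # hypothetical interval: check every 15 minutes

def run_forever(job, sleep=time.sleep):
    # Minimal in-process scheduler; a cron entry calling the script
    # achieves the same thing without a long-lived process.
    while True:
        try:
            job()
        except Exception as exc:
            # Keep polling even if one run fails (network blip, rate limit).
            print(f"run failed: {exc!r}")
        sleep(POLL_SECONDS)
```

Calling run_forever(main) would poll every 15 minutes; the injectable sleep parameter exists only to keep the loop testable.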

JavaScript Example

JavaScript
import fs from "node:fs";
const API_KEY = process.env.SCAVIO_API_KEY;
const KEYWORD = process.env.MONITOR_KEYWORD ?? "scavio";
const STATE = "reddit_seen.json";

async function fetchPosts(query) {
  const r = await fetch("https://api.scavio.dev/api/v1/reddit/search", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query, sort: "new" }),
  });
  if (!r.ok) throw new Error(`Scavio API error: ${r.status}`);
  return (await r.json()).data.posts;
}

const seen = new Set(
  fs.existsSync(STATE) ? JSON.parse(fs.readFileSync(STATE, "utf8")) : []
);
const posts = await fetchPosts(KEYWORD);
const fresh = posts.filter((p) => !seen.has(p.id));
for (const p of fresh) {
  console.log(`NEW r/${p.subreddit}: ${p.title}`);
}
for (const p of posts) seen.add(p.id); // dedupe before persisting
fs.writeFileSync(STATE, JSON.stringify([...seen]));

Expected Output

Text
NEW r/SaaS: Has anyone used scavio for Reddit search?
  https://www.reddit.com/r/SaaS/comments/1smxyz1/
NEW r/devtools: scavio vs serpapi for agent builders
  https://www.reddit.com/r/devtools/comments/1smxyz2/


Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What are the prerequisites?

Python 3.8 or higher, the requests library, a Scavio API key, and a keyword to monitor (brand name, product, or competitor).

Can I complete this on the free tier?

Yes. The free tier includes 500 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with agent frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
