Competitor Brand Monitoring Email Digest

Monitor competitor brand mentions across Google SERP daily and receive an email digest with position changes, new content, and Reddit mentions.

Overview

This workflow monitors competitor brand keywords across Google SERP every day. It tracks organic position changes, detects new content from competitors (blog posts, landing pages, press releases), and includes Reddit mention counts. The digest email arrives before your morning standup with actionable insights.

Trigger

Cron schedule (daily at 6:30 AM UTC)

Schedule

Daily at 6:30 AM UTC

Workflow Steps

1. Load competitor brand keywords

Read competitor names and branded keywords from config. Include variations like 'CompetitorName review', 'CompetitorName pricing', 'CompetitorName vs'.
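The config load in this step can be sketched as follows. The `competitors.json` filename and its `{"competitors": [...]}` schema are assumptions for illustration, not part of the workflow:

```python
import json
from pathlib import Path

# Branded keyword variations to expand for each competitor name
VARIATIONS = ["{} review", "{} pricing", "{} vs"]

def expand_keywords(names):
    """Expand each competitor name into its branded keyword variations."""
    return {name: [v.format(name) for v in VARIATIONS] for name in names}

def load_keywords(config_path="competitors.json"):
    """Read competitor names from a JSON config file (hypothetical schema:
    {"competitors": ["Tavily", "Serper"]}) and expand them."""
    names = json.loads(Path(config_path).read_text())["competitors"]
    return expand_keywords(names)
```

Keeping the variation templates separate from the names means adding a competitor is a one-line config change.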

2. Search Google for each keyword

Call Scavio Google search for each branded keyword. Record the top 10 organic positions with titles and URLs.

3. Diff against yesterday

Compare today's top 10 against yesterday's stored data. Identify new URLs (new content), position changes, and disappeared results.
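The full diff for one keyword, including the position changes this step describes (the implementations below only track new URLs), can be sketched as:

```python
def diff_results(old, new):
    """Compare yesterday's and today's top-10 lists for one keyword.
    Each entry has the shape {"pos": int, "title": str, "url": str}."""
    old_pos = {r["url"]: r["pos"] for r in old}
    new_pos = {r["url"]: r["pos"] for r in new}
    return {
        "new_urls": sorted(set(new_pos) - set(old_pos)),       # new content
        "dropped_urls": sorted(set(old_pos) - set(new_pos)),   # disappeared results
        "moved": {u: (old_pos[u], new_pos[u])                  # (old, new) positions
                  for u in set(old_pos) & set(new_pos)
                  if old_pos[u] != new_pos[u]},
    }
```

Keying by URL rather than position makes the diff robust when a page moves several slots between runs.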

4. Check Reddit mentions

Search Reddit for each competitor name via Scavio. Count threads from the last 24 hours to gauge buzz.

5. Email digest

Format all changes into a concise email digest grouped by competitor. Highlight new content and significant position changes.

Python Implementation

Python
import requests, os, json
from pathlib import Path

SCAVIO_KEY = os.environ["SCAVIO_API_KEY"]
H = {"x-api-key": SCAVIO_KEY}
COMPETITORS = {"Tavily": ["tavily pricing", "tavily review", "tavily api"],
               "Serper": ["serper pricing", "serper review", "serper api"]}

def search_keyword(keyword: str) -> list:
    """Fetch the top 10 organic results for a branded keyword."""
    resp = requests.post("https://api.scavio.dev/api/v1/search", headers=H,
        json={"platform": "google", "query": keyword}, timeout=10)
    resp.raise_for_status()  # fail fast on auth or quota errors
    return [{"pos": r["position"], "title": r["title"], "url": r["link"]}
            for r in resp.json().get("organic", [])[:10]]

baseline_path = Path("competitor_baseline.json")
baseline = json.loads(baseline_path.read_text()) if baseline_path.exists() else {}
today = {}
changes = []

for comp, keywords in COMPETITORS.items():
    today[comp] = {}
    for kw in keywords:
        results = search_keyword(kw)
        today[comp][kw] = results
        old = baseline.get(comp, {}).get(kw, [])
        old_urls = {r["url"] for r in old}
        new_urls = {r["url"] for r in results}
        new_content = new_urls - old_urls
        if new_content:
            changes.append(f"{comp} [{kw}]: {len(new_content)} new URLs")

baseline_path.write_text(json.dumps(today, indent=2))  # today's snapshot becomes tomorrow's baseline
for c in changes:
    print(c)
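The script above covers steps 1 through 3. Steps 4 and 5 (Reddit mentions and the email itself) can be sketched as below. Note the assumptions: the `"reddit"` platform payload and the `posts`/`created_utc` response fields are guesses at the Scavio schema, and the SMTP host comes from an `SMTP_HOST` environment variable; check both against your actual setup before relying on this.

```python
import os
import smtplib
import time
from email.message import EmailMessage

def reddit_mention_count(name: str) -> int:
    """Count Reddit threads mentioning a competitor in the last 24 hours.
    Assumes the unified endpoint accepts platform "reddit" and returns
    posts with a "created_utc" epoch field; adapt to the real schema."""
    import requests  # imported here so the formatting helper works without it
    resp = requests.post("https://api.scavio.dev/api/v1/search",
        headers={"x-api-key": os.environ.get("SCAVIO_API_KEY", "")},
        json={"platform": "reddit", "query": name}, timeout=10)
    resp.raise_for_status()
    cutoff = time.time() - 24 * 3600
    return sum(1 for p in resp.json().get("posts", [])
               if p.get("created_utc", 0) >= cutoff)

def format_digest(changes_by_comp: dict, mentions: dict) -> str:
    """Group SERP changes by competitor and append the Reddit buzz count."""
    lines = []
    for comp, items in changes_by_comp.items():
        lines.append(f"## {comp} (Reddit mentions, 24h: {mentions.get(comp, 0)})")
        if items:
            lines.extend(f"- {c}" for c in items)
        else:
            lines.append("- no changes detected")
        lines.append("")
    return "\n".join(lines)

def send_digest(body: str, to_addr: str) -> None:
    """Send the digest via a local or relay SMTP server (host is an assumption)."""
    msg = EmailMessage()
    msg["Subject"] = "Competitor SERP digest"
    msg["From"] = "digest@example.com"
    msg["To"] = to_addr
    msg.set_content(body)
    with smtplib.SMTP(os.environ.get("SMTP_HOST", "localhost")) as s:
        s.send_message(msg)
```

Wiring it up means calling `reddit_mention_count` once per competitor after the SERP loop, then `send_digest(format_digest(changes_by_comp, mentions), "you@example.com")`.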

JavaScript Implementation

JavaScript
import { readFileSync, writeFileSync, existsSync } from "fs";

const COMPETITORS = {
  Tavily: ["tavily pricing", "tavily review", "tavily api"],
  Serper: ["serper pricing", "serper review", "serper api"],
};

async function searchKeyword(keyword) {
  const resp = await fetch("https://api.scavio.dev/api/v1/search", {
    method: "POST",
    headers: { "x-api-key": process.env.SCAVIO_API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ platform: "google", query: keyword })
  });
  return ((await resp.json()).organic || []).slice(0, 10).map(r => ({
    pos: r.position, title: r.title, url: r.link
  }));
}

const baselinePath = "competitor_baseline.json";
const baseline = existsSync(baselinePath) ? JSON.parse(readFileSync(baselinePath, "utf8")) : {};
const today = {};
const changes = [];

for (const [comp, keywords] of Object.entries(COMPETITORS)) {
  today[comp] = {};
  for (const kw of keywords) {
    const results = await searchKeyword(kw);
    today[comp][kw] = results;
    const oldUrls = new Set((baseline[comp]?.[kw] || []).map(r => r.url));
    const newUrls = results.filter(r => !oldUrls.has(r.url));
    if (newUrls.length > 0) changes.push(`${comp} [${kw}]: ${newUrls.length} new URLs`);
  }
}
writeFileSync(baselinePath, JSON.stringify(today, null, 2));
changes.forEach(c => console.log(c));

Platforms Used

Google

Web search with knowledge graph, PAA, and AI overviews

Reddit

Community, posts & threaded comments from any subreddit

Frequently Asked Questions

What does this workflow do?

This workflow monitors competitor brand keywords across Google SERP every day. It tracks organic position changes, detects new content from competitors (blog posts, landing pages, press releases), and includes Reddit mention counts. The digest email arrives before your morning standup with actionable insights.

When does this workflow run?

This workflow runs on a cron schedule, daily at 6:30 AM UTC.

Which Scavio platforms does this workflow use?

This workflow uses the Google and Reddit platforms. Each platform is called via the same unified API endpoint.

Can I try this workflow for free?

Yes. Scavio's free tier includes 500 credits per month with no credit card required. That is enough to test and validate this workflow before scaling it.
