
Daily Competitor Digest via Groq Email

Automate daily competitor monitoring with Scavio SERP data and Groq Llama summarization. Receive a concise email digest every morning.

Overview

This workflow runs daily at 7 AM UTC, queries Scavio for each competitor's brand name across Google and Reddit, sends the raw results to Groq's Llama 3.1 8B model for summarization, and emails the digest to your team. For 5 competitors that is 10 Scavio searches per day (one Google and one Reddit search each), for a total daily cost of roughly $0.03 (Scavio) + $0.005 (Groq) = $0.035.

Trigger

Cron schedule, daily at 7 AM UTC.
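
If you drive the script with plain cron, an entry like the one below fires it every day at 7:00. This is a sketch: the interpreter and script paths are placeholders, and it assumes the host clock is set to UTC.

Cron
# min hour day month weekday  command
0 7 * * * /usr/bin/python3 /opt/digest/competitor_digest.py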

Workflow Steps

Step 1: Load competitor list

Read competitor names and tracking keywords from a JSON config file or an environment variable. A minimal loader is sketched below.
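
A sketch of the config loader, assuming a competitors.json file with a fallback to a comma-separated COMPETITORS environment variable (both names are placeholders, not part of the Scavio API):

Python
import json, os

def load_competitors(path="competitors.json"):
    # Prefer a JSON config file: [{"name": "Tavily", "keywords": ["tavily api"]}, ...]
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    # Fall back to a comma-separated env var: COMPETITORS="Tavily,Serper,Exa"
    names = os.environ.get("COMPETITORS", "").split(",")
    return [{"name": n.strip(), "keywords": []} for n in names if n.strip()]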

Step 2: Search Google and Reddit for each competitor

For each competitor, call Scavio's Google search and Reddit search with the competitor name, and collect the organic results and Reddit threads (implemented as search_competitor / searchCompetitor in the code below).

Step 3: Summarize with Groq Llama 8B

Send the combined SERP snippets and Reddit titles to Groq's Llama 3.1 8B (llama-3.1-8b-instant) with a summarization prompt, requesting 3-5 bullet points per competitor.

Step 4: Format email digest

Compile the summaries into an HTML email with one section per competitor, bullet points, and source links. A formatting sketch follows.
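
A minimal HTML formatter, assuming each digest entry carries the competitor name and the Groq summary text. The bullet conversion is naive (it treats every non-empty line as a bullet); swap in a real markdown renderer if your summaries are more structured.

Python
import html

def format_digest_html(entries):
    # entries: list of (competitor_name, summary_text) tuples
    sections = []
    for name, summary in entries:
        bullets = "".join(
            f"<li>{html.escape(line.lstrip('-* '))}</li>"
            for line in summary.splitlines() if line.strip()
        )
        sections.append(f"<h2>{html.escape(name)}</h2><ul>{bullets}</ul>")
    return f"<html><body><h1>Daily Competitor Digest</h1>{''.join(sections)}</body></html>"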

Step 5: Send email

Send the digest via SMTP, SendGrid, or Resend. Include a plain-text fallback for email clients that strip HTML. The SMTP variant is sketched below.
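
An SMTP sketch using Python's standard library: the multipart/alternative container carries both the plain-text fallback and the HTML body, and clients render the richest part they support. The SMTP host, port, credentials, and addresses are all placeholder environment variables you would supply.

Python
import os, smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def send_digest(plain_body, html_body):
    msg = MIMEMultipart("alternative")          # clients pick the richest part they support
    msg["Subject"] = "Daily Competitor Digest"
    msg["From"] = os.environ["DIGEST_FROM"]     # placeholder env vars, not part of any API
    msg["To"] = os.environ["DIGEST_TO"]
    msg.attach(MIMEText(plain_body, "plain"))   # plain-text fallback first
    msg.attach(MIMEText(html_body, "html"))     # preferred HTML part last
    with smtplib.SMTP(os.environ["SMTP_HOST"], int(os.environ.get("SMTP_PORT", "587"))) as s:
        s.starttls()
        s.login(os.environ["SMTP_USER"], os.environ["SMTP_PASS"])
        s.send_message(msg)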

Python Implementation

Python
import os, requests
from email.mime.text import MIMEText

SCAVIO_KEY = os.environ["SCAVIO_API_KEY"]
GROQ_KEY = os.environ["GROQ_API_KEY"]
COMPETITORS = ["Tavily", "Serper", "Exa"]

def search_competitor(name):
    """Fetch Google SERP snippets and Reddit thread titles for one competitor."""
    H = {"x-api-key": SCAVIO_KEY}
    serp = requests.post("https://api.scavio.dev/api/v1/search", headers=H,
        json={"platform": "google", "query": f"{name} news 2026"}, timeout=10).json()
    reddit = requests.post("https://api.scavio.dev/api/v1/search", headers=H,
        json={"platform": "reddit", "query": name}, timeout=10).json()
    snippets = [r.get("snippet", "") for r in serp.get("organic", [])[:5]]
    threads = [r.get("title", "") for r in reddit.get("organic", [])[:5]]
    return "\n".join(snippets + threads)

def summarize(text, name):
    """Ask Groq's llama-3.1-8b-instant for a 3-5 bullet summary."""
    resp = requests.post("https://api.groq.com/openai/v1/chat/completions",
        headers={"Authorization": f"Bearer {GROQ_KEY}"},
        json={"model": "llama-3.1-8b-instant", "messages": [
            {"role": "user", "content": f"Summarize {name} updates in 3-5 bullets:\n{text}"}
        ]}, timeout=30).json()
    return resp["choices"][0]["message"]["content"]

digest = []
for comp in COMPETITORS:
    raw = search_competitor(comp)
    summary = summarize(raw, comp)
    digest.append(f"## {comp}\n{summary}")

body = "\n\n".join(digest)
msg = MIMEText(body)  # plain-text message; see the Step 5 sketch for actual delivery
msg["Subject"] = "Daily Competitor Digest"
print(body)  # replace with an SMTP/SendGrid/Resend send in production

JavaScript Implementation

JavaScript
const COMPETITORS = ["Tavily", "Serper", "Exa"];

// Fetch Google SERP snippets and Reddit thread titles for one competitor.
async function searchCompetitor(name) {
  const headers = { "x-api-key": process.env.SCAVIO_API_KEY, "Content-Type": "application/json" };
  const [serp, reddit] = await Promise.all([
    fetch("https://api.scavio.dev/api/v1/search", { method: "POST", headers,
      body: JSON.stringify({ platform: "google", query: `${name} news 2026` }),
      signal: AbortSignal.timeout(10_000) }).then(r => r.json()),
    fetch("https://api.scavio.dev/api/v1/search", { method: "POST", headers,
      body: JSON.stringify({ platform: "reddit", query: name }),
      signal: AbortSignal.timeout(10_000) }).then(r => r.json())
  ]);
  const snippets = (serp.organic || []).slice(0, 5).map(r => r.snippet || "");
  const threads = (reddit.organic || []).slice(0, 5).map(r => r.title || "");
  return [...snippets, ...threads].join("\n");
}

// Ask Groq's llama-3.1-8b-instant for a 3-5 bullet summary.
async function summarize(text, name) {
  const resp = await fetch("https://api.groq.com/openai/v1/chat/completions", {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.GROQ_API_KEY}`, "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama-3.1-8b-instant", messages: [
      { role: "user", content: `Summarize ${name} updates in 3-5 bullets:\n${text}` }
    ]})
  }).then(r => r.json());
  return resp.choices[0].message.content;
}

for (const comp of COMPETITORS) {
  const raw = await searchCompetitor(comp);
  const summary = await summarize(raw, comp);
  console.log(`## ${comp}\n${summary}\n`);  // swap for an email send in production
}
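
Note that the top-level await in the final loop requires an ES module context (a .mjs file or "type": "module" in package.json), and the built-in fetch and AbortSignal.timeout used above assume Node 18 or newer.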

Platforms Used

Google

Web search with knowledge graph, PAA, and AI overviews

Reddit

Community posts and threaded comments from any subreddit

Frequently Asked Questions

How does this workflow work?

It runs daily at 7 AM UTC, queries Scavio for each competitor's brand name across Google and Reddit, sends the raw results to Groq's Llama 3.1 8B for summarization, and emails the digest to your team. Total daily cost for 5 competitors: ~$0.03 (Scavio) + ~$0.005 (Groq) = ~$0.035.

What triggers it?

A cron schedule, daily at 7 AM UTC.

Which Scavio platforms does it use?

Google and Reddit. Each platform is called via the same unified API endpoint.

Can I try this on Scavio's free tier?

Yes. Scavio's free tier includes 500 credits per month with no credit card required, which is enough to test and validate this workflow before scaling it.
