
Competitor Monitoring: Groq + n8n Stack (2026)

Competitor monitoring with Groq Llama 8B for summarization at $0.05/1M tokens, SERP API for data, n8n for orchestration. Under $5/month for daily reports on 5 competitors.


A recurring question in r/AiAutomations: how do you monitor competitors daily without spending enterprise-tier money? The answer is a three-piece stack: a search API for SERP snapshots, a cheap LLM for summarization, and a scheduler for automation. Total cost: under $5/month for daily reports on 5 competitors.

The stack

- Scavio for daily SERP snapshots: search each competitor's brand name and key product terms. Capture what ranks, what is new, and what changed since yesterday.
- Groq with Llama 8B for summarization: $0.05 per million input tokens, $0.08 per million output tokens. Summarize 5 SERP snapshots into a daily brief for fractions of a cent.
- n8n or cron for orchestration: schedule the pipeline to run at 7 AM, results in your inbox before your first coffee.

Step 1: daily SERP snapshots

For each competitor, search their brand name plus key terms. Capture the top 10 results. Compare against yesterday's snapshot to detect new pages, ranking changes, or new content they published.

Python
import requests, os, json
from datetime import datetime

API = 'https://api.scavio.dev/api/v1/search'
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

COMPETITORS = {
    'competitor_a': ['competitor-a brand', 'competitor-a pricing'],
    'competitor_b': ['competitor-b features', 'competitor-b reviews'],
    'competitor_c': ['competitor-c vs', 'competitor-c alternative'],
}

def daily_snapshot(competitor, queries):
    """Take a SERP snapshot for one competitor."""
    snapshot = {'date': datetime.now().isoformat(), 'competitor': competitor}
    snapshot['results'] = {}
    for query in queries:
        r = requests.post(API, headers=H, json={
            'query': query, 'num_results': 10
        })
        r.raise_for_status()  # fail fast on auth or quota errors
        snapshot['results'][query] = r.json().get('results', [])
    return snapshot

# Run for all competitors
today = {}
for comp, queries in COMPETITORS.items():
    today[comp] = daily_snapshot(comp, queries)
    print(f"Captured {comp}: {sum(len(v) for v in today[comp]['results'].values())} results")

Step 2: diff against yesterday

The value is not in today's snapshot alone. It is in the diff. What URLs are new since yesterday? What pages dropped out of the top 10? Did the competitor publish a new blog post, landing page, or changelog entry? Store each day's snapshot as a JSON file and diff the URL sets.

Python
def diff_snapshots(today_snapshot, yesterday_snapshot):
    """Find what changed between two daily snapshots."""
    changes = []
    for query in today_snapshot.get('results', {}):
        today_urls = {r['url'] for r in today_snapshot['results'].get(query, [])
                      if 'url' in r}
        yesterday_urls = {r['url'] for r in yesterday_snapshot.get('results', {}).get(query, [])
                          if 'url' in r}

        new_urls = today_urls - yesterday_urls
        dropped_urls = yesterday_urls - today_urls

        if new_urls or dropped_urls:
            changes.append({
                'query': query,
                'new': list(new_urls),
                'dropped': list(dropped_urls)
            })
    return changes

# changes = diff_snapshots(today['competitor_a'], yesterday['competitor_a'])
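The storage side can be as simple as one dated JSON file per day. A minimal sketch, assuming a local `snapshots/` directory (the directory name and helper functions here are illustrative, not part of any API):

```python
import json, os
from datetime import datetime, timedelta

SNAP_DIR = 'snapshots'  # hypothetical local directory for daily snapshot files
os.makedirs(SNAP_DIR, exist_ok=True)

def snapshot_path(day):
    """One file per day, named by date: snapshots/2026-01-15.json"""
    return os.path.join(SNAP_DIR, f"{day:%Y-%m-%d}.json")

def save_snapshots(snapshots, day=None):
    """Write all of today's competitor snapshots to a single dated file."""
    day = day or datetime.now()
    with open(snapshot_path(day), 'w') as f:
        json.dump(snapshots, f)

def load_snapshots(day):
    """Load a previous day's snapshots, or None if that run never happened."""
    try:
        with open(snapshot_path(day)) as f:
            return json.load(f)
    except FileNotFoundError:
        return None

# save_snapshots(today)
# yesterday = load_snapshots(datetime.now() - timedelta(days=1))
# if yesterday:
#     changes = diff_snapshots(today['competitor_a'], yesterday['competitor_a'])
```

Guarding on `None` means the first run (when there is no yesterday) simply skips the diff instead of crashing.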

Step 3: summarize with Groq

Feed the diffs to Groq's Llama 8B. At $0.05 per million input tokens, summarizing 5 competitor diffs costs less than $0.001 per day. The summary should answer three questions: what did they publish? What rankings changed? What should you pay attention to?

Python
from groq import Groq

groq = Groq(api_key=os.environ['GROQ_API_KEY'])

def summarize_changes(competitor, changes):
    """Summarize competitor changes into a brief."""
    if not changes:
        return f"{competitor}: No changes detected."

    prompt = f"""Summarize these SERP changes for {competitor} in 2-3 sentences.
Focus on: new content published, ranking movements, notable changes.

Changes: {json.dumps(changes, indent=2)}"""

    response = groq.chat.completions.create(
        model='llama-3.1-8b-instant',
        messages=[{'role': 'user', 'content': prompt}],
        max_tokens=200
    )
    return response.choices[0].message.content

# Summarize all competitors into a daily brief
# daily_brief = [summarize_changes(c, diffs) for c, diffs in all_diffs.items()]

Step 4: deliver via email

Send the daily brief to your inbox. In n8n, this is a simple SMTP node at the end of the workflow. With cron, use a Python SMTP library or a transactional email service. The email should be scannable in 30 seconds: one section per competitor, bullet points for changes, nothing else.
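For the cron route, the standard library is enough. A sketch using `smtplib`, assuming `SMTP_HOST`, `SMTP_FROM`, `SMTP_USER`, and `SMTP_PASS` environment variables (names chosen for illustration):

```python
import os, smtplib
from email.message import EmailMessage
from datetime import datetime

def build_brief(summaries):
    """Join per-competitor summaries into one scannable body."""
    return "\n\n".join(summaries)

def send_brief(summaries, to_addr):
    """Send the daily brief through any SMTP provider."""
    msg = EmailMessage()
    msg['Subject'] = f"Competitor brief - {datetime.now():%Y-%m-%d}"
    msg['From'] = os.environ['SMTP_FROM']
    msg['To'] = to_addr
    msg.set_content(build_brief(summaries))
    with smtplib.SMTP(os.environ['SMTP_HOST'], 587) as server:
        server.starttls()  # most transactional providers require TLS on 587
        server.login(os.environ['SMTP_USER'], os.environ['SMTP_PASS'])
        server.send_message(msg)

# send_brief(daily_brief, 'you@example.com')
```

In n8n the same step is a single SMTP node wired after the summarization node; the code path above is only needed if you schedule with plain cron.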

Cost breakdown

- Scavio: 5 competitors * 2 queries * 30 days = 300 API calls/month. That fits within the 500 free credits; beyond the free tier, at $0.005/credit it would cost $1.50/month.
- Groq: ~150 summaries/month at ~500 tokens each. Under $0.01/month.
- n8n: self-hosted on any VPS, or the free tier on n8n cloud for simple workflows.
- Email: free with any SMTP provider at this volume.
- Total: under $5/month, and often free if you stay within Scavio's free tier.

What this replaces

SEO tools like Semrush ($139.95/month Pro) or Ahrefs ($29/month Starter) include competitor monitoring, but bundle it with features you may not need. If all you want is daily SERP snapshots and change detection, the Scavio + Groq + n8n stack does it for a small fraction of the price, often for free.