
Daily Lead Enrichment via MCP Pipeline

Enrich new leads daily with company data from Google and Reddit using Scavio MCP. Replace multi-vendor enrichment stacks.

Overview

New leads enter your CRM daily with just a name and email. This workflow enriches each new lead automatically by searching Google for company information and Reddit for sentiment and employee discussions. It replaces multi-vendor enrichment stacks (Clearbit, Phantombuster, Google Custom Search) with a single Scavio MCP tool that the agent calls dynamically based on what data is missing.

Trigger

Cron schedule (daily at 10 AM UTC) or CRM webhook on new lead

Schedule

Daily at 10 AM UTC

Workflow Steps

1. Fetch unenriched leads

Query your CRM for leads added in the last 24 hours that lack company data (industry, size, tech stack).
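The filter in this step can be sketched as a pure function over lead records. The field names (`created_at`, `industry`, `size`, `tech_stack`) are assumptions for illustration and should be mapped onto your CRM's actual schema:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical field names -- adapt these to your CRM's schema.
ENRICHMENT_FIELDS = ("industry", "size", "tech_stack")

def needs_enrichment(lead, now=None):
    """True if the lead was created in the last 24 hours and lacks company data."""
    now = now or datetime.now(timezone.utc)
    recent = now - lead["created_at"] <= timedelta(hours=24)
    missing = any(not lead.get(f) for f in ENRICHMENT_FIELDS)
    return recent and missing

def fetch_unenriched(leads):
    """Keep only the leads that still need enrichment."""
    return [lead for lead in leads if needs_enrichment(lead)]
```

In practice `leads` would come from a CRM API query; the predicate above just documents which records qualify.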

2. Search Google for company info

For each lead's company, query Scavio Google search for firmographics, recent news, and tech stack signals.

3. Search Reddit for sentiment

Query Reddit for the company name to find employee reviews, product discussions, and pain points.

4. Parse and structure data

Extract industry, approximate size, technology mentions, and sentiment from search results.
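A minimal sketch of this extraction step, using keyword and regex heuristics over result snippets. The patterns here are illustrative starting points, not an exhaustive parser:

```python
import re

def extract_signals(snippets):
    """Pull rough firmographic signals out of search-result snippets."""
    text = " ".join(snippets).lower()
    # Approximate headcount, e.g. "1,200 employees" -> 1200
    m = re.search(r"(\d[\d,]*)\s+employees", text)
    size = int(m.group(1).replace(",", "")) if m else None
    return {
        "approx_size": size,
        "hiring": "hiring" in text or "careers" in text,
        "funding": any(k in text for k in ("series", "funding", "raised")),
        "tech": [t for t in ("api", "saas", "cloud", "aws", "python") if t in text],
    }
```

For higher-fidelity extraction you could pass the snippets to an LLM instead, but simple heuristics keep the pipeline cheap and deterministic.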

5. Update CRM records

Write enrichment data back to each lead's CRM record with source URLs for verification.
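The write-back might look like the sketch below. The payload shape and field names are placeholders for whatever your CRM's update API actually expects:

```python
def build_crm_update(lead_id, enriched):
    """Shape enrichment output into a hypothetical CRM PATCH payload."""
    return {
        "id": lead_id,
        "fields": {
            "signals": enriched["signals"],
            "priority": enriched["priority"],
            # Keep source URLs so reps can verify each claim.
            "enrichment_sources": [r["url"] for r in enriched["web_results"]],
        },
    }
```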

6. Flag high-priority leads

Score leads by enrichment signals (growing company, active hiring, expressing pain points) and flag top prospects.
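The scoring described here can be a simple weighted sum over the boolean signals. The weights and threshold below are illustrative defaults, not tuned values:

```python
# Illustrative weights -- tune against your own conversion data.
WEIGHTS = {"growing": 3, "hiring": 2, "has_tech_mentions": 1}

def score_lead(signals):
    """Sum the weights of the signals that are present."""
    return sum(w for k, w in WEIGHTS.items() if signals.get(k))

def flag_top(leads, threshold=3):
    """Return leads scoring at or above the threshold, highest first."""
    scored = [(score_lead(lead["signals"]), lead) for lead in leads]
    return [lead for s, lead in sorted(scored, key=lambda p: -p[0]) if s >= threshold]
```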

Python Implementation

import json
import os

import requests

API_URL = "https://api.scavio.dev/api/v1/search"
HEADERS = {"x-api-key": os.environ["SCAVIO_API_KEY"]}

def search(platform, query):
    """Call the unified Scavio search endpoint, failing loudly on HTTP errors."""
    resp = requests.post(API_URL, headers=HEADERS,
                         json={"platform": platform, "query": query}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def enrich_lead(company, domain):
    google = search("google", f"{company} {domain} company info")
    reddit = search("reddit", f"{company} review")
    web_data = [{"title": o.get("title"), "snippet": o.get("snippet"),
                 "url": o.get("link")} for o in google.get("organic", [])[:5]]
    reddit_data = [{"title": o.get("title"), "url": o.get("link")}
                   for o in reddit.get("organic", [])[:5]]
    # Extract rough firmographic signals from the top snippets
    all_text = " ".join(o.get("snippet", "") for o in google.get("organic", [])[:5]).lower()
    signals = {
        "hiring": "hiring" in all_text or "careers" in all_text,
        "growing": any(k in all_text for k in ("series", "funding", "raised")),
        "has_tech_mentions": any(t in all_text for t in ("api", "saas", "cloud", "aws", "python")),
    }
    return {
        "company": company, "domain": domain,
        "web_results": web_data, "reddit_mentions": reddit_data,
        "signals": signals, "priority": "high" if signals["growing"] else "standard",
    }

leads = [{"company": "Acme Corp", "domain": "acme.com"}]
for lead in leads:
    enriched = enrich_lead(lead["company"], lead["domain"])
    print(json.dumps(enriched, indent=2))

JavaScript Implementation

const API_URL = "https://api.scavio.dev/api/v1/search";
const H = {"x-api-key": process.env.SCAVIO_API_KEY, "Content-Type": "application/json"};

// POST to the unified Scavio search endpoint, failing loudly on HTTP errors
async function search(platform, query) {
  const r = await fetch(API_URL, {
    method: "POST", headers: H,
    body: JSON.stringify({platform, query})
  });
  if (!r.ok) throw new Error(`Scavio request failed: ${r.status}`);
  return r.json();
}

async function enrichLead(company, domain) {
  const [google, reddit] = await Promise.all([
    search("google", `${company} ${domain} company info`),
    search("reddit", `${company} review`)
  ]);
  return {
    company, domain,
    webResults: (google.organic || []).slice(0, 5).map(o => ({title: o.title, snippet: o.snippet, url: o.link})),
    redditMentions: (reddit.organic || []).slice(0, 5).map(o => ({title: o.title, url: o.link}))
  };
}

Platforms Used

Google

Web search with knowledge graph, People Also Ask (PAA), and AI Overviews

Reddit

Community posts and threaded comments from any subreddit

Frequently Asked Questions

What does this workflow do?

It enriches each new lead automatically by searching Google for company information and Reddit for sentiment and employee discussions, replacing multi-vendor enrichment stacks (Clearbit, Phantombuster, Google Custom Search) with a single Scavio MCP tool that the agent calls dynamically based on what data is missing.

How is this workflow triggered?

It runs on a cron schedule (daily at 10 AM UTC) or via a CRM webhook when a new lead is created.

Which Scavio platforms does this workflow use?

It uses the Google and Reddit platforms. Each platform is called via the same unified API endpoint.

Can I test this workflow for free?

Yes. Scavio's free tier includes 500 credits per month with no credit card required, which is enough to test and validate this workflow before scaling it.
