Tutorial

How to Build a Make.com SEO Pipeline with Search API

Automate SEO monitoring in Make.com with search API integration. Track rankings, monitor competitors, and detect SERP feature changes on autopilot.

Build an SEO monitoring pipeline in Make.com by connecting the Scavio search API via HTTP modules, scheduling daily SERP checks, extracting ranking data, and routing alerts based on position changes. Make.com provides visual workflow automation that non-developers can use to build SEO monitoring without code. By connecting a search API as an HTTP data source, you get daily ranking snapshots, competitor tracking, and SERP feature detection running on autopilot, without managing scripts or cron jobs.

Prerequisites

  • A Make.com account (Free 1K credits or Core $9/mo)
  • A Scavio API key from scavio.dev
  • Target keywords to monitor
  • Basic Make.com scenario knowledge

Walkthrough

Step 1: Configure the HTTP module

Set up the Make.com HTTP module to call the Scavio API.

Python
import os, requests, json

API_KEY = os.environ['SCAVIO_API_KEY']

# Make.com HTTP Module Configuration:
# Module: HTTP > Make a request
# URL: https://api.scavio.dev/api/v1/search
# Method: POST
# Headers:
#   x-api-key: (your Scavio API key, stored in Make.com connection)
#   Content-Type: application/json
# Body type: Raw
# Content type: JSON
# Request content: {"platform": "google", "query": "{{1.keyword}}"}

# Python equivalent:
def make_search(keyword: str) -> dict:
    resp = requests.post('https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': API_KEY, 'Content-Type': 'application/json'},
        json={'platform': 'google', 'query': keyword}, timeout=15)
    resp.raise_for_status()  # surface HTTP errors instead of parsing an error body
    return resp.json()

data = make_search('best CRM 2026')
print(f"Results: {len(data.get('organic_results', []))}")
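Scheduled scenarios occasionally hit transient failures (timeouts, 5xx responses). In Make.com you handle this with an error-handler route on the HTTP module; in Python, a small retry wrapper does the same job. A minimal sketch, assuming a fixed delay between attempts (the `retry_call` helper and its parameters are illustrative, not part of the Scavio API):

Python
```python
import time

def retry_call(fn, attempts=3, delay=2.0):
    """Call fn(), retrying on any exception with a fixed delay between attempts."""
    last_err = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as err:  # e.g. network errors, timeouts, raise_for_status
            last_err = err
            if attempt < attempts - 1:
                time.sleep(delay)
    raise last_err

# Usage with the search call above:
# data = retry_call(lambda: make_search('best CRM 2026'))
```

A fixed delay keeps the example simple; exponential backoff is the usual refinement for production schedules.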

Step 2: Extract ranking positions

Parse the API response to find your domain's position in the results.

Python
def find_my_position(data: dict, my_domain: str) -> dict:
    results = data.get('organic_results', [])
    for i, r in enumerate(results):
        link = r.get('link', '')
        if my_domain in link:
            return {
                'found': True,
                'position': i + 1,
                'title': r.get('title', ''),
                'url': link,
            }
    return {'found': False, 'position': 0, 'title': '', 'url': ''}

# In Make.com, enable "Parse response" on the HTTP module, then add a
# Router filter that checks the mapped organic_results for your domain.

my_domain = 'scavio.dev'
keywords = ['search api', 'serp api alternative', 'google search api']
for kw in keywords:
    data = make_search(kw)
    pos = find_my_position(data, my_domain)
    status = f'#{pos["position"]}' if pos['found'] else 'not found'
    print(f'  {kw}: {status}')
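The pipeline's goal is routing alerts on position changes, which means comparing today's position against a stored snapshot. A minimal sketch of the delta logic, assuming you persist previous positions as a simple keyword-to-position dict (the helper name and storage format are illustrative):

Python
```python
def position_change(keyword: str, today, previous: dict) -> dict:
    """Compare today's position (int or None) against a {keyword: position} snapshot."""
    old = previous.get(keyword)
    if old is None or today is None:
        delta = None  # newly tracked keyword, or not ranked on one of the days
    else:
        delta = old - today  # positive = moved up the rankings
    return {'keyword': keyword, 'previous': old, 'current': today, 'delta': delta}

yesterday = {'search api': 5, 'google search api': 12}
print(position_change('search api', 3, yesterday))
# {'keyword': 'search api', 'previous': 5, 'current': 3, 'delta': 2}
```

In Make.com, the equivalent comparison reads the previous value from a Data Store and filters on the computed delta.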

Step 3: Monitor SERP features

Track which SERP features appear for your target keywords.

Python
def check_serp_features(data: dict) -> dict:
    features = {
        'featured_snippet': 'featured_snippet' in data or 'answer_box' in data,
        'people_also_ask': len(data.get('people_also_ask', [])) > 0,
        'knowledge_panel': 'knowledge_graph' in data,
        'ai_overview': 'ai_overview' in data,
    }
    active = [k for k, v in features.items() if v]
    return {'features': features, 'active': active, 'count': len(active)}

for kw in keywords:
    data = make_search(kw)
    features = check_serp_features(data)
    active = ', '.join(features['active'])
    print(f'  {kw}: {features["count"]} features ({active})')
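Feature presence matters most when it changes, for example an AI Overview appearing on a keyword you rank for. A minimal sketch of feature diffing, given two `active` lists from `check_serp_features` (the helper name is illustrative):

Python
```python
def feature_changes(today: list, yesterday: list) -> dict:
    """Report SERP features that appeared or disappeared since the last check."""
    gained = sorted(set(today) - set(yesterday))
    lost = sorted(set(yesterday) - set(today))
    return {'gained': gained, 'lost': lost}

print(feature_changes(['people_also_ask', 'ai_overview'], ['people_also_ask']))
# {'gained': ['ai_overview'], 'lost': []}
```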

Step 4: Track competitors

Monitor which competitors appear for your target keywords and their positions.

Python
def track_competitors(data: dict, competitor_domains: list) -> list:
    found = []
    results = data.get('organic_results', [])
    for i, r in enumerate(results):
        link = r.get('link', '')
        for domain in competitor_domains:
            if domain in link:
                found.append({
                    'competitor': domain,
                    'position': i + 1,
                    'title': r.get('title', ''),
                })
    return found

competitors = ['serpapi.com', 'scrapingbee.com', 'brightdata.com']
for kw in keywords:
    data = make_search(kw)
    comps = track_competitors(data, competitors)
    print(f'  {kw}:')
    for c in comps:
        print(f'    #{c["position"]}: {c["competitor"]}')
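One useful signal from this data is which competitors sit above your own result for a keyword. A minimal sketch combining the return shapes of `find_my_position` (Step 2) and `track_competitors` above; the helper name is illustrative:

Python
```python
def competitors_above_me(my_position: dict, comps: list) -> list:
    """Return the competitor entries that rank above your own result."""
    if not my_position['found']:
        return comps  # not ranked at all: every listed competitor is above you
    return [c for c in comps if c['position'] < my_position['position']]

mine = {'found': True, 'position': 4, 'title': '', 'url': ''}
rivals = [{'competitor': 'serpapi.com', 'position': 2, 'title': ''},
          {'competitor': 'brightdata.com', 'position': 7, 'title': ''}]
print(competitors_above_me(mine, rivals))  # only serpapi.com at #2
```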

Step 5: Build the complete pipeline

Combine all checks into a single pipeline that runs daily and stores results.

Python
import datetime
import json

def seo_pipeline(keywords: list, my_domain: str, competitors: list) -> dict:
    report = {
        'date': datetime.date.today().isoformat(),
        'keywords': [],
    }
    for kw in keywords:
        data = make_search(kw)
        position = find_my_position(data, my_domain)
        features = check_serp_features(data)
        comp_positions = track_competitors(data, competitors)
        report['keywords'].append({
            'keyword': kw,
            'my_position': position['position'] if position['found'] else None,
            'serp_features': features['active'],
            'competitors': comp_positions,
        })
    # Save report
    with open(f'seo_report_{report["date"]}.json', 'w') as f:
        json.dump(report, f, indent=2)
    # Summary
    ranked = sum(1 for k in report['keywords'] if k['my_position'])
    print(f'Pipeline complete: {ranked}/{len(keywords)} keywords ranked')
    return report

seo_pipeline(keywords, 'scavio.dev', competitors)
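The final step of the pipeline described in the intro is routing alerts on position changes, which you get by diffing today's report against yesterday's saved JSON. A sketch of the comparison logic, assuming the report structure produced by `seo_pipeline` above; the alert threshold is illustrative:

Python
```python
def diff_reports(today: dict, yesterday: dict, threshold: int = 3) -> list:
    """Flag keywords whose position moved by at least `threshold` spots."""
    prev = {k['keyword']: k['my_position'] for k in yesterday['keywords']}
    alerts = []
    for entry in today['keywords']:
        old, new = prev.get(entry['keyword']), entry['my_position']
        if old is None or new is None:
            continue  # newly tracked, or not ranked on one of the days
        if abs(old - new) >= threshold:
            alerts.append({'keyword': entry['keyword'], 'from': old, 'to': new})
    return alerts

# In Make.com, the equivalent is a Router filter on the position delta,
# with the alert branch feeding a Slack or email module.
```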

Python Example

Python
import requests, os
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

def seo_check(keyword, my_domain):
    data = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'platform': 'google', 'query': keyword}, timeout=15).json()
    for i, r in enumerate(data.get('organic_results', [])):
        if my_domain in r.get('link', ''):
            return {'keyword': keyword, 'position': i+1}
    return {'keyword': keyword, 'position': None}

print(seo_check('search api', 'scavio.dev'))

JavaScript Example

JavaScript
const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
async function seoCheck(keyword, myDomain) {
  const r = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST', headers: H,
    body: JSON.stringify({platform: 'google', query: keyword})
  });
  const results = (await r.json()).organic_results || [];
  const idx = results.findIndex(item => (item.link || '').includes(myDomain));
  return {keyword, position: idx >= 0 ? idx + 1 : null};
}
seoCheck('search api', 'scavio.dev').then(console.log);

Expected Output

A Make.com-ready SEO pipeline that tracks keyword rankings, monitors SERP features, and detects competitor position changes on a daily automated schedule.


Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What do I need before starting?

A Make.com account (Free 1K credits or Core $9/mo), a Scavio API key from scavio.dev, target keywords to monitor, and basic Make.com scenario knowledge. A Scavio API key gives you 250 free credits per month.

Can I complete this on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with other frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
