Add live web data to any AI automation system by connecting the Scavio search API as an HTTP step in your workflow. Most automation platforms (n8n, Make, Zapier, Pipedream) support HTTP request nodes, making this a universal integration pattern. The search API returns structured JSON that downstream nodes can parse, filter, and route without custom code. This tutorial shows the HTTP configuration that works across all major platforms, then demonstrates specific patterns for enrichment, monitoring, and alert workflows.
Prerequisites
- An automation platform account (n8n, Make, Zapier, or Pipedream)
- A Scavio API key from scavio.dev
- Basic understanding of HTTP request/response flows
- A use case requiring live web data in your automation
Walkthrough
Step 1: Configure the universal HTTP request
Set up the HTTP request that works in any automation platform. The pattern is identical: POST to the Scavio endpoint with your API key in the header.
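One note before the configuration: scheduled automations inevitably hit transient failures (timeouts, upstream 5xx responses), so where your platform offers a code node it is worth wrapping the HTTP step in retries. A sketch with exponential backoff; the retry count, backoff base, and the injected `post` callable are illustrative choices for testability, not Scavio API features:

```python
import time

def post_with_retry(post, url, *, headers=None, json=None, retries=3, backoff=0.5):
    """Retry a POST on exceptions or 5xx responses, with exponential backoff.

    `post` is any callable with the requests.post signature, injected so the
    wrapper can be exercised without network access.
    """
    last_error = None
    for attempt in range(retries):
        try:
            resp = post(url, headers=headers, json=json, timeout=10)
            if resp.status_code < 500:  # treat 4xx as final, 5xx as retryable
                return resp
            last_error = RuntimeError(f'HTTP {resp.status_code}')
        except Exception as exc:
            last_error = exc
        time.sleep(backoff * (2 ** attempt))
    raise RuntimeError(f'{url} failed after {retries} attempts') from last_error
```

In production you would pass `requests.post` as the `post` argument and the Scavio endpoint as the URL.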
// Universal HTTP configuration (works in n8n, Make, Zapier, Pipedream):
// Method: POST
// URL: https://api.scavio.dev/api/v1/search
// Headers:
// x-api-key: your_scavio_api_key
// Content-Type: application/json
// Body (JSON):
{
  "platform": "google",
  "query": "your search query here"
}
# Python equivalent:
import requests, os

resp = requests.post(
    'https://api.scavio.dev/api/v1/search',
    headers={'x-api-key': os.environ['SCAVIO_API_KEY']},
    json={'platform': 'google', 'query': 'ai automation trends 2026'},
    timeout=10,
)
print(resp.json().get('organic_results', [])[:2])
Step 2: Parse results for automation nodes
Extract the fields your downstream nodes need from the search response. Most platforms use dot notation or JSONPath.
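If your platform lacks a built-in JSONPath node, the same dot paths can be resolved in a code step. A minimal resolver sketch; the `$.`-prefixed syntax below is a simplified subset of JSONPath that handles only dotted keys and numeric list indexes:

```python
import re

def resolve_path(data, path):
    """Resolve a simplified JSONPath like '$.organic_results[0].title'.

    Supports only dotted keys and [n] list indexes; returns None when
    any segment is missing from the response.
    """
    current = data
    # '$.organic_results[0].title' -> ['organic_results', '0', 'title']
    for part in re.findall(r'\w+', path.lstrip('$.')):
        try:
            key = int(part) if part.isdigit() else part
            current = current[key]
        except (KeyError, IndexError, TypeError):
            return None
    return current

data = {'organic_results': [{'title': 'First hit', 'link': 'https://example.com'}]}
resolve_path(data, '$.organic_results[0].title')  # 'First hit'
resolve_path(data, '$.ai_overview.text')          # None (field absent)
```

Returning None for missing fields mirrors how most automation platforms handle absent paths, so downstream filter nodes can branch on it.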
# Response structure for automation parsing:
# $.organic_results[0].title -> First result title
# $.organic_results[0].link -> First result URL
# $.organic_results[0].snippet -> First result description
# $.ai_overview.text -> AI Overview summary (if present)
# Python helper to format for automation:
import requests, os

def format_for_automation(data: dict) -> list:
    return [{
        'title': r.get('title', ''),
        'url': r.get('link', ''),
        'snippet': r.get('snippet', ''),
    } for r in data.get('organic_results', [])[:5]]

H = {'x-api-key': os.environ['SCAVIO_API_KEY']}
data = requests.post(
    'https://api.scavio.dev/api/v1/search',
    headers=H,
    json={'platform': 'google', 'query': 'ai automation trends 2026'},
    timeout=10,
).json()
print(format_for_automation(data))
Step 3: Build a conditional search trigger
Add logic that only fires the search step when incoming data meets certain criteria, saving API credits.
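Repeated identical queries are another common source of wasted credits, independent of the trigger logic. A small in-memory TTL cache can sit in front of the search call; this sketch takes the search function and TTL as parameters, and both the default TTL and the wrapper design are illustrative:

```python
import time

def cached(search_fn, ttl_seconds=300):
    """Wrap a search function with an in-memory TTL cache keyed by query."""
    store = {}  # query -> (timestamp, result)

    def wrapper(query):
        now = time.time()
        hit = store.get(query)
        if hit and now - hit[0] < ttl_seconds:
            return hit[1]  # fresh cached result, no API credit spent
        result = search_fn(query)
        store[query] = (now, result)
        return result

    return wrapper
```

Wrap whatever function performs the HTTP call, e.g. `search = cached(run_scavio_search)`, where `run_scavio_search` is your own helper (hypothetical name). Note the cache lives only as long as the worker process, which suits long-running platforms like n8n better than per-run Zapier steps.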
import requests, os

def should_search(trigger_data: dict) -> bool:
    """Only search if the incoming message contains a question or unknown term."""
    text = trigger_data.get('message', '').lower()
    question_signals = ['?', 'what is', 'how to', 'latest', 'current', 'price of']
    return any(signal in text for signal in question_signals)

def conditional_search(trigger_data: dict) -> dict:
    if not should_search(trigger_data):
        return {'skipped': True, 'reason': 'no search signal'}
    resp = requests.post(
        'https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': os.environ['SCAVIO_API_KEY']},
        json={'platform': 'google', 'query': trigger_data['message']},
        timeout=10,
    )
    return {'skipped': False, 'results': resp.json().get('organic_results', [])[:3]}

print(conditional_search({'message': 'What is the latest react version?'}))
Step 4: Chain search with downstream actions
Connect the search output to downstream nodes like Slack alerts, email drafts, or database writes.
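For the Slack leg specifically, Slack's incoming webhooks accept a JSON payload with a `text` field. A small formatter keeps the webhook node simple; the message layout is just one choice, and the input shape matches the alert-filtering step below:

```python
def build_slack_payload(alerts):
    """Turn matched search results into a Slack incoming-webhook payload.

    `alerts` is a list of dicts with 'title', 'url', and 'matched' keys.
    Uses Slack's <url|label> link markup in the message text.
    """
    if not alerts:
        return {'text': 'No matching search results.'}
    lines = [f"*{len(alerts)} search alert(s)*"]
    for a in alerts:
        keywords = ', '.join(a.get('matched', []))
        lines.append(f"- <{a.get('url', '')}|{a.get('title', 'untitled')}> (matched: {keywords})")
    return {'text': '\n'.join(lines)}
```

POST the returned dict as JSON to your webhook URL (e.g. `requests.post(webhook_url, json=payload)`); the URL itself comes from your Slack app settings.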
# Example: Search -> Filter -> Slack alert
import requests, os

def search_and_alert(query: str, alert_keywords: list) -> dict:
    resp = requests.post(
        'https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': os.environ['SCAVIO_API_KEY']},
        json={'platform': 'google', 'query': query},
        timeout=10,
    )
    results = resp.json().get('organic_results', [])
    alerts = []
    for r in results:
        text = f"{r.get('title', '')} {r.get('snippet', '')}".lower()
        matched = [kw for kw in alert_keywords if kw.lower() in text]
        if matched:
            alerts.append({'title': r.get('title', ''), 'url': r.get('link', ''), 'matched': matched})
    return {'total_results': len(results), 'alerts': alerts}

print(search_and_alert('scavio api update', ['pricing', 'new feature', 'launch']))
Python Example
import requests, os
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}
def automation_search(query, platform='google'):
    data = requests.post(
        'https://api.scavio.dev/api/v1/search',
        headers=H,
        json={'platform': platform, 'query': query},
        timeout=10,
    ).json()
    return [{'title': r.get('title', ''), 'url': r.get('link', ''), 'snippet': r.get('snippet', '')}
            for r in data.get('organic_results', [])[:3]]

print(automation_search('ai automation trends 2026'))
JavaScript Example
const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};
async function automationSearch(query, platform = 'google') {
  const resp = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST', headers: H, body: JSON.stringify({platform, query})
  });
  const data = await resp.json();
  return (data.organic_results || []).slice(0, 3)
    .map(r => ({title: r.title, url: r.link, snippet: r.snippet}));
}

automationSearch('ai automation trends 2026').then(console.log);
Expected Output
A reusable HTTP integration pattern that adds live search data to any automation platform, with conditional triggering and downstream action chaining.