An r/n8n thread mentioned that the OP was using 'Google Custom Search plus manual scraping' and wanted a single API. This tutorial walks through the replacement.
Prerequisites
- Python 3.10+
- Scavio API key
Walkthrough
Step 1: Identify Google Custom Search calls
These calls typically pass a CSE API key plus a cx engine ID.
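One quick way to find the call sites is to scan the source tree for the Custom Search endpoint URL. A sketch, assuming a Python codebase (the helper name and pattern are illustrative):

```python
import re
from pathlib import Path

# Matches the Google Custom Search endpoint shown in the "Before" snippet below.
CSE_PATTERN = re.compile(r"googleapis\.com/customsearch/v1")

def find_cse_calls(root="."):
    """Return (path, line_number, line) for every line that hits the CSE endpoint."""
    hits = []
    for path in Path(root).rglob("*.py"):
        for n, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if CSE_PATTERN.search(line):
                hits.append((str(path), n, line.strip()))
    return hits
```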
# Before:
# r = requests.get('https://www.googleapis.com/customsearch/v1', params={'key': KEY, 'cx': CX, 'q': q})

Step 2: Replace with Scavio
No CX needed; Scavio searches the open web.
# After:
r = requests.post('https://api.scavio.dev/api/v1/search',
                  headers={'x-api-key': SCAVIO_API_KEY},
                  json={'query': q}).json()

Step 3: Map response
items[] becomes organic_results[].
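During a gradual migration, both response shapes may be in play at once. A hypothetical adapter that normalizes either payload to the same list of hits, based only on the field names in this step:

```python
def results(payload):
    """Normalize a Google CSE or Scavio search response to [{'title', 'link'}, ...].

    Google CSE nests hits under 'items'; Scavio uses 'organic_results'.
    """
    raw = payload.get("items") or payload.get("organic_results") or []
    return [{"title": hit.get("title", ""), "link": hit["link"]} for hit in raw]
```

Downstream code can call `results(r)` without caring which vendor answered, which lets you flip the two endpoints over independently.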
# Google CSE: r['items'][i]['link']
# Scavio: r['organic_results'][i]['link']

Step 4: Add extract endpoint for content
This replaces the 'manual scraping' half of the OP's flow.
def fetch(url):
    return requests.post('https://api.scavio.dev/api/v1/extract',
                         headers={'x-api-key': SCAVIO_API_KEY},
                         json={'url': url, 'format': 'markdown'}).json().get('markdown', '')

Step 5: Compare quotas
Google CSE's free tier caps at 100 queries/day, then charges $5 per 1,000 queries. Scavio's free tier is 500 credits/month, and $30/month covers 7,000.
// A research agent running 50 queries/day beyond Google's free cap: CSE = 50 * 30 * $5/1,000 = $7.50/mo; Scavio = $0 on the 500-credit free tier, or $30/mo flat once past it.

Python Example
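Putting Steps 2-4 together, a minimal end-to-end sketch. The `research` helper and the injectable `session` parameter are illustrative conveniences, not part of any Scavio SDK; only the endpoint paths, header, and field names come from the steps above:

```python
SCAVIO_API_KEY = "YOUR_KEY"  # assumption: your key from the Scavio dashboard
BASE = "https://api.scavio.dev/api/v1"

def _http(session):
    # Defer the requests import so tests can inject a fake session offline.
    if session is not None:
        return session
    import requests
    return requests

def search(query, session=None):
    """POST /search: returns the parsed JSON body (organic_results[], per Step 3)."""
    resp = _http(session).post(f"{BASE}/search",
                               headers={"x-api-key": SCAVIO_API_KEY},
                               json={"query": query})
    resp.raise_for_status()
    return resp.json()

def extract(url, session=None):
    """POST /extract: returns one page's content as markdown (Step 4)."""
    resp = _http(session).post(f"{BASE}/extract",
                               headers={"x-api-key": SCAVIO_API_KEY},
                               json={"url": url, "format": "markdown"})
    resp.raise_for_status()
    return resp.json().get("markdown", "")

def research(query, limit=3, session=None):
    """Search, then pull markdown for the top `limit` organic results."""
    hits = search(query, session)["organic_results"][:limit]
    return [(h["link"], extract(h["link"], session)) for h in hits]
```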
# Migration takes ~20 minutes for a typical agent.

JavaScript Example
// Same request shapes in TypeScript: POST with an x-api-key header and a JSON body.

Expected Output
Same query intent, returned as structured JSON, plus an extract endpoint that replaces the 'manual scraping' step under the same API key. No more two-vendor split.
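For reference, the response shapes assumed throughout, limited to the fields this tutorial actually touches (`organic_results[].link` from /search, `markdown` from /extract); real payloads will carry additional fields not shown here:

```python
# Illustrative /search body: only the field this tutorial reads is shown.
search_response = {
    "organic_results": [
        {"link": "https://example.com/article"},
    ],
}

# Illustrative /extract body with format='markdown'.
extract_response = {
    "markdown": "# Article title\n\nBody text as markdown...",
}
```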