Piping search API results through CLI tools lets you build powerful one-liners and shell scripts without writing Python or JavaScript. Using curl to call the Scavio API and jq to parse the JSON response, you can filter, sort, and transform SERP data in standard Unix pipelines. This tutorial covers the essential patterns: basic search-to-stdout, filtering by position, extracting URLs for wget, and building monitoring scripts. Each search call costs $0.005.
Prerequisites
- curl installed (standard on macOS/Linux)
- jq installed (brew install jq or apt install jq)
- A Scavio API key from scavio.dev
- Basic familiarity with shell pipes
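Most steps below lean on the same few jq patterns: array slicing, object construction, and raw (-r) output. If you want to sanity-check your jq install before spending API credits, the patterns work identically on inline sample data shaped like the API response used throughout (field names taken from the examples below):

```shell
# Inline stand-in for an API response:
sample='{"organic_results":[
  {"position":1,"title":"First","link":"https://a.example"},
  {"position":2,"title":"Second","link":"https://b.example"},
  {"position":3,"title":"Third","link":"https://c.example"}
]}'
# Slice the first two results and project title + link:
echo "$sample" | jq -c '.organic_results[:2][] | {title, link}'
# Raw-mode (-r) output drops the JSON quoting -- handy for pipelines:
echo "$sample" | jq -r '.organic_results[0].link'
# -> https://a.example
```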
Walkthrough
Step 1: Basic search from the command line
Call the API with curl and pipe through jq to get formatted results. Set your API key as an environment variable first.
export SCAVIO_API_KEY="your_key_here"
# Basic search, pretty-printed:
curl -s -X POST https://api.scavio.dev/api/v1/search \
  -H "x-api-key: $SCAVIO_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "best python frameworks 2026", "country_code": "us"}' \
  | jq '.organic_results[:5][] | {title, link}'
Step 2: Extract just URLs for downstream tools
Pull out only the URLs from search results and pipe them to other tools like xargs, wget, or grep.
# Get URLs only, one per line:
curl -s -X POST https://api.scavio.dev/api/v1/search \
  -H "x-api-key: $SCAVIO_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "site:github.com python search api", "country_code": "us"}' \
  | jq -r '.organic_results[].link'
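The intro also promised filtering by position. jq's select handles that before you extract links; here is the pattern on inline sample data (a real call would replace the echo with the curl command above):

```shell
# Keep only results ranked in the top 3, then emit their links:
sample='{"organic_results":[
  {"position":1,"link":"https://a.example"},
  {"position":5,"link":"https://b.example"},
  {"position":2,"link":"https://c.example"}
]}'
echo "$sample" | jq -r '.organic_results[] | select(.position <= 3) | .link'
# -> https://a.example
#    https://c.example
```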
# Pipe to grep to filter specific domains:
# ... | grep 'github.com' | head -5
# Pipe to xargs to open in browser (macOS "open"; use xdg-open on Linux):
# ... | head -3 | xargs open
Step 3: Build a reusable search function for your shell
Add a function to your .bashrc or .zshrc so you can search from anywhere in your terminal.
# Add to ~/.bashrc or ~/.zshrc:
search() {
  curl -s -X POST https://api.scavio.dev/api/v1/search \
    -H "x-api-key: $SCAVIO_API_KEY" \
    -H "Content-Type: application/json" \
    -d "{\"query\": \"$*\", \"country_code\": \"us\"}" \
    | jq -r '.organic_results[:5][] | "\(.position). \(.title)\n   \(.link)\n   \(.snippet // "")\n"'
}
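One caveat with the function above: it splices $* straight into a JSON string, so a query containing a double quote or backslash produces an invalid payload. A sturdier variant (a sketch, same endpoint and fields as above) builds the body with jq -n --arg, which JSON-escapes anything:

```shell
search() {
  local payload
  # jq -n --arg escapes the query, so quotes in $* can't break the JSON:
  payload=$(jq -n --arg q "$*" '{query: $q, country_code: "us"}')
  curl -s -X POST https://api.scavio.dev/api/v1/search \
    -H "x-api-key: $SCAVIO_API_KEY" \
    -H "Content-Type: application/json" \
    -d "$payload" \
    | jq -r '.organic_results[:5][] | "\(.position). \(.title)\n   \(.link)\n   \(.snippet // "")\n"'
}
```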
# Usage:
# search best crm software 2026
# search site:reddit.com serp api comparison
Step 4: Batch search from a file of queries
Read keywords from a file and search each one, collecting results in a TSV file for spreadsheet import.
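The loop below leans on jq's @tsv, which joins an array with tabs and escapes any tab or newline embedded in a field, so a messy title can't break the row layout. A quick offline check (the \t in the sample title is a real tab in the JSON):

```shell
echo '{"organic_results":[{"position":1,"title":"Messy\ttitle","link":"https://a.example"}]}' \
  | jq -r --arg q "demo query" \
    '.organic_results[:3][] | [$q, (.position|tostring), .title, .link] | @tsv'
# The embedded tab is emitted as the two characters \t, keeping
# the row at exactly four tab-separated fields.
```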
# keywords.txt (one per line):
# best crm 2026
# crm pricing comparison
# hubspot alternative
# Batch search to TSV:
while IFS= read -r query; do
  curl -s -X POST https://api.scavio.dev/api/v1/search \
    -H "x-api-key: $SCAVIO_API_KEY" \
    -H "Content-Type: application/json" \
    -d "{\"query\": \"$query\", \"country_code\": \"us\"}" \
    | jq -r --arg q "$query" \
      '.organic_results[:3][] | [$q, (.position|tostring), .title, .link] | @tsv'
  sleep 0.5
done < keywords.txt > results.tsv
echo "Results saved to results.tsv"
Step 5: Monitor a keyword and alert on ranking changes
Create a cron-friendly script that checks where your site ranks for a keyword and logs the position with a timestamp, so you can spot drops in the log or bolt on alerting later.
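The heart of the script is its jq filter: collect every result whose link contains the target, take the first match's position, and fall back to "not found" via the // operator. The filter behaves the same on inline sample data:

```shell
sample='{"organic_results":[
  {"position":3,"link":"https://other.example/page"},
  {"position":4,"link":"https://yoursite.com/crm"}
]}'
# Target present -> its position:
echo "$sample" | jq -r --arg t "yoursite.com" \
  '[.organic_results[] | select(.link | contains($t))] | .[0].position // "not found"'
# -> 4
# No match -> the default on the right of //:
echo "$sample" | jq -r --arg t "missing.example" \
  '[.organic_results[] | select(.link | contains($t))] | .[0].position // "not found"'
# -> not found
```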
#!/bin/bash
# monitor_rank.sh - check ranking and log the current position
KEYWORD="$1"
TARGET="$2"
POSITION=$(curl -s -X POST https://api.scavio.dev/api/v1/search \
  -H "x-api-key: $SCAVIO_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"query\": \"$KEYWORD\", \"country_code\": \"us\"}" \
  | jq -r --arg t "$TARGET" \
    '[.organic_results[] | select(.link | contains($t))] | .[0].position // "not found"')
echo "$(date -I) | $KEYWORD | Position: $POSITION"
# Usage: ./monitor_rank.sh "best crm 2026" "yoursite.com"
# Add to cron: 0 8 * * * /path/to/monitor_rank.sh "best crm 2026" "yoursite.com" >> /var/log/rank.log
Python Example
import os

import requests

API_KEY = os.environ['SCAVIO_API_KEY']

def search(query: str) -> dict:
    resp = requests.post(
        'https://api.scavio.dev/api/v1/search',
        headers={'x-api-key': API_KEY, 'Content-Type': 'application/json'},
        json={'query': query, 'country_code': 'us'},
    )
    resp.raise_for_status()
    return resp.json()

def search_to_tsv(queries: list, output: str = 'results.tsv'):
    with open(output, 'w') as f:
        f.write('query\tposition\ttitle\turl\n')
        for q in queries:
            data = search(q)
            for r in data.get('organic_results', [])[:3]:
                f.write(f'{q}\t{r["position"]}\t{r["title"]}\t{r["link"]}\n')
    print(f'Saved {output}')

search_to_tsv(['best crm 2026', 'crm pricing comparison'])
JavaScript Example
const API_KEY = process.env.SCAVIO_API_KEY;

async function search(query) {
  const resp = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST',
    headers: { 'x-api-key': API_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, country_code: 'us' })
  });
  return resp.json();
}

async function main() {
  const queries = ['best crm 2026', 'crm pricing comparison'];
  for (const q of queries) {
    const data = await search(q);
    (data.organic_results || []).slice(0, 3).forEach(r => {
      console.log(`${q}\t${r.position}\t${r.title}\t${r.link}`);
    });
  }
}

main().catch(console.error);
Expected Output
1. Top Python Web Frameworks in 2026
   https://example.com/python-frameworks
   Django, FastAPI, and Flask remain the top choices...
2. FastAPI vs Django: Which Should You Use?
   https://example.com/fastapi-django
   FastAPI excels at async APIs while Django offers...
2026-05-13 | best crm 2026 | Position: 4
Results saved to results.tsv