Overview
Developers alt-tab to browsers 50+ times daily to look up API docs, error messages, and package versions. This workflow replaces that with a shell pipeline that searches via Scavio, caches results locally, and optionally enriches them with related queries. Repeat lookups for the same topic hit the local cache; each fresh query costs $0.005.
Trigger
On-demand from terminal via shell alias or function.
Schedule
On-demand from terminal
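As a sketch of the on-demand trigger, the script can be wrapped in a shell function. The path ~/bin/docsearch.py and the name docsearch are placeholders; point them at wherever the Python implementation below is saved.

```shell
# Hypothetical wrapper; assumes the Python script is saved at ~/bin/docsearch.py
# Add to ~/.bashrc or ~/.zshrc
docsearch() {
  python3 ~/bin/docsearch.py "$@"
}
```

Then `docsearch requests timeout parameter` runs a cached search without leaving the terminal.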
Workflow Steps
Check Local Cache for Recent Results
Before making an API call, check the local cache file for results from the same query within the last 24 hours.
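This freshness check can be sketched as follows. The timestamp/data entry layout matches the implementations below; the fresh_cached helper name is illustrative.

```python
import json
from datetime import datetime, timedelta
from pathlib import Path

CACHE_TTL = timedelta(hours=24)  # 24-hour freshness window

def fresh_cached(cache_file: Path):
    """Return cached results if the entry is within the TTL, else None."""
    if not cache_file.exists():
        return None
    entry = json.loads(cache_file.read_text())
    if datetime.fromisoformat(entry["timestamp"]) > datetime.now() - CACHE_TTL:
        return entry["data"]
    return None
```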
Execute Search if Cache Miss
If no cached results exist, query Scavio for the search term. Return structured results to stdout.
Cache Results Locally
Write search results to a local JSON cache file with a timestamp. Future queries for the same term skip the API call.
Enrich with Related Queries
Optionally run related searches (e.g., query + 'changelog', query + 'migration guide') and cache those too.
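The enrichment step can be sketched as below, assuming a cached search function like search_with_cache from the Python implementation; the suffix list and the enrich name are illustrative.

```python
ENRICH_SUFFIXES = ["changelog", "migration guide"]

def enrich(query, search_fn):
    """Run related searches through the same cached search function.

    Returns a dict mapping each related query to its results; because
    search_fn caches, repeated enrichment skips the API after the first run.
    """
    return {f"{query} {suffix}": search_fn(f"{query} {suffix}")
            for suffix in ENRICH_SUFFIXES}
```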
Python Implementation
#!/usr/bin/env python3
import hashlib
import json
import os
import sys
from datetime import datetime, timedelta
from pathlib import Path

import requests

API_KEY = os.environ["SCAVIO_API_KEY"]
CACHE_DIR = Path.home() / ".search_cache"
CACHE_DIR.mkdir(exist_ok=True)
CACHE_TTL = timedelta(hours=24)

def cache_key(query: str) -> str:
    return hashlib.md5(query.encode()).hexdigest()

def search_with_cache(query: str) -> dict:
    key = cache_key(query)
    cache_file = CACHE_DIR / f"{key}.json"
    # Serve from cache if a fresh (< 24h) entry exists
    if cache_file.exists():
        cached = json.loads(cache_file.read_text())
        if datetime.fromisoformat(cached["timestamp"]) > datetime.now() - CACHE_TTL:
            return cached["data"]
    resp = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
        json={"query": query, "country_code": "us"},
        timeout=10,
    )
    resp.raise_for_status()  # don't cache error responses
    data = resp.json()
    cache_file.write_text(json.dumps({"timestamp": datetime.now().isoformat(), "data": data}))
    return data

if __name__ == "__main__" and len(sys.argv) > 1:
    query = " ".join(sys.argv[1:])
    results = search_with_cache(query)
    for r in results.get("organic_results", [])[:5]:
        print(f"[{r.get('position')}] {r.get('title', '')}")
        print(f"    {r.get('snippet', '')}")
JavaScript Implementation
const fs = require('fs'), path = require('path'), crypto = require('crypto');
const CACHE_DIR = path.join(require('os').homedir(), '.search_cache');
if (!fs.existsSync(CACHE_DIR)) fs.mkdirSync(CACHE_DIR);
const H = {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'};

async function searchWithCache(query) {
  const key = crypto.createHash('md5').update(query).digest('hex');
  const cacheFile = path.join(CACHE_DIR, key + '.json');
  // Serve from cache if a fresh (< 24h) entry exists
  if (fs.existsSync(cacheFile)) {
    const cached = JSON.parse(fs.readFileSync(cacheFile, 'utf8'));
    if (Date.now() - new Date(cached.timestamp).getTime() < 86400000) return cached.data;
  }
  // Uses the built-in fetch (Node 18+)
  const r = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST',
    headers: H,
    body: JSON.stringify({query, country_code: 'us'}),
  });
  if (!r.ok) throw new Error(`Search failed: ${r.status}`); // don't cache error responses
  const data = await r.json();
  fs.writeFileSync(cacheFile, JSON.stringify({timestamp: new Date().toISOString(), data}));
  return data;
}

// Top-level await is unavailable in CommonJS, so wrap the entry point
(async () => {
  const q = process.argv.slice(2).join(' ');
  const d = await searchWithCache(q);
  (d.organic_results || []).slice(0, 5).forEach(r => console.log(`[${r.position}] ${r.title}`));
})();
Platforms Used
Web search with knowledge graph, PAA, and AI overviews