An r/crewai post described an async SQLite cache returning JSON in 50ms for repeated queries. This tutorial wires the same pattern over Scavio's typed JSON output.
Prerequisites
- Python 3.10+
- Scavio API key
Walkthrough
Step 1: Initialize the cache table
Key = query + surface; value = JSON; TTL via timestamp.
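The "query + surface" key scheme can be sketched as a small helper; the name `cache_key` and the surface labels are illustrative, not part of the original post:

```python
def cache_key(query: str, surface: str) -> str:
    # Compose the key from the surface (e.g. 'search', 'static') and the query,
    # so the same query cached for different surfaces never collides.
    return f'{surface}::{query}'
```

Later steps build `search::{query}` inline; this helper just names that convention.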
import sqlite3, time
conn = sqlite3.connect('cache.db')
conn.execute('CREATE TABLE IF NOT EXISTS cache(key TEXT PRIMARY KEY, payload TEXT, ts REAL)')
Step 2: Cache lookup with TTL
Default TTL: 1 hour for SERP, 6 hours for static pages.
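Those two defaults can be captured in a per-surface lookup; the dict and the fallback policy below are a sketch, not code from the post:

```python
# TTLs in seconds: 3600 = 1 hour for SERP results, 21600 = 6 hours for static pages.
TTL_BY_SURFACE = {'serp': 3600, 'static': 21600}

def ttl_for(surface: str) -> int:
    # Unknown surfaces fall back to the conservative 1-hour SERP default.
    return TTL_BY_SURFACE.get(surface, 3600)
```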
import json

def get(key, ttl=3600):
    row = conn.execute('SELECT payload, ts FROM cache WHERE key=?', (key,)).fetchone()
    if row and time.time() - row[1] < ttl:
        return json.loads(row[0])
    return None
Step 3: Cache write on miss
Upsert pattern.
def set_(key, payload):
    import json
    conn.execute('INSERT OR REPLACE INTO cache VALUES (?, ?, ?)',
                 (key, json.dumps(payload), time.time()))
    conn.commit()
Step 4: Wrap Scavio search call
Cache key includes endpoint + query.
import requests, os

API_KEY = os.environ['SCAVIO_API_KEY']

def search_cached(query, ttl=3600):
    key = f'search::{query}'
    cached = get(key, ttl)
    # Check against None so a legitimately empty cached payload still counts as a hit.
    if cached is not None:
        return cached
    resp = requests.post('https://api.scavio.dev/api/v1/search',
                         headers={'x-api-key': API_KEY},
                         json={'query': query},
                         timeout=30)
    # Raise on HTTP errors instead of caching an error body for the next hour.
    resp.raise_for_status()
    data = resp.json()
    set_(key, data)
    return data
Step 5: Measure hit rate
Track hits to know whether the cache is paying off.
stats = {'hits': 0, 'misses': 0}
# Increment 'hits' in get() when a fresh row is found; increment 'misses' otherwise.
Python Example
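A minimal sketch of that bookkeeping, wrapping the `get`/`set_` pair from Steps 2-3 around an in-memory database so it runs standalone:

```python
import json, sqlite3, time

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE IF NOT EXISTS cache(key TEXT PRIMARY KEY, payload TEXT, ts REAL)')
stats = {'hits': 0, 'misses': 0}

def get(key, ttl=3600):
    row = conn.execute('SELECT payload, ts FROM cache WHERE key=?', (key,)).fetchone()
    if row and time.time() - row[1] < ttl:
        stats['hits'] += 1       # fresh row: count the hit
        return json.loads(row[0])
    stats['misses'] += 1         # absent or expired: count the miss
    return None

def set_(key, payload):
    conn.execute('INSERT OR REPLACE INTO cache VALUES (?, ?, ?)',
                 (key, json.dumps(payload), time.time()))
    conn.commit()

get('q')                  # first lookup misses
set_('q', {'ok': True})
get('q')                  # second lookup hits
hit_rate = stats['hits'] / (stats['hits'] + stats['misses'])
```

After the two lookups above, `stats` holds one hit and one miss, so `hit_rate` is 0.5.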
# Use search_cached('your query') as the drop-in replacement for direct Scavio calls.
result = search_cached('mcp server best practices 2026')
print(len(result.get('organic_results', [])))
JavaScript Example
// JS equivalent uses better-sqlite3 with the same key/value/ttl shape.
Expected Output
A cache hit returns in single-digit milliseconds; a cache miss costs one Scavio query. For agents that repeat queries, a hit rate of 40-70% is common.
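To see the hit-path cost yourself, the lookup can be timed against an in-memory table; the pre-seeded row below stands in for a cached Scavio response:

```python
import json, sqlite3, time

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE cache(key TEXT PRIMARY KEY, payload TEXT, ts REAL)')

def lookup(key):
    # Hit path only: fetch and decode the cached payload, or None on a miss.
    row = conn.execute('SELECT payload FROM cache WHERE key=?', (key,)).fetchone()
    return json.loads(row[0]) if row else None

# Seed one cached result so the timed lookup is a guaranteed hit.
conn.execute('INSERT INTO cache VALUES (?, ?, ?)',
             ('q', json.dumps({'organic_results': []}), time.time()))
conn.commit()

start = time.perf_counter()
lookup('q')
elapsed_ms = (time.perf_counter() - start) * 1000
print(f'cache hit took {elapsed_ms:.2f} ms')  # typically well under 10 ms in memory
```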