How to Build a Search Backend Failover Chain

Learn how to build a failover chain that routes search queries across multiple platforms automatically, ensuring your AI agent never loses search capability.

Production AI agents that depend on a single search backend lose search capability entirely during outages. A failover chain routes queries through multiple search platforms in priority order, falling back automatically when the primary returns errors or empty results. This tutorial builds a failover chain using Scavio's multi-platform API, where a single key covers Google, Reddit, YouTube, Amazon, and Walmart. The same pattern works for vendor-level redundancy by adding a secondary provider as the final fallback.

Prerequisites

  • Python 3.8+ or Node.js 18+ installed
  • requests library (Python) or built-in fetch (JS)
  • A Scavio API key from scavio.dev
  • Basic understanding of try-catch error handling

Walkthrough

Step 1: Define your platform priority

List search platforms in order of preference for your use case. Google covers most queries; Reddit and YouTube provide depth for discussion and video topics.

Python
import os

PLATFORMS = ['google', 'reddit', 'youtube']
API_KEY = os.environ['SCAVIO_API_KEY']

Step 2: Build the failover function

Iterate through platforms, returning results from the first one that succeeds with non-empty results.

Python
import requests, os

def failover_search(query: str, min_results: int = 1) -> dict:
    for platform in PLATFORMS:
        try:
            resp = requests.post('https://api.scavio.dev/api/v1/search',
                headers={'x-api-key': API_KEY},
                json={'platform': platform, 'query': query}, timeout=10)
            resp.raise_for_status()
            data = resp.json()
            results = data.get('organic', [])
            if len(results) >= min_results:
                return {'platform': platform, 'results': results}
        except (requests.RequestException, ValueError):
            continue
    return {'platform': 'none', 'results': []}
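The intro mentioned vendor-level redundancy: a secondary provider as the last link in the chain. One way to keep the chain extensible is to generalize over provider callables instead of hard-coding Scavio platforms. A minimal sketch, assuming this design (the `chain_search` helper and its `(name, callable)` provider signature are our own illustration, not part of the Scavio API):

```python
def chain_search(query, providers, min_results=1):
    """Try each (name, callable) provider in priority order.

    Each callable takes a query string and returns a list of result dicts;
    it may raise on failure. The first provider that yields at least
    min_results results wins.
    """
    for name, fetch in providers:
        try:
            results = fetch(query)
        except Exception:
            continue  # treat any provider error as a miss and fall through
        if len(results) >= min_results:
            return {'provider': name, 'results': results}
    return {'provider': 'none', 'results': []}
```

With this shape, a non-Scavio backup provider is just one more tuple appended to the list, and the core loop never changes.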

Step 3: Add logging for observability

Log which platform served each query so you can track failover frequency and identify degraded backends.

Python
import logging
logger = logging.getLogger('search_failover')

def logged_failover_search(query: str) -> dict:
    for platform in PLATFORMS:
        try:
            resp = requests.post('https://api.scavio.dev/api/v1/search',
                headers={'x-api-key': API_KEY},
                json={'platform': platform, 'query': query}, timeout=10)
            resp.raise_for_status()
            data = resp.json()
            results = data.get('organic', [])
            if results:
                logger.info(f'Query "{query}" served by {platform} ({len(results)} results)')
                return {'platform': platform, 'results': results}
            logger.warning(f'{platform} returned 0 results for "{query}"')
        except Exception as e:
            logger.error(f'{platform} failed for "{query}": {e}')
    logger.error(f'All platforms failed for "{query}"')
    return {'platform': 'none', 'results': []}
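The logs will also show when a transient error triggered a fallback that a quick retry could have avoided. If you add retries, space them with exponential backoff so a struggling backend isn't hammered. A minimal sketch of the delay schedule, assuming no jitter (the helper name `backoff_delays` and its defaults are our own, not Scavio's):

```python
def backoff_delays(attempts=3, base=0.5, cap=8.0):
    """Sleep durations (seconds) before each retry: base, 2*base, 4*base, ...
    capped at `cap` so a long chain never stalls the agent."""
    return [min(cap, base * (2 ** i)) for i in range(attempts)]
```

Inside the platform loop, you would `time.sleep(delay)` through this schedule before giving up on a platform and moving to the next one.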

Step 4: Normalize the output

Ensure the downstream LLM gets the same format regardless of which platform answered.

Python
def normalize_result(result: dict, platform: str) -> dict:
    return {
        'title': result.get('title', ''),
        'snippet': result.get('snippet', result.get('description', '')),
        'url': result.get('link', result.get('url', '')),
        'source_platform': platform
    }

def search_for_agent(query: str) -> list:
    data = logged_failover_search(query)
    return [normalize_result(r, data['platform']) for r in data['results'][:5]]
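To see the field fallbacks at work: a Google-style result carries `snippet`/`link`, while another platform's results may use `description`/`url`. The sample records below are illustrative, not real API output, and `normalize_result` repeats the Step 4 definition so the snippet runs standalone:

```python
def normalize_result(result: dict, platform: str) -> dict:
    # same shape as the function defined in Step 4
    return {
        'title': result.get('title', ''),
        'snippet': result.get('snippet', result.get('description', '')),
        'url': result.get('link', result.get('url', '')),
        'source_platform': platform,
    }

google_style = {'title': 'Best CRMs', 'snippet': 'Top picks', 'link': 'https://example.com/a'}
reddit_style = {'title': 'CRM thread', 'description': 'What the community uses', 'url': 'https://example.com/b'}

print(normalize_result(google_style, 'google')['url'])      # https://example.com/a
print(normalize_result(reddit_style, 'reddit')['snippet'])  # What the community uses
```

Either way, the downstream LLM sees the same four keys, so prompts never need platform-specific parsing.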

Python Example

Python
import requests, os

PLATFORMS = ['google', 'reddit', 'youtube']
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

def failover_search(query):
    for p in PLATFORMS:
        try:
            r = requests.post('https://api.scavio.dev/api/v1/search',
                headers=H, json={'platform': p, 'query': query}, timeout=10)
            r.raise_for_status()
            results = r.json().get('organic', [])
        except (requests.RequestException, ValueError):
            continue
        if results:
            return {'platform': p, 'results': results[:5]}
    return {'platform': 'none', 'results': []}

print(failover_search('best crm for startups'))

JavaScript Example

JavaScript
const PLATFORMS = ['google', 'reddit', 'youtube'];
async function failoverSearch(query) {
  for (const p of PLATFORMS) {
    try {
      const r = await fetch('https://api.scavio.dev/api/v1/search', {
        method: 'POST',
        headers: {'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json'},
        body: JSON.stringify({platform: p, query})
      });
      if (!r.ok) continue;
      const data = await r.json();
      if (data.organic?.length) return {platform: p, results: data.organic.slice(0, 5)};
    } catch { continue; }
  }
  return {platform: 'none', results: []};
}

Expected Output

A failover search function that automatically routes through Google, Reddit, and YouTube in order, returning normalized results from the first platform that succeeds.
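The exact fields vary by platform, but the dict returned by `failover_search` has this general shape (the values below are illustrative, not real API output):

```python
example_output = {
    'platform': 'google',
    'results': [
        {'title': 'Best CRM for startups', 'snippet': '...', 'link': 'https://example.com'},
    ],
}
```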

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What are the prerequisites?

Python 3.8+ or Node.js 18+ installed, the requests library (Python) or built-in fetch (JS), a Scavio API key from scavio.dev, and a basic understanding of try-catch error handling.

Can I complete this on the free tier?

Yes. The free tier includes 500 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with agent frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt to your framework of choice.

Start Building

Grab a Scavio API key from scavio.dev and wire the failover chain into your agent's search tool.