
Upgrading from Serper to a Multi-Platform Search API

Moving beyond Serper's Google-only coverage to a multi-platform API that includes Amazon, YouTube, and Reddit.

7 min read

Serper provides fast, affordable Google search results via API. It does one thing well -- but only one thing. As your application grows beyond Google-only search, you need product data from Amazon, video results from YouTube, pricing from Walmart, or community discussions from Reddit. Rather than adding a separate API for each platform, migrating to a multi-platform search API consolidates your stack.

What Changes in the Request

Serper uses a POST request with a JSON body containing q for the query, plus an API key header. Scavio follows the same pattern with minor field-name changes:

Python
import requests

# Serper
response = requests.post(
    "https://google.serper.dev/search",
    headers={"X-API-KEY": "SERPER_KEY"},
    json={
        "q": "best noise cancelling headphones",
        "gl": "us",
        "hl": "en"
    }
)

# Scavio equivalent
response = requests.post(
    "https://api.scavio.dev/api/v1/search",
    headers={"x-api-key": "YOUR_SCAVIO_KEY"},
    json={
        "platform": "google",
        "query": "best noise cancelling headphones",
        "country": "us",
        "language": "en"
    }
)

The changes are minimal: q becomes query, gl becomes country, hl becomes language, and you add a platform field. The header key is x-api-key (lowercase) instead of Serper's X-API-KEY.
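If you have many call sites, the field renames can be centralized in a small translation helper. This is a sketch of our own making (the name serper_to_scavio_params belongs to neither SDK), built only from the mapping described above:

```python
# Field renames from Serper's request body to Scavio's.
SERPER_TO_SCAVIO = {"q": "query", "gl": "country", "hl": "language"}

def serper_to_scavio_params(serper_body, platform="google"):
    """Rename Serper fields and add the required platform field."""
    body = {SERPER_TO_SCAVIO.get(key, key): value
            for key, value in serper_body.items()}
    body["platform"] = platform
    return body

params = serper_to_scavio_params(
    {"q": "best noise cancelling headphones", "gl": "us", "hl": "en"}
)
```

Unknown keys pass through unchanged, so any extra Serper parameters you rely on survive the translation until you verify them against Scavio's docs.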

Response Format Mapping

Serper and Scavio both return structured JSON with organic results, knowledge graph, and People Also Ask data. The nesting is slightly different:

Python
# Serper response
for result in serper_response["organic"]:
    print(result["title"], result["link"])

knowledge = serper_response.get("knowledgeGraph", {})
paa = serper_response.get("peopleAlsoAsk", [])

# Scavio response
for result in scavio_response["data"]["organic"]:
    print(result["title"], result["link"])

knowledge = scavio_response["data"].get("knowledgeGraph", {})
paa = scavio_response["data"].get("peopleAlsoAsk", [])

The only structural difference is that Scavio nests everything under a data key. Field names within the result objects -- title, link, snippet, position -- are the same.
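Because the wrapper is the only difference, a thin accessor lets the same parsing code handle either response shape while you migrate. The helper name get_organic is ours, not part of either API:

```python
def get_organic(response):
    """Return the organic results list from a Serper- or Scavio-shaped response."""
    # Scavio nests the payload under "data"; Serper returns it at the top level.
    payload = response.get("data", response)
    return payload.get("organic", [])
```

Swap your loops to iterate over get_organic(response) and the rest of the parsing code is untouched.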

Going Beyond Google

The main reason to migrate is access to additional platforms through the same API. With Serper, adding Amazon search means integrating a completely separate API. With Scavio, you change one field:

Python
import requests

API_KEY = "YOUR_SCAVIO_KEY"

def search(platform, query, **kwargs):
    return requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers={"x-api-key": API_KEY},
        json={"platform": platform, "query": query, **kwargs}
    ).json()

# All through the same function
google = search("google", "wireless headphones review")
amazon = search("amazon", "wireless headphones")
youtube = search("youtube", "wireless headphones comparison")
walmart = search("walmart", "wireless headphones")
reddit = search("reddit", "wireless headphones recommendation")

One API key, one endpoint, one response parser. Each platform returns results under the same data structure with platform-specific fields where relevant (e.g., price for Amazon and Walmart products).
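As a sketch of consuming those platform-specific fields, the snippet below collects (title, price) pairs from any results that carry a price. The exact shape of the price field is an assumption to verify against real Amazon or Walmart responses:

```python
def extract_prices(response):
    """Collect (title, price) pairs from results that include a price field."""
    # Assumes Scavio's nesting: results live under data.organic,
    # and product results carry a top-level "price" key.
    results = response.get("data", {}).get("organic", [])
    return [(r["title"], r["price"]) for r in results if "price" in r]
```

Results without a price (e.g., Google or Reddit entries) are simply skipped, so the same function runs safely across all platforms.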

Agent and MCP Integration

If you use Serper as a tool in an LLM agent, migrating to Scavio also gives you MCP support. Instead of maintaining a custom tool wrapper, you can connect Scavio's MCP server directly to Claude, Cursor, VS Code, or any MCP-compatible client:

JSON
{
  "mcpServers": {
    "scavio": {
      "type": "http",
      "url": "https://mcp.scavio.dev/mcp",
      "headers": {
        "x-api-key": "YOUR_SCAVIO_KEY"
      }
    }
  }
}

This gives your AI assistant access to all search tools -- Google, Amazon, YouTube, Walmart, Reddit -- with zero custom code. See the MCP integration guide for full setup instructions.

Migration Checklist

  • Replace the Serper URL with https://api.scavio.dev/api/v1/search
  • Change the header from X-API-KEY to x-api-key
  • Rename q to query, gl to country, hl to language
  • Add platform: "google" to each request body
  • Update response parsing to access data.organic instead of organic
  • Add multi-platform searches where relevant in your application
  • Test with your existing query set to verify result quality parity
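For the last checklist item, one simple parity metric is the overlap between the titles each API returns for the same query. This is a rough sketch of our own; what counts as acceptable overlap is a judgment call for your application:

```python
def title_overlap(serper_response, scavio_response):
    """Fraction of Serper result titles that also appear in the Scavio results."""
    old = {r["title"] for r in serper_response.get("organic", [])}
    new = {r["title"] for r in scavio_response.get("data", {}).get("organic", [])}
    if not old:
        return 1.0  # nothing to compare against
    return len(old & new) / len(old)
```

Run it over your existing query set and flag queries where the overlap drops sharply for manual review.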