Tutorial

How to Build an Autonomous Research Agent with Scavio

Build a multi-step research agent in Python that uses Scavio to search Google, fetch news, and synthesize findings into a structured report automatically.

An autonomous research agent accepts a topic, performs multiple web searches, collects and deduplicates sources, then synthesizes findings into a structured report — all without human intervention. This pattern is valuable for market intelligence, academic literature summaries, and competitive analysis. This tutorial builds such an agent using pure Python and the Scavio API, without relying on a full agent framework. The agent issues sequential search queries, extracts key facts from SERP snippets, and formats them as a markdown report.

Prerequisites

  • Python 3.10 or higher
  • requests library installed
  • A Scavio API key
  • An OpenAI API key for the synthesis step (optional)
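
The Python path needs only the requests package. Install it and export your key before running the examples (the environment variable name SCAVIO_API_KEY matches the code below):

```shell
pip install requests
export SCAVIO_API_KEY="your_scavio_api_key"
```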

Walkthrough

Step 1: Define the research plan

Break the research topic into a list of search queries. A good research agent generates 3-5 queries covering different aspects of the topic.

Python
def make_queries(topic: str) -> list[str]:
    return [
        topic,
        f"{topic} latest developments 2026",
        f"{topic} key players market",
        f"{topic} challenges limitations",
    ]
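
As a quick sanity check, the plan for a sample topic expands into four queries covering different angles (reusing make_queries from above; the topic string is just an example):

```python
def make_queries(topic: str) -> list[str]:
    return [
        topic,
        f"{topic} latest developments 2026",
        f"{topic} key players market",
        f"{topic} challenges limitations",
    ]

queries = make_queries("edge AI chips")
print(len(queries))   # 4
print(queries[1])     # edge AI chips latest developments 2026
```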

Step 2: Collect search results for each query

Call the Scavio API for each query (via the search_google helper defined in the full Python example below) and merge the organic results into a single deduplicated list keyed by URL.

Python
def collect_results(queries: list[str]) -> dict:
    seen = {}
    for query in queries:
        data = search_google(query)
        for r in data.get("organic_results", []):
            seen[r["link"]] = r
    return seen
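
Because the merged dict is keyed by URL, a result that surfaces under multiple queries is stored only once. A minimal sketch with stubbed SERP responses demonstrates this (the search function is passed in as a parameter here, a small variation on the step above, so no API key is needed):

```python
def collect_results(queries: list[str], search_fn) -> dict:
    # Merge organic results across queries; duplicate URLs overwrite earlier entries.
    seen = {}
    for query in queries:
        data = search_fn(query)
        for r in data.get("organic_results", []):
            seen[r["link"]] = r
    return seen

# Stubbed responses: https://a.example appears under both queries.
fake_serps = {
    "q1": {"organic_results": [{"link": "https://a.example", "title": "A"}]},
    "q2": {"organic_results": [
        {"link": "https://a.example", "title": "A"},
        {"link": "https://b.example", "title": "B"},
    ]},
}

merged = collect_results(["q1", "q2"], lambda q: fake_serps[q])
print(sorted(merged))  # ['https://a.example', 'https://b.example']
```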

Step 3: Extract facts from snippets

Build a flat list of (title, snippet, url) tuples from the deduplicated results for use in report generation.

Python
def extract_facts(results: dict) -> list[tuple]:
    facts = []
    for url, r in results.items():
        if r.get("snippet"):
            facts.append((r["title"], r["snippet"], url))
    return facts
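
Results without a snippet are skipped, since they contribute nothing to the report. A small check with hand-built input (the URLs and titles are placeholders):

```python
def extract_facts(results: dict) -> list[tuple]:
    facts = []
    for url, r in results.items():
        if r.get("snippet"):
            facts.append((r["title"], r["snippet"], url))
    return facts

results = {
    "https://a.example": {"title": "A", "snippet": "Fact about A."},
    "https://b.example": {"title": "B"},  # no snippet, so it is skipped
}
facts = extract_facts(results)
print(facts)  # [('A', 'Fact about A.', 'https://a.example')]
```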

Step 4: Write the report

Format the collected facts into a plain markdown report. Optionally pass the facts to an LLM for synthesis.

Python
def write_report(topic: str, facts: list[tuple]) -> str:
    lines = [f"# Research Report: {topic}\n"]
    for title, snippet, url in facts[:10]:
        lines.append(f"## {title}")
        lines.append(snippet)
        lines.append(f"Source: {url}\n")
    return "\n".join(lines)
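
Feeding a single hand-built fact through write_report shows the report shape (topic and fact values are placeholders):

```python
def write_report(topic: str, facts: list[tuple]) -> str:
    lines = [f"# Research Report: {topic}\n"]
    for title, snippet, url in facts[:10]:
        lines.append(f"## {title}")
        lines.append(snippet)
        lines.append(f"Source: {url}\n")
    return "\n".join(lines)

facts = [("Sample Finding", "A one-line summary.", "https://a.example")]
report = write_report("edge AI chips", facts)
print(report.splitlines()[0])  # # Research Report: edge AI chips
```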

Python Example

Python
import os
import requests

API_KEY = os.environ.get("SCAVIO_API_KEY", "your_scavio_api_key")
ENDPOINT = "https://api.scavio.dev/api/v1/search"

def search_google(query: str) -> dict:
    r = requests.post(ENDPOINT, headers={"x-api-key": API_KEY},
                      json={"query": query, "country_code": "us"})
    r.raise_for_status()
    return r.json()

def research(topic: str) -> str:
    queries = [topic, f"{topic} 2026 trends", f"{topic} challenges"]
    seen = {}
    for q in queries:
        for r in search_google(q).get("organic_results", []):
            seen[r["link"]] = r
    lines = [f"# {topic}\n"]
    for r in list(seen.values())[:10]:
        lines.append(f"## {r['title']}")
        lines.append(r.get("snippet", ""))
        lines.append(f"{r['link']}\n")
    return "\n".join(lines)

if __name__ == "__main__":
    report = research("large language model inference optimization")
    print(report)
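
An agent that fires sequential queries can hit rate limits. One way to harden search_google is a retry wrapper with exponential backoff; this is a generic sketch based on common HTTP conventions (retrying 429 and 5xx responses), not Scavio-specific behavior:

```python
import time
import requests

def search_with_retry(search_fn, query: str, retries: int = 3) -> dict:
    # Retry transient failures (HTTP 429 and 5xx) with exponential backoff.
    for attempt in range(retries):
        try:
            return search_fn(query)
        except requests.HTTPError as exc:
            status = exc.response.status_code if exc.response is not None else None
            if status == 429 or (status is not None and status >= 500):
                time.sleep(2 ** attempt)  # 1s, 2s, 4s between attempts
                continue
            raise
    raise RuntimeError(f"search failed after {retries} attempts: {query!r}")
```

Wrap the existing helper as `search_with_retry(search_google, query)`; non-retryable errors (e.g. a 401 from a bad key) are re-raised immediately.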

JavaScript Example

JavaScript
const API_KEY = process.env.SCAVIO_API_KEY || "your_scavio_api_key";
const ENDPOINT = "https://api.scavio.dev/api/v1/search";

async function searchGoogle(query) {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "x-api-key": API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ query, country_code: "us" })
  });
  if (!res.ok) throw new Error(`Scavio request failed: ${res.status}`);
  return res.json();
}

async function research(topic) {
  const queries = [topic, `${topic} 2026 trends`, `${topic} challenges`];
  const seen = new Map();
  for (const q of queries) {
    const data = await searchGoogle(q);
    for (const r of data.organic_results || []) {
      seen.set(r.link, r);
    }
  }
  const results = [...seen.values()].slice(0, 10);
  return results.map(r => `## ${r.title}\n${r.snippet || ""}\n${r.link}`).join("\n\n");
}

research("AI inference optimization").then(console.log).catch(console.error);

Expected Output

Markdown
# Research Report: AI inference optimization

## Faster LLM Inference Techniques in 2026
Quantization, speculative decoding, and model distillation have reduced...
Source: https://example.com/llm-inference

## Key Players in AI Inference Hardware
NVIDIA, AMD, and Groq continue to dominate...
Source: https://example.com/inference-hardware

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What do I need before starting?

Python 3.10 or higher, the requests library, a Scavio API key, and, optionally, an OpenAI API key for the synthesis step. A Scavio API key gives you 500 free credits per month.

Does the free tier cover this tutorial?

Yes. The free tier includes 500 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with agent frameworks?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
