How to Get Google Maps Data Without Scraping

Extract Google Maps business data including names, ratings, addresses, and reviews through a search API instead of scraping. No proxies needed.

Scraping Google Maps directly is fragile and risks IP bans. Google Maps data appears in search results as local pack listings with business name, rating, address, phone, and hours. By searching through the Scavio API at $0.005 per request, you get structured local business data without managing proxies, headless browsers, or CAPTCHA solvers. This tutorial builds a local business data extractor that pulls Maps-quality data from search results.
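
At $0.005 per request, even large extractions stay cheap. As a quick sanity check, a back-of-envelope helper (illustrative; only the per-request rate comes from the pricing above):

```python
def estimate_cost(num_queries: int, rate_per_request: float = 0.005) -> float:
    """Estimated spend for a batch of search requests at the stated rate."""
    return num_queries * rate_per_request

# e.g. 4 categories x 10 cities = 40 searches
print(f'${estimate_cost(40):.2f}')  # → $0.20
```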

Prerequisites

  • Python 3.9+ installed
  • requests library installed
  • A Scavio API key from scavio.dev
  • A list of business categories or locations to search

Walkthrough

Step 1: Search for local businesses via the search API

Query for local businesses and extract the local pack results. These contain the same data you would scrape from Google Maps: name, rating, address, and more.

Python
import os, requests

SCAVIO_KEY = os.environ['SCAVIO_API_KEY']
URL = 'https://api.scavio.dev/api/v1/search'
H = {'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json'}

def search_local_businesses(query: str, location: str = 'us') -> dict:
    """Search for local businesses and extract structured data."""
    resp = requests.post(URL, headers=H,
        json={'query': query, 'country_code': location, 'num_results': 10},
        timeout=30)
    resp.raise_for_status()
    data = resp.json()
    return {
        'local_results': data.get('local_results', []),
        'organic_results': data.get('organic_results', []),
        'knowledge_graph': data.get('knowledge_graph', {}),
    }

result = search_local_businesses('best coffee shops in Austin TX')
local = result['local_results']
print(f'Found {len(local)} local results')
for biz in local[:5]:
    print(f"  {biz.get('title', 'N/A')}")
    print(f"    Rating: {biz.get('rating', 'N/A')} ({biz.get('reviews', 'N/A')} reviews)")
    print(f"    Address: {biz.get('address', 'N/A')}")
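
Network calls can fail transiently, so it is worth wrapping the search in a retry. A minimal sketch of exponential backoff (the `with_retries` helper and its delay constants are illustrative, not part of the Scavio API):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.5):
    """Call fn(), retrying on exception with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of retries: surface the error
            time.sleep(base_delay * (2 ** i))  # 0.5s, 1s, 2s, ...

# Usage with the search call above:
# result = with_retries(lambda: search_local_businesses('best coffee shops in Austin TX'))
```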

Step 2: Extract structured business data

Parse the search results to extract clean business records. Combine local pack data with organic result snippets for richer profiles.

Python
def extract_business_data(query: str, location: str = 'us') -> list:
    """Extract structured business records from search results."""
    data = search_local_businesses(query, location)
    businesses = []
    # Extract from local results (Maps data)
    for biz in data.get('local_results', []):
        businesses.append({
            'name': biz.get('title', ''),
            'rating': biz.get('rating', None),
            'reviews_count': biz.get('reviews', None),
            'address': biz.get('address', ''),
            'phone': biz.get('phone', ''),
            'hours': biz.get('hours', ''),
            'type': biz.get('type', ''),
            'source': 'local_pack',
        })
    # Extract from organic results
    for result in data.get('organic_results', []):
        rich = result.get('rich_snippet', {})
        if rich:
            businesses.append({
                'name': result.get('title', ''),
                'rating': rich.get('rating', None),
                'reviews_count': rich.get('reviews', None),
                'address': '',
                'phone': '',
                'url': result.get('link', ''),
                'source': 'organic_rich',
            })
    return businesses

businesses = extract_business_data('plumbers in Denver CO')
print(f'Extracted {len(businesses)} businesses')
for b in businesses[:5]:
    print(f"  {b['name']} - Rating: {b['rating']} ({b['source']})"
          f"{'  ' + b['address'] if b['address'] else ''}")
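
Depending on where a record came from, `rating` and `reviews` may arrive as strings rather than numbers. A small normalization sketch (the `normalize_number` helper is illustrative, not part of the Scavio response contract):

```python
def normalize_number(value):
    """Coerce values like '4.6' or '1,342' to numbers; None when unparseable."""
    if value is None:
        return None
    try:
        f = float(str(value).replace(',', ''))
        return int(f) if f.is_integer() else f
    except ValueError:
        return None
```

Applying this to `rating` and `reviews_count` before export keeps downstream sorting and averaging from choking on mixed types.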

Step 3: Batch extract across multiple categories

Search multiple business categories in one location to build a comprehensive local business database. Add a short delay between requests to stay within API rate limits.

Python
import time

def batch_extract(categories: list, location: str, city: str) -> list:
    """Extract businesses across multiple categories."""
    all_businesses = []
    for category in categories:
        query = f'{category} in {city}'
        print(f'Searching: {query}')
        businesses = extract_business_data(query, location)
        for b in businesses:
            b['category'] = category
            b['city'] = city
        all_businesses.extend(businesses)
        time.sleep(0.5)  # Rate limiting
    # Deduplicate by name
    seen = set()
    unique = []
    for b in all_businesses:
        key = b['name'].lower().strip()
        if key and key not in seen:
            seen.add(key)
            unique.append(b)
    return unique

categories = ['restaurants', 'dentists', 'auto repair', 'hair salons']
businesses = batch_extract(categories, 'us', 'Portland OR')
print(f'\nTotal unique businesses: {len(businesses)}')
print(f'Cost: {len(categories)} searches = ${len(categories) * 0.005:.3f}')
for cat in categories:
    count = len([b for b in businesses if b.get('category') == cat])
    print(f'  {cat}: {count}')
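
Note that deduplicating on name alone will collapse distinct branches of a chain. A safer option is a composite key over name and address (a sketch; the key fields are a design choice, not something the API mandates):

```python
def dedup_key(biz: dict) -> tuple:
    """Composite key so chains with multiple locations are kept apart."""
    return (biz.get('name', '').lower().strip(),
            biz.get('address', '').lower().strip())

records = [
    {'name': 'Merit Coffee', 'address': '222 W 2nd St'},
    {'name': 'Merit Coffee', 'address': '5121 Airport Blvd'},
    {'name': 'merit coffee', 'address': '222 W 2nd St'},  # casing duplicate
]
seen, unique = set(), []
for r in records:
    k = dedup_key(r)
    if k not in seen:
        seen.add(k)
        unique.append(r)
print(len(unique))  # → 2: both locations kept, the casing duplicate dropped
```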

Step 4: Export to CSV for analysis

Save the extracted business data to CSV for use in spreadsheets, CRM imports, or further analysis.

Python
import csv

def export_businesses(businesses: list, filename: str = 'local_businesses.csv'):
    if not businesses:
        print('No businesses to export')
        return
    fieldnames = ['name', 'category', 'city', 'rating', 'reviews_count',
                  'address', 'phone', 'hours', 'type', 'source']
    with open(filename, 'w', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction='ignore')
        writer.writeheader()
        writer.writerows(businesses)
    # Summary stats
    rated = [b for b in businesses if b.get('rating')]
    avg_rating = sum(float(b['rating']) for b in rated) / len(rated) if rated else 0
    print(f'Exported {len(businesses)} businesses to {filename}')
    print(f'  With ratings: {len(rated)}')
    print(f'  Average rating: {avg_rating:.1f}')
    print(f'  Categories: {len(set(b.get("category","") for b in businesses))}')

export_businesses(businesses, 'portland_businesses.csv')

Python Example

Python
import os, requests, csv, time

SCAVIO_KEY = os.environ['SCAVIO_API_KEY']
H = {'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json'}

def get_local_businesses(query):
    resp = requests.post('https://api.scavio.dev/api/v1/search', headers=H,
        json={'query': query, 'country_code': 'us', 'num_results': 10},
        timeout=30)
    resp.raise_for_status()
    return resp.json().get('local_results', [])

def extract_and_export(categories, city, output='businesses.csv'):
    all_biz = []
    for cat in categories:
        results = get_local_businesses(f'{cat} in {city}')
        for r in results:
            all_biz.append({'name': r.get('title',''), 'category': cat,
                'rating': r.get('rating',''), 'address': r.get('address','')})
        time.sleep(0.3)
    with open(output, 'w', newline='') as f:
        w = csv.DictWriter(f, fieldnames=['name','category','rating','address'])
        w.writeheader()
        w.writerows(all_biz)
    print(f'Exported {len(all_biz)} businesses')

extract_and_export(['restaurants', 'dentists'], 'Austin TX')

JavaScript Example

JavaScript
const SCAVIO_KEY = process.env.SCAVIO_API_KEY;

async function getLocalBusinesses(query) {
  const resp = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST',
    headers: { 'x-api-key': SCAVIO_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, country_code: 'us', num_results: 10 })
  });
  if (!resp.ok) throw new Error(`Search failed: ${resp.status}`);
  return (await resp.json()).local_results || [];
}

async function extractBusinesses(categories, city) {
  const all = [];
  for (const cat of categories) {
    const results = await getLocalBusinesses(`${cat} in ${city}`);
    results.forEach(r => all.push({ name: r.title, category: cat,
      rating: r.rating, address: r.address }));
  }
  console.log(`Found ${all.length} businesses`);
  all.forEach(b => console.log(`  ${b.name} (${b.rating}) - ${b.category}`));
}

extractBusinesses(['restaurants', 'dentists'], 'Austin TX');

Expected Output

Text
Found 8 local results
  Houndstooth Coffee
    Rating: 4.6 (342 reviews)
    Address: 401 Congress Ave, Austin, TX
  Merit Coffee
    Rating: 4.7 (289 reviews)
    Address: 222 W 2nd St, Austin, TX

Total unique businesses: 24
Cost: 4 searches = $0.020
  restaurants: 8
  dentists: 6
  auto repair: 5
  hair salons: 5

Exported 24 businesses to portland_businesses.csv

Frequently Asked Questions

How long does this tutorial take?

Most developers complete this tutorial in 15 to 30 minutes. You will need a Scavio API key (free tier works) and a working Python or JavaScript environment.

What do I need before starting?

Python 3.9+, the requests library, a Scavio API key from scavio.dev, and a list of business categories or locations to search. A Scavio API key gives you 250 free credits per month.

Can I complete this tutorial on the free tier?

Yes. The free tier includes 250 credits per month, which is more than enough to complete this tutorial and prototype a working solution.

Does Scavio integrate with frameworks like LangChain?

Scavio has a native LangChain package (langchain-scavio), an MCP server, and a plain REST API that works with any HTTP client. This tutorial uses the raw REST API, but you can adapt it to your framework of choice.
