local-seo-rank-tracking-uule

UULE Is Unreliable: API-Based Local Rank Tracking

Google's UULE parameter is hit or miss for localized ranking. Use SERP API geo-targeting instead.

7 min

UULE parameters for geo-targeted Google searches are unreliable in 2026 because Google rotates the encoding and does not document the mapping. A SERP API with built-in geo-targeting via a country_code parameter handles localization server-side, removing the UULE guesswork entirely.

Why UULE breaks

UULE is a Base64-encoded string that tells Google to simulate a search from a specific location. The format is undocumented. Community-maintained mapping tables exist, but Google periodically changes how UULE values map to canonical location IDs. A UULE that worked in January might return non-localized results by March. Developers building rank tracking tools discover this the hard way when their location-specific data suddenly looks identical across all geos.
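
A quick way to see what is inside a UULE is to reverse the community-documented "w+" layout: a fixed prefix, one length character, then the Base64-encoded canonical location name. This is a minimal sketch assuming that layout (the same one the generator below produces); Google does not document or guarantee it.

Python
import base64

def decode_uule(uule: str) -> str:
    # Community "w+" layout: 'w+CAIQICI' + one length char + Base64(canonical name)
    payload = uule[len('w+CAIQICI') + 1:]
    padded = payload + '=' * (-len(payload) % 4)  # restore any stripped Base64 padding
    return base64.b64decode(padded).decode('utf-8')

print(decode_uule('w+CAIQICIZTmV3IFlvcmssTlksVW5pdGVkIFN0YXRlcw=='))
# -> New York,NY,United States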

The UULE approach (fragile)

Python
import base64
import requests

# This is the community-reverse-engineered UULE format ("w+" variant):
# a fixed 'w+CAIQICI' prefix, one character that encodes the length of the
# canonical location name, then the Base64-encoded name itself.
# Google can change this at any time.
SECRET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_'

def generate_uule(canonical_name: str) -> str:
    key = SECRET[len(canonical_name) % len(SECRET)]
    encoded = base64.b64encode(canonical_name.encode('utf-8')).decode()
    return f'w+CAIQICI{key}{encoded}'

# Problems with this approach:
# 1. The encoding format is reverse-engineered, not documented
# 2. Google changes canonical location names without notice
# 3. Some locations map to multiple UULE values
# 4. No way to verify the UULE actually targeted the right location
uule = generate_uule('New York,New York,United States')
resp = requests.get(
    'https://www.google.com/search',
    params={'q': 'plumber near me', 'uule': uule},  # let requests URL-encode the '+' and '=' in the UULE
)
# Did this actually return New York results? You have to manually verify.

The API approach (reliable)

A SERP API handles geo-targeting on the server side. You pass a country code or location parameter, and the API ensures the search runs from that geography using its own proxy infrastructure. No UULE encoding, no maintenance when Google changes formats.

Python
import requests, os

API = 'https://api.scavio.dev/api/v1/search'
H = {'x-api-key': os.environ['SCAVIO_API_KEY']}

def rank_check(keyword: str, domain: str, country: str = 'us'):
    """Return the first organic position of `domain` for `keyword` in `country`."""
    resp = requests.post(API, headers=H, json={
        'platform': 'google',
        'query': keyword,
        'country_code': country,
    }, timeout=15)
    resp.raise_for_status()
    results = resp.json().get('organic_results', [])
    for i, r in enumerate(results):
        if domain in r.get('link', ''):
            return {'position': i + 1, 'title': r.get('title'), 'url': r.get('link')}
    return {'position': None, 'title': None, 'url': None}

# Track the same keyword across multiple countries
keywords = ['project management software', 'best crm for startups']
countries = ['us', 'gb', 'de', 'fr', 'au']

for kw in keywords:
    for cc in countries:
        result = rank_check(kw, 'mysite.com', cc)
        pos = result['position'] or 'Not found'
        print(f"[{cc.upper()}] '{kw}': position {pos}")

Building a localized rank tracker

A daily rank tracker for 50 keywords across 5 countries = 250 queries/day. At $0.005/query that is $1.25/day or $37.50/mo. The same setup with UULE means maintaining a location mapping table that might break any week, plus proxy infrastructure to avoid rate limits on direct Google requests.

Python
import csv
from datetime import datetime

def daily_rank_report(keywords: list[str], domain: str, countries: list[str]):
    today = datetime.now().strftime('%Y-%m-%d')
    rows = []
    for kw in keywords:
        for cc in countries:
            result = rank_check(kw, domain, cc)
            rows.append({
                'date': today,
                'keyword': kw,
                'country': cc,
                'position': result['position'],
                'title': result['title'],
            })
    with open(f'ranks_{today}.csv', 'w', newline='') as f:
        w = csv.DictWriter(f, fieldnames=rows[0].keys())
        w.writeheader()
        w.writerows(rows)
    found = sum(1 for r in rows if r['position'])
    print(f"Tracked {len(rows)} keyword-country pairs, {found} ranking")
    return rows
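
Wiring this into a schedule is a one-liner on most systems. A minimal sketch, assuming the code above is saved as rank_report.py (the filename is arbitrary):

Python
# rank_report.py -- run once per day, e.g. via cron: 0 6 * * * python rank_report.py
keywords = ['project management software', 'best crm for startups']
countries = ['us', 'gb', 'de', 'fr', 'au']
daily_rank_report(keywords, 'mysite.com', countries)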

SERP API pricing for rank tracking

SerpAPI charges $75/mo for 5,000 searches ($0.015/search). Bright Data SERP API runs $1.50/1K requests ($0.0015/request) but requires more setup. DataForSEO standard is $0.0006/request but live mode (which gives fresh results) is $0.002. Scavio runs $0.005/query with geo-targeting included. For a 250 queries/day rank tracker, monthly costs: SerpAPI $112.50, DataForSEO live $15, Bright Data $11.25, Scavio $37.50. DataForSEO and Bright Data are cheaper at this volume; Scavio's advantage is the unified multi-platform API if you also need Amazon or shopping data in the same pipeline.
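
To sanity-check those numbers against your own volume, the arithmetic is just queries per month times the per-query rate. A small sketch using the rates quoted above:

Python
# Monthly cost at 250 queries/day for the per-query rates quoted above
queries_per_month = 250 * 30  # 7,500
rates = {
    'SerpAPI': 0.015,
    'DataForSEO (live)': 0.002,
    'Bright Data': 0.0015,
    'Scavio': 0.005,
}
for provider, rate in rates.items():
    print(f'{provider}: ${queries_per_month * rate:,.2f}/month')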

Bottom line

Stop maintaining UULE mappings. Any SERP API with built-in geo-targeting removes the fragility. Pick one based on your volume and whether you need multi-platform coverage beyond Google.