
Amazon ASIN Bulk Check via API in 2026

How FBA sellers can collect ASINs from Amazon categories without scraping: query category keywords through a search API, extract the ASINs, and batch check them. 1,449 ASINs for under $1.

8 min

FBA sellers need to check which ASINs they are eligible to sell before committing to inventory. The manual process: browse Amazon categories, copy ASINs one by one, paste into Seller Central's add product page, and check for restrictions. A post on r/AmazonFBA described checking 1,449 ASINs this way. That is roughly 12 hours of copy-paste work. An API approach reduces this to minutes.

The Manual Approach and Why It Breaks

Scraping Amazon category pages to extract ASIN lists is the first step most sellers try to automate. The problems are well-known: Amazon's category pages are dynamically loaded, pagination is inconsistent, and aggressive bot detection blocks most scraping tools within a few hundred requests. Residential proxies help but cost $10-25/GB, and Amazon's anti-bot measures have gotten significantly more sophisticated in 2026.

Even when scraping works, the extracted data is often incomplete. Amazon shows different products to different users based on location, browsing history, and account status. A scraper running from a datacenter IP sees different results than what a logged-in seller sees in their browser.

The API Approach

A search API that covers Amazon returns structured product data including ASINs, titles, prices, ratings, and review counts. You query category keywords and extract the ASINs from the results. No proxy management, no DOM parsing, no anti-bot evasion.

Python
import requests
import json
import time

API_KEY = "your_scavio_api_key"

def get_asins_for_category(category_keyword, num_results=20):
    """Get ASINs for a product category via search API."""
    response = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers={
            "x-api-key": API_KEY,
            "Content-Type": "application/json",
        },
        json={
            "query": f"{category_keyword} site:amazon.com",
            "num_results": num_results,
        },
    )
    response.raise_for_status()
    data = response.json()

    asins = []
    for result in data.get("results", []):
        url = result.get("url", "")
        # Extract ASIN from Amazon URL patterns
        # /dp/B0XXXXXXXX or /gp/product/B0XXXXXXXX
        asin = extract_asin_from_url(url)
        if asin:
            asins.append({
                "asin": asin,
                "title": result.get("title", ""),
                "url": url,
                "price": result.get("price"),
            })
    return asins

def extract_asin_from_url(url):
    """Extract ASIN from an Amazon product URL."""
    import re
    patterns = [
        r"/dp/([A-Z0-9]{10})",
        r"/gp/product/([A-Z0-9]{10})",
        r"/product/([A-Z0-9]{10})",
    ]
    for pattern in patterns:
        match = re.search(pattern, url)
        if match:
            return match.group(1)
    return None
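The extraction logic can be sanity-checked on its own before wiring up any API calls. A minimal standalone version (the example ASIN `B0ABCD1234` is made up for illustration):

```python
import re

# Common Amazon product URL shapes that embed an ASIN
ASIN_PATTERNS = [
    r"/dp/([A-Z0-9]{10})",
    r"/gp/product/([A-Z0-9]{10})",
]

def first_asin(url):
    """Return the first ASIN found in an Amazon URL, or None."""
    for pattern in ASIN_PATTERNS:
        match = re.search(pattern, url)
        if match:
            return match.group(1)
    return None

print(first_asin("https://www.amazon.com/dp/B0ABCD1234?th=1"))         # B0ABCD1234
print(first_asin("https://www.amazon.com/gp/product/B0ABCD1234/ref=x"))  # B0ABCD1234
print(first_asin("https://www.amazon.com/s?k=toys"))                     # None
```

Search and category URLs produce `None`, which is why the collection loop above checks `if asin:` before appending.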

Bulk Category Scanning

To replicate the 1,449 ASIN check from the Reddit post, you need to scan multiple category keywords. Here is a pipeline that scans a list of categories and builds a deduplicated ASIN database:

Python
def bulk_scan_categories(categories, results_per_category=20):
    """Scan multiple categories and build ASIN database."""
    all_asins = {}

    for category in categories:
        print(f"Scanning: {category}")
        results = get_asins_for_category(category, results_per_category)

        for item in results:
            asin = item["asin"]
            if asin not in all_asins:
                all_asins[asin] = {
                    "asin": asin,
                    "title": item["title"],
                    "url": item["url"],
                    "categories": [category],
                }
            else:
                all_asins[asin]["categories"].append(category)

        time.sleep(0.3)  # Rate limiting

    print(f"Found {len(all_asins)} unique ASINs")
    return list(all_asins.values())

# Example: scanning toy categories
categories = [
    "best selling toys 2026",
    "educational toys for kids",
    "building blocks sets",
    "remote control cars for kids",
    "board games family",
    "outdoor toys backyard",
    "stem toys for children",
    "arts and crafts kits kids",
]

asins = bulk_scan_categories(categories)

# Save for batch eligibility checking
with open("asin_candidates.json", "w") as f:
    json.dump(asins, f, indent=2)

print(f"Saved {len(asins)} ASINs for eligibility check")

Cost Analysis

The Reddit post described checking 1,449 ASINs. To collect that many ASINs via search API:

  • At 20 results per query, you need approximately 75-100 queries (accounting for duplicates across categories).
  • At $0.005/query, that is $0.375-$0.50 total.
  • The search takes under 2 minutes with rate limiting.

Compare this to 12 hours of manual browsing or $30-50 in proxy costs for a scraping approach that may or may not succeed.
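The back-of-envelope math above can be wrapped in a small estimator. The $0.005/query price and the 25% duplicate rate are assumptions you should adjust to your own plan and category overlap:

```python
import math

def estimate_scan_cost(target_asins, results_per_query=20,
                       duplicate_rate=0.25, price_per_query=0.005):
    """Estimate queries and cost to collect a target number of unique ASINs."""
    # Each query yields results_per_query hits, minus expected duplicates
    unique_per_query = results_per_query * (1 - duplicate_rate)
    queries = math.ceil(target_asins / unique_per_query)
    return {"queries": queries, "cost_usd": round(queries * price_per_query, 2)}

print(estimate_scan_cost(1449))
```

For the 1,449-ASIN target this lands at 97 queries, squarely inside the 75-100 range above.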

Enriching ASIN Data

Raw ASINs are a starting point. Before checking eligibility, you want to filter by profitability signals. A second pass of searches can enrich each ASIN with competitive data:

Python
def enrich_asin(asin, api_key):
    """Get competitive data for a specific ASIN."""
    response = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers={
            "x-api-key": api_key,
            "Content-Type": "application/json",
        },
        json={
            "query": f"amazon.com/dp/{asin}",
            "num_results": 5,
        },
    )
    data = response.json()
    results = data.get("results", [])

    if not results:
        return None

    # Multiple results for the same ASIN can hint at multiple seller listings
    primary = results[0]
    return {
        "asin": asin,
        "title": primary.get("title", ""),
        "has_multiple_sellers": len(results) > 1,
        "review_snippet": primary.get("description", ""),
    }

def filter_candidates(asins, min_reviews=50, max_reviews=5000):
    """Filter ASINs by review count, where the search results include it."""
    candidates = []
    for item in asins:
        reviews = item.get("review_count")
        # Too few reviews: likely too new or niche.
        # Too many: likely too competitive.
        # Items with no review data pass through for manual evaluation.
        if reviews is not None and not (min_reviews <= reviews <= max_reviews):
            continue
        candidates.append(item)
    return candidates
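Running enrichment over hundreds of ASINs benefits from the same rate limiting as the category scan. A small driver sketch (the fetch function is injected so you can pass `enrich_asin` in practice, or a stub for testing; the stub below is hypothetical):

```python
import time

def enrich_all(asins, fetch, delay=0.3):
    """Run an enrichment function over ASIN records, skipping failures."""
    enriched = []
    for item in asins:
        try:
            data = fetch(item["asin"])
        except Exception:
            data = None  # network error: skip this ASIN, keep going
        if data is not None:
            enriched.append(data)
        time.sleep(delay)
    return enriched

# Stub fetch for illustration; in practice: enrich_all(asins, lambda a: enrich_asin(a, API_KEY))
demo = enrich_all([{"asin": "B0ABCD1234"}], fetch=lambda a: {"asin": a}, delay=0)
print(demo)
```

Skipping failed lookups instead of aborting matters at this scale: one dead URL out of 1,400 should not cost you the whole run.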

The Eligibility Check Workflow

After collecting and filtering ASINs, the eligibility check itself still requires Seller Central. Amazon does not expose eligibility data through any public API. The workflow is:

  • Step 1: Collect ASINs via search API (automated, 2 minutes)
  • Step 2: Filter by profitability signals (automated, 1 minute)
  • Step 3: Export filtered list to CSV
  • Step 4: Batch check in Seller Central using add product flow

Some sellers use tools like Keepa or Helium 10 for the eligibility check step, which can process ASIN lists in bulk. The search API handles the discovery step where you go from "I want to sell toys" to a list of 1,000+ specific ASINs to evaluate.
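Step 3 is a few lines with the standard library's csv module. The column layout here is an assumption that matches the fields collected earlier; adjust it to whatever your eligibility-check tool expects:

```python
import csv

def export_for_seller_central(asins, path="asin_check.csv"):
    """Write collected ASIN records to a CSV for bulk eligibility checking."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["asin", "title", "url"])
        writer.writeheader()
        for item in asins:
            writer.writerow({k: item.get(k, "") for k in ("asin", "title", "url")})

# Hypothetical record shaped like the output of bulk_scan_categories
sample = [{"asin": "B0ABCD1234", "title": "Example Toy",
           "url": "https://www.amazon.com/dp/B0ABCD1234"}]
export_for_seller_central(sample)
```

Keepa and Helium 10 both accept plain ASIN lists, so a single-column export of just the `asin` field also works.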

Monitoring Category Changes

Product categories shift. New products enter, old ones go out of stock, pricing changes weekly. Setting up a recurring scan catches new opportunities:

Python
def detect_new_asins(categories, known_asins_file):
    """Find newly appeared ASINs in target categories."""
    # Load previously known ASINs; start with an empty set on the first run
    try:
        with open(known_asins_file) as f:
            known = set(json.load(f))
    except FileNotFoundError:
        known = set()

    current = bulk_scan_categories(categories)
    current_asins = {item["asin"] for item in current}

    new_asins = current_asins - known
    removed_asins = known - current_asins

    print(f"New ASINs: {len(new_asins)}")
    print(f"Removed ASINs: {len(removed_asins)}")

    # Save updated known set
    with open(known_asins_file, "w") as f:
        json.dump(list(current_asins), f)

    return [item for item in current if item["asin"] in new_asins]

Running this weekly costs $2-4/month depending on category count and gives you a continuous pipeline of new ASIN candidates to evaluate.

Limitations

Search API results do not cover every product in a category. Amazon has millions of products per category; a search query returns the top 20-50 most relevant results. This is fine for finding competitive, high-visibility products but will miss long-tail items with low search visibility.

For exhaustive category coverage, you would need Amazon's Product Advertising API (PA-API), which requires an Associates account and has its own rate limits (1 request/second). The search API approach is better for rapid discovery and filtering; PA-API is better for comprehensive catalog coverage.