Google Jobs contains valuable data — job title, company, location, posted date, and more. Scraping this data directly means dealing with anti-bot detection, CAPTCHAs, IP rotation, and constantly breaking selectors. The Scavio API handles all of that and returns clean, structured JSON from a single POST request.
This tutorial shows you how to scrape Google Jobs using Ruby and the Scavio API. By the end, you will have a working Ruby script that fetches real-time Google Jobs data and parses the results.
Prerequisites
- Ruby installed on your machine
- A Scavio API key (free tier includes 500 credits/month — no credit card required)
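The full script in Step 4 reads your key from the SCAVIO_API_KEY environment variable, so it is worth exporting it once in your shell before you start. A minimal sketch (the key value is a placeholder):

```shell
# Keep the Scavio API key out of source control by storing it
# in an environment variable. Replace the placeholder with your real key.
export SCAVIO_API_KEY="your_scavio_api_key"

# Confirm it is set before running the script.
echo "$SCAVIO_API_KEY"
```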
Step 1: Install Dependencies
No installation is needed: net/http (for HTTP requests) and json (for parsing) both ship with Ruby's standard library.
Step 2: Make Your First Google Jobs Search
Send a POST request to the Scavio Google Jobs API endpoint with your query. The API returns structured JSON with job title, company, location, and more.
require "net/http"
require "json"
api_key = "your_scavio_api_key"
query = "senior ai engineer remote"
uri = URI("https://api.scavio.dev/api/v1/search")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
request = Net::HTTP::Post.new(uri)
request["x-api-key"] = api_key
request["Content-Type"] = "application/json"
request.body = { query: query, tbs: "" }.to_json
response = http.request(request)
data = JSON.parse(response.body)
puts JSON.pretty_generate(data)
Step 3: Example Response
The API returns structured JSON. Here is an example response for a Google Jobs search:
{
"search_metadata": { "status": "success" },
"jobs_results": [
{
"position": 1,
"title": "Senior AI Engineer",
"company": "Anthropic",
"location": "Remote, US",
"posted_date": "3 days ago",
"apply_link": "https://boards.greenhouse.io/anthropic/jobs/12345"
}
]
}
Every field is structured and typed — no HTML parsing, no CSS selectors, no regex extraction. Your Ruby code can access any field directly.
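Because the response is plain JSON, working with it is ordinary hash access. The sketch below parses the example payload above and prints one line per job; the field names match the sample response, though a real response may include additional fields:

```ruby
require "json"

# Example payload matching the response shape shown above.
raw = <<~JSON
  {
    "search_metadata": { "status": "success" },
    "jobs_results": [
      {
        "position": 1,
        "title": "Senior AI Engineer",
        "company": "Anthropic",
        "location": "Remote, US",
        "posted_date": "3 days ago",
        "apply_link": "https://boards.greenhouse.io/anthropic/jobs/12345"
      }
    ]
  }
JSON

data = JSON.parse(raw)

# Check the search status before touching the results.
abort "Search failed" unless data.dig("search_metadata", "status") == "success"

# Print a one-line summary per job.
data["jobs_results"].each do |job|
  puts "#{job["position"]}. #{job["title"]} at #{job["company"]} (#{job["location"]})"
end
```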
Step 4: Full Working Example
Here is a complete, runnable Ruby script that searches Google Jobs and prints the results:
require "net/http"
require "json"
# Scrape Google Jobs search results using Scavio API.
# Returns structured JSON with job title, company, location, and more.
def search_google_jobs(query)
api_key = ENV.fetch("SCAVIO_API_KEY") # raises KeyError if the variable is unset
uri = URI("https://api.scavio.dev/api/v1/search")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
request = Net::HTTP::Post.new(uri)
request["x-api-key"] = api_key
request["Content-Type"] = "application/json"
request.body = { query: query, tbs: "" }.to_json
response = http.request(request)
raise "API error: #{response.code}" unless response.is_a?(Net::HTTPSuccess)
JSON.parse(response.body)
end
results = search_google_jobs("senior ai engineer remote")
puts JSON.pretty_generate(results)
Why Use Scavio Instead of Scraping Google Jobs Directly?
- No proxy management. Direct scraping requires rotating proxies to avoid IP bans. Scavio handles all of this server-side.
- No CAPTCHA solving. Google Jobs aggressively blocks automated requests. Scavio returns clean data every time.
- Structured JSON output. No HTML parsing or CSS selector maintenance. Get typed, consistent data from every request.
- Multi-platform in one API. Search Google, Amazon, YouTube, and Walmart from the same API key with the same authentication pattern.
- Free tier included. 500 credits/month with no credit card required. Each search costs 1 credit.