Overview
Job seekers waste hours every day rechecking the same job boards. This n8n workflow runs once a day, searches Google via Scavio for job listings matching your criteria, deduplicates against previous results, and sends only the new matches to Slack or email. No code is required, and Scavio's 250 free credits per month cover the daily searches.
Trigger
Daily at 7 AM via n8n cron trigger.
Schedule
Daily
Workflow Steps
Trigger on Schedule
n8n Cron node fires daily at 7 AM. Configurable timezone.
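For reference, a daily 7 AM schedule in standard cron syntax is the expression below; the timezone is whatever the node or n8n instance is configured to use.
0 7 * * *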
Search for Jobs via Scavio HTTP Node
HTTP Request node calls Scavio search API with job-related queries. Returns structured results.
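Stripped of the node UI, the request boils down to the sketch below; the endpoint, x-api-key header, and body fields mirror the implementations further down, and the query value is only an example.
// Raw equivalent of the HTTP Request node's call (query value is an example).
const resp = await fetch('https://api.scavio.dev/api/v1/search', {
  method: 'POST',
  headers: { 'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: 'python developer remote jobs 2026', country_code: 'us' }),
});
const results = (await resp.json()).organic_results || [];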
Filter Job-Related Results
Function node filters results to keep only those with job-related keywords in the title.
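A minimal sketch of that Function node's code, assuming each incoming item carries one Scavio result under item.json:
// Keep only items whose title contains a job-related keyword.
const kws = ['job', 'hiring', 'career', 'position', 'apply', 'opening', 'vacancy'];
return items.filter(item =>
  kws.some(k => (item.json.title || '').toLowerCase().includes(k))
);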
Deduplicate Against Previous Runs
Compare URLs against a stored list of previously seen jobs. Keep only new listings.
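Inside n8n, one way to hold that stored list is workflow static data; the sketch below assumes a Function node, the getWorkflowStaticData helper, and items shaped as above (the standalone scripts below persist to a seen_jobs.json file instead).
// Track previously seen URLs between runs in workflow static data (assumed setup).
const staticData = getWorkflowStaticData('global');
staticData.seenUrls = staticData.seenUrls || [];
const seen = new Set(staticData.seenUrls);
const fresh = items.filter(item => !seen.has(item.json.url));
fresh.forEach(item => seen.add(item.json.url));
staticData.seenUrls = [...seen];
return fresh;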
Send New Jobs to Slack or Email
Format new job listings and send via Slack webhook or email node.
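The standalone scripts below stop at printing the results; a minimal sketch of the Slack half of this step, assuming an incoming-webhook URL in a SLACK_WEBHOOK_URL environment variable, could look like this.
// Post the new listings to a Slack incoming webhook (SLACK_WEBHOOK_URL is an assumed env var).
async function notifySlack(newJobs) {
  if (newJobs.length === 0) return;
  const text = newJobs.map(j => `- ${j.title}\n${j.url}`).join('\n\n');
  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: `${newJobs.length} new job listings:\n\n${text}` }),
  });
}
The email path needs no code at all; n8n's email node can take the formatted text directly.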
Python Implementation
import requests, os, json
from pathlib import Path

API_KEY = os.environ["SCAVIO_API_KEY"]
H = {"x-api-key": API_KEY, "Content-Type": "application/json"}
SEEN_FILE = Path("seen_jobs.json")

JOB_QUERIES = [
    "python developer remote jobs 2026",
    "senior backend engineer remote",
    "machine learning engineer hiring 2026",
]

def search_jobs(query: str) -> list:
    resp = requests.post(
        "https://api.scavio.dev/api/v1/search",
        headers=H,
        json={"query": query, "country_code": "us"},
        timeout=10,
    )
    results = resp.json().get("organic_results", [])
    # Keep only results whose title mentions a job-related keyword.
    job_keywords = ["job", "hiring", "career", "position", "apply", "opening", "vacancy"]
    return [
        {"title": r.get("title", ""), "url": r.get("link", ""), "snippet": r.get("snippet", "")}
        for r in results
        if any(kw in r.get("title", "").lower() for kw in job_keywords)
    ]

def daily_job_search():
    # Load URLs seen on previous runs so only new listings are reported.
    seen = set(json.loads(SEEN_FILE.read_text())) if SEEN_FILE.exists() else set()
    new_jobs = []
    for q in JOB_QUERIES:
        jobs = search_jobs(q)
        for j in jobs:
            if j["url"] not in seen:
                new_jobs.append(j)
                seen.add(j["url"])
    SEEN_FILE.write_text(json.dumps(list(seen)))
    return new_jobs

new_jobs = daily_job_search()
print(f"Found {len(new_jobs)} new job listings")
for j in new_jobs[:5]:
    print(f" {j['title']}: {j['url']}")

JavaScript Implementation
const H = { 'x-api-key': process.env.SCAVIO_API_KEY, 'Content-Type': 'application/json' };
const fs = await import('fs');

const JOB_QUERIES = [
  'python developer remote jobs 2026',
  'senior backend engineer remote',
  'machine learning engineer hiring 2026',
];

async function searchJobs(query) {
  const r = await fetch('https://api.scavio.dev/api/v1/search', {
    method: 'POST',
    headers: H,
    body: JSON.stringify({ query, country_code: 'us' }),
  });
  const results = (await r.json()).organic_results || [];
  // Keep only results whose title mentions a job-related keyword.
  const kws = ['job', 'hiring', 'career', 'position', 'apply', 'opening', 'vacancy'];
  return results
    .filter(res => kws.some(k => (res.title || '').toLowerCase().includes(k)))
    .map(res => ({ title: res.title, url: res.link, snippet: res.snippet }));
}

async function dailyJobSearch() {
  // Load URLs seen on previous runs so only new listings are reported.
  let seen = new Set();
  try { seen = new Set(JSON.parse(fs.readFileSync('seen_jobs.json', 'utf8'))); } catch {}
  const newJobs = [];
  for (const q of JOB_QUERIES) {
    for (const j of await searchJobs(q)) {
      if (!seen.has(j.url)) { newJobs.push(j); seen.add(j.url); }
    }
  }
  fs.writeFileSync('seen_jobs.json', JSON.stringify([...seen]));
  return newJobs;
}

const newJobs = await dailyJobSearch();
console.log(`Found ${newJobs.length} new job listings`);
for (const j of newJobs.slice(0, 5)) console.log(` ${j.title}: ${j.url}`);

Platforms Used
Scavio: web search with knowledge graph, People Also Ask (PAA), and AI Overviews.