
YouTube Impression Decay: Track It Before It's Gone

YouTube videos lose search visibility over time. Track position decay programmatically and know when to update titles.


YouTube videos lose search visibility over time. A tutorial that ranked number two for "python fastapi tutorial" six months ago might be on page three today. YouTube does not tell you when this happens in any useful way. Studio Analytics shows total impressions going down, but it does not tell you which search queries you are losing. You can track this yourself with a search API and catch decay before your traffic disappears.

Why YouTube search positions decay

YouTube re-ranks results based on freshness, engagement velocity, and competition. A video that was the best "react hooks tutorial" in January 2026 gets outranked when newer videos appear with better retention curves. Unlike Google web search, YouTube heavily weights recent upload date and 30-day engagement metrics. This means even excellent content has a natural decay curve.

Be honest: position is not impressions

An important caveat. YouTube search position in API results does not perfectly correspond to impressions in YouTube Studio. YouTube personalizes search results based on watch history, location, and device. The API gives you a baseline position for a logged-out, neutral search. Your actual impressions depend on how YouTube personalizes results for each viewer. Track API position as a directional signal, not an exact metric.

Daily position tracking

Search your target keywords daily through the search API and record where your video appears. Over time, this reveals your decay curve.

Python
import requests, os, json
from datetime import date

H = {'x-api-key': os.environ['SCAVIO_API_KEY']}
URL = 'https://api.scavio.dev/api/v1/search'
TRACKING_FILE = 'youtube_positions.json'

def track_position(keyword: str, video_id: str) -> dict:
    """Track where a specific video ranks for a keyword."""
    resp = requests.post(URL, headers=H,
        json={'platform': 'youtube', 'query': keyword}, timeout=15)
    resp.raise_for_status()
    results = resp.json().get('organic_results', [])

    position = None
    for i, r in enumerate(results):
        if video_id in r.get('link', ''):
            position = i + 1
            break

    return {
        'date': date.today().isoformat(),
        'keyword': keyword,
        'video_id': video_id,
        'position': position,
        'total_results': len(results)
    }

# Track multiple keywords for your video
KEYWORDS = [
    'python fastapi tutorial 2026',
    'fastapi rest api beginner',
    'fastapi vs flask 2026'
]
VIDEO_ID = 'dQw4w9WgXcQ'  # your video ID

# Append today's results to the history file so the decay analysis
# below has data to work with
history = []
if os.path.exists(TRACKING_FILE):
    with open(TRACKING_FILE) as f:
        history = json.load(f)

for kw in KEYWORDS:
    result = track_position(kw, VIDEO_ID)
    history.append(result)
    pos = result['position'] or 'not found'
    print(f"{kw}: position {pos}")

with open(TRACKING_FILE, 'w') as f:
    json.dump(history, f, indent=2)

Building the decay curve

After a week of daily tracking, you have enough data points to plot a decay curve. This shows you which keywords are stable and which are slipping.

Python
import json
from collections import defaultdict

def load_history(filepath: str) -> list[dict]:
    with open(filepath) as f:
        return json.load(f)

def analyze_decay(history: list[dict]) -> dict:
    """Group by keyword, calculate decay rate."""
    by_keyword = defaultdict(list)
    for entry in history:
        if entry['position'] is not None:
            by_keyword[entry['keyword']].append({
                'date': entry['date'],
                'position': entry['position']
            })

    decay_report = {}
    for kw, positions in by_keyword.items():
        positions.sort(key=lambda x: x['date'])
        if len(positions) >= 2:
            first = positions[0]['position']
            last = positions[-1]['position']
            days = len(positions) - 1  # elapsed days, assuming one sample per day
            decay_rate = (last - first) / days  # positive = decaying
            decay_report[kw] = {
                'start_position': first,
                'current_position': last,
                'decay_rate_per_day': round(decay_rate, 2),
                'status': 'decaying' if decay_rate > 0.1 else 'stable'
            }
    return decay_report
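To actually see the curve, you can plot the same history file. A minimal matplotlib sketch, assuming daily results have been appended to youtube_positions.json by the tracking script (the file name and the decay_curve.png output path are this sketch's choices):

```python
import json
from collections import defaultdict

import matplotlib.pyplot as plt

def plot_decay(filepath: str = 'youtube_positions.json',
               out: str = 'decay_curve.png') -> None:
    """Plot search position over time, one line per keyword."""
    with open(filepath) as f:
        history = json.load(f)

    # Group (date, position) pairs per keyword, skipping days the
    # video was not found in the results at all
    by_keyword = defaultdict(list)
    for entry in history:
        if entry['position'] is not None:
            by_keyword[entry['keyword']].append((entry['date'], entry['position']))

    for kw, points in by_keyword.items():
        points.sort()  # ISO dates sort chronologically as strings
        dates, positions = zip(*points)
        plt.plot(dates, positions, marker='o', label=kw)

    plt.gca().invert_yaxis()  # position 1 belongs at the top
    plt.ylabel('search position')
    plt.xticks(rotation=45)
    plt.legend()
    plt.tight_layout()
    plt.savefig(out)
```

Inverting the y-axis keeps the intuitive reading: a line sloping downward is a keyword losing ground.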

When to take action

A decay rate above 0.1 positions per day means your video is steadily slipping: at that pace, a position-3 video loses six spots in two months, sliding out of the top results that capture most clicks. Actions to counter decay:

  • Update the video title with current year and trending terms
  • Update the description with fresh keywords
  • Pin a new comment with updated information
  • Create a follow-up video linking back to the original
  • Add chapters if the video lacks them

Automated alerts

Python
import requests, os

H = {'x-api-key': os.environ['SCAVIO_API_KEY']}
URL = 'https://api.scavio.dev/api/v1/search'

def check_and_alert(keyword: str, video_id: str, threshold: int = 5):
    """Alert if video drops below threshold position."""
    resp = requests.post(URL, headers=H,
        json={'platform': 'youtube', 'query': keyword}, timeout=15)
    resp.raise_for_status()
    results = resp.json().get('organic_results', [])

    for i, r in enumerate(results):
        if video_id in r.get('link', ''):
            if i + 1 > threshold:
                print(f"ALERT: '{keyword}' dropped to position {i + 1}")
                # Send Slack/email notification here
            return i + 1

    print(f"ALERT: '{keyword}' -- video not found in results")
    return None
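The notification placeholder in check_and_alert can be wired to a Slack incoming webhook. A minimal sketch, assuming the webhook URL lives in a SLACK_WEBHOOK_URL environment variable (the variable name is this sketch's choice):

```python
import os
import requests

def notify_slack(message: str) -> bool:
    """Post an alert to a Slack incoming webhook.

    Silently skips (returns False) when no webhook is configured,
    so the tracker still runs in environments without Slack.
    """
    webhook = os.environ.get('SLACK_WEBHOOK_URL')
    if not webhook:
        return False
    resp = requests.post(webhook, json={'text': message}, timeout=10)
    return resp.ok

# Inside check_and_alert, replace the print() calls with e.g.:
# notify_slack(f"YouTube alert: '{keyword}' dropped to position {i + 1}")
```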

Cost of daily tracking

Tracking 20 keywords daily is 600 API calls per month, just over the 500 free credit limit on Scavio. At $0.005 per credit, the overage is $0.50/month. For 50 keywords, it is 1,500 calls, a $5.00/month overage. For serious YouTube creators tracking 100+ keywords, the $30/mo plan at 7,000 credits covers it comfortably.
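The arithmetic generalizes to any keyword count. A quick sketch using the plan figures quoted above (500 free credits, $0.005 per extra credit):

```python
FREE_CREDITS = 500
COST_PER_CREDIT = 0.005  # dollars, beyond the free allowance

def monthly_overage(keywords: int, checks_per_day: int = 1,
                    days: int = 30) -> float:
    """Dollar cost of tracking calls beyond the monthly free credits."""
    calls = keywords * checks_per_day * days
    return round(max(0, calls - FREE_CREDITS) * COST_PER_CREDIT, 2)

print(monthly_overage(20))  # 0.5
print(monthly_overage(50))  # 5.0
print(monthly_overage(10))  # 0.0 -- fits entirely in free credits
```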

Putting it together

  1. Identify your top 10-20 target keywords per video
  2. Set up daily position tracking via cron
  3. Monitor the decay curve weekly
  4. Set alerts for positions dropping below your threshold
  5. Take action on decaying keywords before they hit page two