Building a Transparent AI Pipeline: 59 Weeks of Automated Political Scoring with Claude API

Source: DEV Community
I've been running an automated AI pipeline for over a year that ingests news articles, clusters them into political events, and scores each event on two independent axes. Here's how it works, what I learned, and why I made everything transparent.

The Problem

Political events have two dimensions that are rarely measured together:

- How much institutional damage does this cause? (democratic health)
- How much media attention does it get? (distraction economics)

When these are wildly mismatched (high damage, low attention), something important is being missed. I built The Distraction Index to detect these gaps automatically.

Architecture Overview

```
News Sources (GDELT + GNews + Google News RSS)
        ↓ every 4 hours
Ingestion Pipeline (/api/ingest)
        ↓ dedup + store
Clustering (Claude Haiku) → group articles into events
        ↓
Dual-Axis Scoring (Claude Sonnet) → Score A + Score B
        ↓
Weekly Freeze → immutable snapshot
```

Tech stack: Next.js 16 (App Router), Supabase (PostgreSQL), Claude API, Vercel

Why Two Mode
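The "dedup + store" step in the diagram above isn't spelled out in the article; one common approach, sketched here as an assumption rather than the pipeline's actual logic, is to key each article on a hash of its normalized URL (query string and fragment stripped) before inserting:

```typescript
import { createHash } from "node:crypto";

// Hypothetical dedup key: hash of the URL with tracking params and
// fragments removed, so the same story fetched from GDELT, GNews,
// and Google News RSS collapses to one row.
function dedupKey(url: string): string {
  const u = new URL(url);
  u.search = ""; // drop ?utm_source=... and friends
  u.hash = "";
  return createHash("sha256").update(u.toString().toLowerCase()).digest("hex");
}

// Keep the first occurrence of each deduped URL.
function dedupe(urls: string[]): string[] {
  const seen = new Set<string>();
  return urls.filter((url) => {
    const key = dedupKey(url);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```

In a real ingestion route the key would live as a unique column in Supabase, so the database enforces deduplication across runs rather than a per-batch `Set`.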
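The core of the index is the mismatch check between the two axes: flag events where Score A (institutional damage) far exceeds Score B (media attention). The article doesn't publish its formula, so the field names, 0-10 scales, and threshold below are illustrative assumptions:

```typescript
// Assumed shape of a scored event; not the pipeline's actual schema.
interface ScoredEvent {
  id: string;
  damageScore: number;    // Score A: institutional damage, 0-10
  attentionScore: number; // Score B: media attention, 0-10
}

// Flag events where damage outpaces attention by at least `minGap`,
// sorted with the widest gap first. The threshold of 4 is a placeholder;
// the real index may weight or normalize the axes differently.
function detectGaps(events: ScoredEvent[], minGap = 4): ScoredEvent[] {
  return events
    .filter((e) => e.damageScore - e.attentionScore >= minGap)
    .sort(
      (a, b) =>
        b.damageScore - b.attentionScore - (a.damageScore - a.attentionScore)
    );
}
```

Running this over each weekly frozen snapshot, rather than live data, is what keeps the flagged gaps reproducible.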