GDELT CLI: Global News Intelligence From Your Terminal
There’s a dataset out there that monitors nearly every news broadcast, print article, and online source across the planet — covering every country, in over 100 languages, updated every 15 minutes. It’s called GDELT (Global Database of Events, Language, and Tone), and it’s been quietly running since 2013, cataloging the world’s events in a structured, queryable format.
The problem? Actually using GDELT has always been a pain. The API is there, but it’s got rate limits, awkward response formats, and no good way to do exploratory analysis without building your own pipeline. So I built gdelt-cli — a Rust CLI that puts all of this at your fingertips.
What You Can Actually Do With It
The basics first. You search global news:
gdelt doc search "semiconductor export controls" --timespan 7d --country US
That gives you articles, with filtering by country, language, timespan, and tone (yes, GDELT scores sentiment). But it goes further:
# Timeline of coverage volume for a topic
gdelt doc timeline "Ukraine grain deal" --resolution day
# Geographic heatmap of where events are happening
gdelt geo search "earthquake" --format heatmap
# Search TV news broadcasts
gdelt tv search "central bank interest rate"
Every command emits JSON by default when piped, and human-readable tables when you’re just poking around interactively. That’s a small thing that makes a big difference in practice.
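That JSON composes nicely with jq. A quick sketch: the `.articles[]` array and its `.title`/`.url` field names are my assumptions about the response shape, not a documented schema.

```shell
# Pipe search results into jq to pull out titles and URLs as TSV.
# NOTE: the `.articles[]`, `.title`, and `.url` field names are assumptions
# about the output shape; adjust to what the CLI actually emits.
gdelt doc search "semiconductor export controls" --timespan 7d \
  | jq -r '.articles[] | [.title, .url] | @tsv'
```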
The Local Analytics Angle
This is where it gets interesting. GDELT publishes bulk data files — events coded with the CAMEO taxonomy (20 categories spanning from “diplomatic cooperation” to “mass violence”, each with a Goldstein score from -10 to +10), plus a Global Knowledge Graph of entities, themes, and relationships.
gdelt-cli downloads these and loads them into a local DuckDB database. Once you’ve synced, you can query without touching the API at all:
# Sync the latest data
gdelt data sync
# Query the local event database
gdelt events query --country India --event-type 14 --after 2026-01-01
# Entity extraction and trend analysis
gdelt analytics trends --topic "artificial intelligence" --days 30
gdelt analytics sentiment --country Brazil --days 7
No rate limits. No network latency on repeat queries. Just DuckDB doing what DuckDB does best — chewing through analytical queries on columnar data stupidly fast.
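And because results can stream as JSONL, standard shell tools compose on top of local queries. A sketch, with two assumptions on my part: the `--format jsonl` flag spelling and a per-event `date` field.

```shell
# Count local events per day from a synced database query.
# NOTE: `--format jsonl` and the `.date` field name are assumptions about
# the CLI's output schema; adjust to what it actually emits.
gdelt events query --country India --after 2026-01-01 --format jsonl \
  | jq -r '.date' | sort | uniq -c | sort -rn | head
```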
Built for Agents
Here’s the thing I’m most excited about. The whole CLI was designed agent-first. Not “we added a JSON flag” agent-first — actually thought-through for machine consumption:
- Structured exit codes: 0 for success, 2 for validation errors, 3 for network issues, 5 for rate limiting. Your automation can branch on these without parsing error messages.
- --help-json: Dumps the complete command schema as JSON. An agent can introspect every available command programmatically.
- gdelt schema <command>: Machine-readable schema for any specific command.
- JSONL output: For streaming large result sets without buffering everything in memory.
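Those exit codes make shell automation straightforward. Here’s a hedged sketch of a retry wrapper: the backoff durations, the GDELT_BACKOFF variable, and the retry policy are all mine, not part of the tool.

```shell
# Branch on gdelt's documented exit codes instead of parsing stderr.
# 0 = success, 2 = validation error, 3 = network issue, 5 = rate limited.
# GDELT_BACKOFF and the single-retry policy are illustrative, not part of the CLI.
search_with_retry() {
  gdelt doc search "$1" --timespan 7d
  case $? in
    0) return 0 ;;
    2) echo "validation error: check the query syntax" >&2; return 2 ;;
    3) echo "network issue: retrying once" >&2
       sleep "${GDELT_BACKOFF:-5}"; gdelt doc search "$1" --timespan 7d ;;
    5) echo "rate limited: backing off" >&2
       sleep "${GDELT_BACKOFF:-60}"; gdelt doc search "$1" --timespan 7d ;;
    *) return 1 ;;
  esac
}
```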
And then there’s the MCP server. Run gdelt serve and it exposes GDELT as a set of tools that any MCP-compatible AI assistant can call:
{
  "mcpServers": {
    "gdelt": {
      "command": "gdelt",
      "args": ["serve"]
    }
  }
}
Drop that into your Claude Desktop config and suddenly your AI assistant can search global news, pull event timelines, run geographic queries, and analyze sentiment trends — all autonomously. The exposed tools are gdelt_search, gdelt_timeline, gdelt_geo, gdelt_events_query, gdelt_gkg_query, and gdelt_analytics.
The Briefing Generator
There’s a neat script bundled in called gdelt-briefing.sh that ties it all together. Point it at a country and it uses the CLI to pull recent events, then feeds everything to Claude Code to generate a diplomatic intelligence briefing:
./gdelt-briefing.sh India Delhi
Out comes a structured briefing saved to briefings/India/ — the kind of thing that would take a human analyst hours to compile from scattered sources.
Under the Hood
It’s Rust, so the binary is fast and self-contained. The stack:
- Tokio for async I/O (all API calls are non-blocking)
- DuckDB for local analytical queries
- SQLite for API response caching (smart TTL-based, so stale data gets refreshed)
- Clap for CLI argument parsing
- Serde for serialization across JSON/JSONL/CSV formats
There’s also a daemon mode (gdelt daemon start --sync --mcp) that runs in the background, continuously syncing fresh data and serving the MCP interface — useful if you want always-current local data without manually running sync.
Getting Started
Installation is straightforward:
# One-liner (downloads binary or builds from source)
curl -sSL https://raw.githubusercontent.com/dipankar/gdelt-cli/main/install.sh | bash
# Or via cargo
cargo install --git https://github.com/dipankar/gdelt-cli
# Or build locally
git clone https://github.com/dipankar/gdelt-cli
cd gdelt-cli
cargo build --release
No API keys needed. GDELT is open and free. Configuration lives at ~/.config/gdelt/config.toml if you want to tweak defaults, cache sizes, or database memory allocation.
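For reference, a hypothetical config.toml might look like the following. Every key name below is a guess at the knobs mentioned above (defaults, cache sizes, database memory); check the repo for the real schema.

```toml
# ~/.config/gdelt/config.toml -- illustrative only; key names are assumptions
[defaults]
timespan = "7d"
format = "json"

[cache]
max_size_mb = 512
ttl_seconds = 900   # GDELT updates every 15 minutes

[database]
memory_limit = "2GB"
```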
Why This Exists
I’ve been interested in global event data for a while — the kind of structured, real-time feed that lets you see patterns before they become headlines. GDELT is the best open dataset for this, but the tooling around it has always lagged behind the data itself. Everything was either “write your own BigQuery SQL” or “use this janky Python wrapper that hasn’t been updated since 2019.”
The agent angle came naturally. If you’re building AI workflows that need to be aware of what’s happening in the world — not just what’s in the training data, but right now — GDELT through MCP is a compelling primitive. Your agent can check today’s news the way it checks today’s weather.
The project is MIT licensed and on GitHub. Contributions welcome.