A Go CLI tool that periodically scrapes Reddit, Hacker News, and Twitter/X for posts relevant to your side projects, then sends you an email digest of matches. Uses keyword pre-filtering + Claude API for smart relevance scoring.
- Fetch — Pulls recent posts from Reddit, Hacker News, and Twitter using their APIs
- Deduplicate — Checks a local SQLite database to skip posts you've already seen
- Keyword filter — Case-insensitive substring match against your configured keywords to narrow down candidates
- LLM scoring — Sends keyword-matched posts to Claude (Haiku) to score relevance 1-10 against your product description
- Notify — Posts scoring 6+ are compiled into an HTML email digest and sent via SMTP
```
post-radar/
├── main.go                # CLI entrypoint (cobra): scan and digest commands
├── config.go              # Config loading/validation from YAML
├── config.yaml            # User config (products, API keys, email settings)
├── sources/
│   ├── source.go          # Post struct + Source interface
│   ├── reddit.go          # Reddit API client (OAuth2 app-only flow)
│   ├── hackernews.go      # HN Algolia API client (free, no auth)
│   └── twitter.go         # Twitter/X API v2 client (bearer token)
├── matcher/
│   ├── keyword.go         # Keyword pre-filter
│   └── llm.go             # Claude API relevance scoring (concurrent, batched)
├── notify/
│   ├── email.go           # SMTP email digest sender
│   └── templates/
│       └── digest.html    # Embedded HTML email template
└── store/
    └── seen.go            # SQLite store for seen post IDs (~/.post-radar/seen.db)
```
```sh
go build -o post-radar .
```

Edit config.yaml with your API credentials:

```yaml
products:
  - name: "MyApp"
    description: "A tool that does X for Y audience"
    keywords: ["my-app", "alternative to Z", "looking for X"]
    subreddits: ["selfhosted", "SideProject", "webdev"]

sources:
  reddit:
    client_id: "..."       # from https://www.reddit.com/prefs/apps
    client_secret: "..."
  twitter:
    bearer_token: "..."    # from the Twitter Developer Portal
  # HN needs no auth

anthropic:
  api_key: "..."           # from https://console.anthropic.com

email:
  smtp_host: "smtp.gmail.com"
  smtp_port: 587
  username: "[email protected]"
  password: "..."          # Gmail app password
  from: "[email protected]"
  to: "[email protected]"

schedule:
  interval: "24h"          # lookback window for fetching posts
```

Reddit and Twitter sources are optional — if credentials are left blank, those sources are skipped. HN always runs (no auth needed).
```sh
# Dry run: fetch, filter, score, print results to stdout
./post-radar scan --dry-run

# Full run: fetch, filter, score, send email digest
./post-radar scan

# Alias for --dry-run
./post-radar digest
```

Use `--config` / `-c` to point to a config file in a different location:

```sh
./post-radar scan -c ~/my-config.yaml
```

```sh
# 1. Clone the repo
git clone https://github.com/yourusername/post-radar.git
cd post-radar

# 2. Install Go (if needed)
# https://go.dev/dl/ — requires Go 1.21+

# 3. Copy the example config and fill in your credentials
cp config.example.yaml config.yaml

# 4. Build the binary
go build -o post-radar .

# 5. Do a dry run to verify everything is wired up (no email sent)
./post-radar scan --dry-run

# 6. Run for real
./post-radar scan
```

The first run will create ~/.post-radar/seen.db automatically. Subsequent runs skip posts already processed.
Seen post IDs are stored in ~/.post-radar/seen.db (SQLite). Running scan twice won't process the same posts again. Entries older than 30 days are automatically pruned.
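The behavior described above maps onto a very small SQLite schema. This is an illustrative sketch, not necessarily what store/seen.go actually creates:

```sql
-- Hypothetical schema for ~/.post-radar/seen.db
CREATE TABLE IF NOT EXISTS seen (
    id      TEXT PRIMARY KEY,                          -- source-prefixed post ID
    seen_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Dedup check: a post is skipped if its ID is already present.
-- INSERT OR IGNORE makes marking a post seen idempotent.
INSERT OR IGNORE INTO seen (id) VALUES ('hn:38971234');

-- 30-day pruning, run once per scan
DELETE FROM seen WHERE seen_at < datetime('now', '-30 days');
```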
- cobra — CLI framework
- yaml.v3 — Config parsing
- anthropic-sdk-go — Claude API client
- modernc.org/sqlite — Pure-Go SQLite driver (no CGO)