post-radar

A Go CLI tool that periodically scrapes Reddit, Hacker News, and Twitter/X for posts relevant to your side projects, then sends you an email digest of matches. It uses keyword pre-filtering plus the Claude API for relevance scoring.

How it works

  1. Fetch — Pulls recent posts from Reddit, Hacker News, and Twitter using their APIs
  2. Deduplicate — Checks a local SQLite database to skip posts you've already seen
  3. Keyword filter — Case-insensitive substring match against your configured keywords to narrow down candidates
  4. LLM scoring — Sends keyword-matched posts to Claude (Haiku) to score relevance 1-10 against your product description
  5. Notify — Posts scoring 6+ are compiled into an HTML email digest and sent via SMTP
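The keyword pre-filter in step 3 can be sketched as a case-insensitive substring match. This is a minimal illustration; the `Post` fields and `matchesKeywords` function here are hypothetical, not the repo's actual API:

```go
package main

import (
	"fmt"
	"strings"
)

// Post is a minimal stand-in for a fetched post (hypothetical fields).
type Post struct {
	Title string
	Body  string
}

// matchesKeywords reports whether any configured keyword appears in the
// post's title or body, using a case-insensitive substring match.
func matchesKeywords(p Post, keywords []string) bool {
	text := strings.ToLower(p.Title + " " + p.Body)
	for _, kw := range keywords {
		if strings.Contains(text, strings.ToLower(kw)) {
			return true
		}
	}
	return false
}

func main() {
	keywords := []string{"my-app", "alternative to Z"}
	p := Post{Title: "Looking for an alternative to Z", Body: "any suggestions?"}
	fmt.Println(matchesKeywords(p, keywords)) // true
}
```

Only posts that pass this cheap filter are sent on to the (paid) LLM scoring step, which keeps Claude API costs down.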

Project structure

post-radar/
├── main.go                    # CLI entrypoint (cobra): scan and digest commands
├── config.go                  # Config loading/validation from YAML
├── config.yaml                # User config (products, API keys, email settings)
├── sources/
│   ├── source.go              # Post struct + Source interface
│   ├── reddit.go              # Reddit API client (OAuth2 app-only flow)
│   ├── hackernews.go          # HN Algolia API client (free, no auth)
│   └── twitter.go             # Twitter/X API v2 client (bearer token)
├── matcher/
│   ├── keyword.go             # Keyword pre-filter
│   └── llm.go                 # Claude API relevance scoring (concurrent, batched)
├── notify/
│   ├── email.go               # SMTP email digest sender
│   └── templates/
│       └── digest.html        # Embedded HTML email template
└── store/
    └── seen.go                # SQLite store for seen post IDs (~/.post-radar/seen.db)
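sources/source.go defines the shared Post struct and the Source interface that each fetcher (Reddit, HN, Twitter) implements. A plausible shape, assuming field and method names not confirmed by the repo:

```go
package main

import (
	"fmt"
	"time"
)

// Post is a normalized post from any source (hypothetical field set).
type Post struct {
	ID        string    // source-prefixed unique ID, e.g. "hn:39128472"
	Source    string    // "reddit", "hackernews", or "twitter"
	Title     string
	Body      string
	URL       string
	CreatedAt time.Time
}

// Source is the interface each fetcher implements, so the scan command
// can treat Reddit, HN, and Twitter uniformly.
type Source interface {
	Name() string
	// Fetch returns posts created within the lookback window.
	Fetch(since time.Time) ([]Post, error)
}

func main() {
	p := Post{ID: "hn:1", Source: "hackernews", Title: "Show HN: example"}
	fmt.Println(p.Source, p.Title)
}
```

A single interface keeps the pipeline source-agnostic: adding a new source means implementing Fetch and registering it, with no changes to the filtering or scoring stages.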

Setup

1. Build

go build -o post-radar .

2. Configure

Edit config.yaml with your API credentials:

products:
  - name: "MyApp"
    description: "A tool that does X for Y audience"
    keywords: ["my-app", "alternative to Z", "looking for X"]
    subreddits: ["selfhosted", "SideProject", "webdev"]

sources:
  reddit:
    client_id: "..."       # from https://www.reddit.com/prefs/apps
    client_secret: "..."
  twitter:
    bearer_token: "..."    # from Twitter Developer Portal
  # HN needs no auth

anthropic:
  api_key: "..."           # from https://console.anthropic.com

email:
  smtp_host: "smtp.gmail.com"
  smtp_port: 587
  username: "[email protected]"
  password: "..."          # Gmail app password
  from: "[email protected]"
  to: "[email protected]"

schedule:
  interval: "24h"          # lookback window for fetching posts

Reddit and Twitter sources are optional — if credentials are left blank, those sources are skipped. HN always runs (no auth needed).
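The optional-source behavior amounts to a simple credential check at startup. A sketch, with config field names assumed rather than taken from config.go:

```go
package main

import "fmt"

// Config mirrors the credential fields in the YAML above (names assumed).
type Config struct {
	RedditClientID     string
	RedditClientSecret string
	TwitterBearerToken string
}

// enabledSources returns the sources that will actually run: HN always,
// Reddit and Twitter only when their credentials are non-empty.
func enabledSources(c Config) []string {
	srcs := []string{"hackernews"} // free, no auth needed
	if c.RedditClientID != "" && c.RedditClientSecret != "" {
		srcs = append(srcs, "reddit")
	}
	if c.TwitterBearerToken != "" {
		srcs = append(srcs, "twitter")
	}
	return srcs
}

func main() {
	fmt.Println(enabledSources(Config{})) // [hackernews]
	fmt.Println(enabledSources(Config{RedditClientID: "id", RedditClientSecret: "secret"}))
}
```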

3. Run

# Dry run: fetch, filter, score, print results to stdout
./post-radar scan --dry-run

# Full run: fetch, filter, score, send email digest
./post-radar scan

# Alias for --dry-run
./post-radar digest

Use --config / -c to point to a config file in a different location:

./post-radar scan -c ~/my-config.yaml

Running locally

# 1. Clone the repo
git clone https://github.com/yourusername/post-radar.git
cd post-radar

# 2. Install Go (if needed)
#    https://go.dev/dl/ — requires Go 1.21+

# 3. Copy the example config and fill in your credentials
cp config.example.yaml config.yaml

# 4. Build the binary
go build -o post-radar .

# 5. Do a dry run to verify everything is wired up (no email sent)
./post-radar scan --dry-run

# 6. Run for real
./post-radar scan

The first run will create ~/.post-radar/seen.db automatically. Subsequent runs skip posts already processed.

Deduplication

Seen post IDs are stored in ~/.post-radar/seen.db (SQLite). Running scan twice won't process the same posts again. Entries older than 30 days are automatically pruned.

Dependencies

  - cobra — CLI command parsing (scan / digest)
  - a YAML parser — config loading and validation
  - a SQLite driver — the seen-post store
  - the Anthropic (Claude) API — relevance scoring
  - an SMTP client — email digest delivery
