Automatically collects AI news from RSS, Reddit, and Hacker News, analyzes with Claude, and delivers a ranked daily digest to Discord and Slack.
The News Digest AI Agent ingests AI news on a daily schedule from up to ten configurable sources and deduplicates items. It extracts full article text, analyzes content with Claude, and assigns each item an importance score. It then delivers a polished, ranked digest to Discord and Slack, cutting clutter and speeding up decision making.
End-to-end workflow from gathering sources to delivering a formatted briefing.
Ingest feeds from RSS, Atom, Reddit JSON, and Hacker News API
Deduplicate articles by URL hash and fuzzy title matching
Extract full article text with Jina Reader
Score and categorize each item with Claude Haiku
Compile the digest into lead story, top stories, and quick hits
Deliver the formatted digest to Discord and Slack
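The six steps above can be sketched as one small pipeline. This is an illustrative skeleton, not the workflow's actual internals: the `Article` fields and function names are assumptions, and the Claude scoring and delivery steps are stubbed out.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Article:
    url: str
    title: str
    text: str = ""      # filled by the extraction step
    score: int = 0      # 1-10 importance assigned by Claude Haiku
    category: str = ""

def dedupe(articles):
    # Step 2: drop repeats by URL hash (fuzzy title matching omitted here).
    seen, unique = set(), []
    for a in articles:
        key = hashlib.sha256(a.url.encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(a)
    return unique

def compile_digest(articles):
    # Step 5: rank by score, then split into lead / top stories / quick hits.
    ranked = sorted(articles, key=lambda a: a.score, reverse=True)
    return {
        "lead_story": ranked[0],
        "top_stories": ranked[1:4],
        "quick_hits": ranked[4:10],
    }
```

In the real workflow the ingestion, extraction, scoring, and delivery steps wrap this same shape around live feeds and the Claude and webhook APIs.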
This AI agent replaces fragmented manual work with a predictable execution flow.
A simple 3-step flow that non-technical users can follow.
Daily pull from RSS, Atom, Reddit JSON, and Hacker News API, with URL hash and fuzzy title deduplication to avoid repeats.
Jina Reader extracts full text; Claude Haiku assigns a 1-10 importance score and categories; Claude Sonnet contextualizes why it matters.
Claude Sonnet formats a lead story, top stories, and quick hits; the digest is sent to Discord and Slack with structured formatting.
A realistic daily scenario that shows time and outcome.
Scenario: A product marketing team wants a daily AI news digest focusing on policy, enterprise AI deployments, and emerging tools. At 7:30 AM, the agent ingests from ten configured sources, deduplicates, and extracts text. It scores and ranks articles, then compiles a digest with a lead story, top stories, and quick hits. The digest is delivered to the team’s Discord and Slack channels within minutes, enabling rapid planning and outreach.
Who should consider this AI agent and why they would benefit.
Needs timely, curated AI news to inform roadmaps and risk assessments.
Wants a steady stream of AI announcements to inform campaigns and partnerships.
Requires focused updates on AI techniques, papers, and implementations.
Wants to reduce newsletter clutter and ensure cross-day continuity.
Needs trend insights and impact signals to steer planning.
Prefers programmatic access to a digest that informs build decisions.
Tools the agent uses to gather, analyze, and deliver the digest.
Scores articles for importance and assigns categories.
Compiles the top articles into a structured digest.
Extracts full article text for deeper analysis.
Delivers the digest as rich embeds to channels.
Delivers the digest using Block Kit formatting.
Stores history and enables cross-day deduplication when enabled.
Six practical scenarios where the agent adds concrete value.
Common questions and detailed answers about setup and usage.
The agent monitors RSS and Atom feeds, Reddit JSON, and the Hacker News API, with a configurable list of up to ten sources. It fetches new items on a daily schedule and deduplicates them by URL hash and fuzzy title matching to avoid repeats. You can customize which feeds matter most by editing the feed list. The ingestion step is designed to be robust against feed outages and format changes, the goal being broad but relevant coverage of AI news.
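Feed parsing itself needs nothing beyond the standard library. The sketch below shows the RSS case; a canned snippet stands in for a live fetch so it is self-contained, and the feed contents are invented for illustration.

```python
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example AI Feed</title>
  <item><title>Model X released</title><link>https://example.com/model-x</link></item>
  <item><title>Policy update</title><link>https://example.com/policy</link></item>
</channel></rss>"""

def parse_rss(xml_text):
    # Pull (title, link) pairs out of an RSS 2.0 document.
    root = ET.fromstring(xml_text)
    return [
        {"title": item.findtext("title"), "url": item.findtext("link")}
        for item in root.iter("item")
    ]
```

Atom uses `<entry>`/`<link href="...">` instead of `<item>`/`<link>`, and the Reddit and Hacker News sources return JSON rather than XML, but each source reduces to the same `{"title", "url"}` records before deduplication.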
Estimated cost per daily run ranges from a few cents to roughly $0.10 when using Claude Haiku and Sonnet. The exact price depends on the number of articles processed and the Claude usage tier. This keeps the digest affordable for everyday use while still delivering depth. You can monitor usage in your Anthropic account and adjust sources to control volume.
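A back-of-the-envelope estimate makes the cost shape concrete. The per-token prices and token counts below are assumptions for illustration only; check the current Anthropic pricing page before relying on them.

```python
# Hypothetical USD prices per million tokens -- substitute current Anthropic rates.
PRICE_PER_MTOK = {
    "haiku_in": 0.80, "haiku_out": 4.00,    # assumed, not official
    "sonnet_in": 3.00, "sonnet_out": 15.00,  # assumed, not official
}

def estimate_run_cost(n_articles, haiku_in=1500, haiku_out=150,
                      sonnet_in=4000, sonnet_out=1200):
    # Haiku scores every article; Sonnet makes one compilation pass.
    haiku = n_articles * (haiku_in * PRICE_PER_MTOK["haiku_in"]
                          + haiku_out * PRICE_PER_MTOK["haiku_out"]) / 1e6
    sonnet = (sonnet_in * PRICE_PER_MTOK["sonnet_in"]
              + sonnet_out * PRICE_PER_MTOK["sonnet_out"]) / 1e6
    return round(haiku + sonnet, 4)
```

Under these assumed numbers, a 30-article run lands under ten cents, which matches the "few cents to roughly $0.10" range above.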
No database is required to run the digest out of the box. The workflow operates with in-memory processing and file-based storage. If you enable PostgreSQL, you can preserve article history and enable cross-day deduplication. Optional history storage helps you track what was covered across days and analyze long-term trends. If you skip it, the digest remains stateless but fresh each day.
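The optional history table only needs to answer one question: has an earlier run already covered this URL? A minimal sketch follows; the real workflow targets PostgreSQL, but sqlite3 stands in here so the example runs anywhere, and the table and column names are illustrative rather than the workflow's actual schema.

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS articles (
    url_hash TEXT PRIMARY KEY,   -- sha256 of the canonical URL
    title    TEXT NOT NULL,
    seen_on  DATE NOT NULL       -- first digest date that carried it
)
"""

def already_covered(conn, url_hash):
    # Cross-day dedup: has any earlier run stored this URL hash?
    row = conn.execute(
        "SELECT 1 FROM articles WHERE url_hash = ?", (url_hash,)
    ).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.execute("INSERT INTO articles VALUES (?, ?, ?)",
             ("abc123", "Model X released", "2024-01-01"))
```

With history disabled, the same check simply never runs and each day starts from an empty set.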
Yes. You can edit the list of feed sources to match your interests and adjust topic importance weights to prioritize what matters. The system allows you to downweight noisier sources and upweight reputable publishers. Changes take effect on the next run, letting you fine-tune relevance over time. This makes the digest more aligned with your goals without changing the underlying architecture.
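One plausible shape for source and topic weighting is a pair of multiplier tables applied to the model's base score. The weight values and config keys below are invented for illustration; the workflow's actual configuration format may differ.

```python
# Illustrative multipliers: >1.0 upweights, <1.0 downweights, absent = neutral.
SOURCE_WEIGHTS = {
    "arxiv.org": 1.2,
    "news.ycombinator.com": 1.0,
    "reddit.com": 0.8,            # noisier source, downweighted
}
TOPIC_WEIGHTS = {
    "policy": 1.3,
    "enterprise": 1.1,
    "memes": 0.5,
}

def adjusted_score(base_score, domain, topics):
    # Multiply the model's 1-10 score by source and topic multipliers.
    weight = SOURCE_WEIGHTS.get(domain, 1.0)
    for topic in topics:
        weight *= TOPIC_WEIGHTS.get(topic, 1.0)
    return round(base_score * weight, 2)
```

Because ranking reads these tables on every run, edits take effect the next time the digest is compiled.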
Deduplication uses a combination of a URL hash and fuzzy title matching to identify the same article across runs. This prevents repeats in the digest and supports cross-day continuity if you enable history storage. The dedup logic is designed to handle minor title differences and canonical URL redirections. If a near-duplicate is detected, the system can still surface the most representative version of the article.
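The two checks described above can be sketched with the standard library: hash a normalized URL so tracking parameters and trailing slashes collapse, and compare titles with a similarity ratio. The normalization rules and the 0.85 threshold are illustrative choices, not the workflow's exact values.

```python
import hashlib
from difflib import SequenceMatcher
from urllib.parse import urlsplit

def url_hash(url):
    # Normalize before hashing so query strings and trailing slashes collapse.
    parts = urlsplit(url.lower())
    canonical = parts.netloc + parts.path.rstrip("/")
    return hashlib.sha256(canonical.encode()).hexdigest()

def titles_match(a, b, threshold=0.85):
    # Fuzzy comparison tolerates minor edits ("Model X Released!" vs "Model X released").
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def dedupe(items):
    seen_hashes, kept = set(), []
    for item in items:
        h = url_hash(item["url"])
        if h in seen_hashes:
            continue  # same canonical URL seen before
        if any(titles_match(item["title"], k["title"]) for k in kept):
            continue  # near-duplicate title from another source
        seen_hashes.add(h)
        kept.append(item)
    return kept
```

Keeping the first occurrence is the simplest policy; picking the "most representative" version instead would mean scoring the duplicates before discarding.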
The digest is delivered to Discord via rich embeds and to Slack via Block Kit. Formatting is designed to emphasize the lead story, top stories, and quick hits, with a consistent layout across platforms. You can tweak the tone and structure through Claude prompts to match your brand. Both channels support quick actions and context, enabling faster decision making.
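The two delivery payloads can be sketched as plain dictionaries ready to POST to a Discord webhook and the Slack API. The overall shapes (`embeds` for Discord, `blocks` for Slack Block Kit) are the documented ones; the specific fields, color, and digest structure are illustrative.

```python
def discord_payload(digest):
    # Discord webhook body: one rich embed for the lead story.
    lead = digest["lead_story"]
    return {
        "embeds": [{
            "title": lead["title"],
            "url": lead["url"],
            "description": lead["summary"],
            "color": 0x5865F2,  # arbitrary accent color
        }]
    }

def slack_payload(digest):
    # Slack Block Kit: a header block plus one section per top story.
    blocks = [{"type": "header",
               "text": {"type": "plain_text", "text": "Daily AI Digest"}}]
    for story in digest["top_stories"]:
        blocks.append({
            "type": "section",
            "text": {"type": "mrkdwn",
                     "text": f"<{story['url']}|{story['title']}>"},
        })
    return {"blocks": blocks}
```

Because both builders consume the same digest structure, the lead story, top stories, and quick hits stay consistent across the two channels.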
A basic level of setup is enough: provide your Anthropic API key, configure your Discord webhook and Slack credentials, and edit your feed sources. The workflow is designed to be usable by non-technical teams, with straightforward configuration steps and prompts. If you want to store history, you can enable a PostgreSQL database and adjust connection settings. Ongoing maintenance is minimal, consisting mainly of source updates and credential management.