Daily, automated curation flow: ingest from multiple sources, filter by recency and keywords, dedupe with Gemini AI, publish to Notion, and alert via Telegram.
The Daily News Radar AI agent automatically collects updates from Blog, Community, GitHub Releases, and Reddit, then filters them for recency (the last 24 hours) and relevance. It uses Gemini AI to deduplicate topics and assign each a score, so only high-signal items pass. Finally, it saves curated topics to Notion and notifies a Telegram channel when a top-scoring item is detected.
A concise, action-focused overview of the tasks this AI agent performs.
Ingests data from Blog, Community, GitHub Releases, and Reddit via RSS and the GitHub API.
Normalizes timestamps and filters to the last 24 hours.
Deduplicates topics semantically and scores them with the Gemini Editor-in-Chief (Score 3+ passes).
Saves curated topics to a Notion database with proper chunking.
Sends urgent alerts to Telegram for items with Score >= 4.
Supports daily scheduling or manual run triggers to control the workflow.
The Daily News Radar AI agent streamlines the end-to-end daily news curation, removing manual bottlenecks while preserving signal quality.
A simple, 3-step system anyone can follow.
A daily scheduled or manual trigger starts ingestion from Blog, Community, GitHub Releases, and Reddit via RSS and the GitHub API.
The agent normalizes timestamps, filters to the last 24 hours, and applies a keyword pre-filter to reduce noise and LLM costs.
Batched items are sent to Gemini AI for deduplication and scoring; items that pass the Score 3+ threshold are chunked safely and saved to Notion, and those scoring 4 or higher trigger Telegram alerts.
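The keyword pre-filter in step two could look like the following minimal sketch; the keyword list and item fields are illustrative assumptions, not the workflow's actual configuration:

```python
# Hypothetical watchlist -- tune to the topics you actually track.
KEYWORDS = {"release", "workflow", "automation", "gemini"}

def keyword_prefilter(items: list[dict], keywords: set[str] = KEYWORDS) -> list[dict]:
    """Drop items that mention none of the watched keywords (case-insensitive).

    Note: naive substring matching -- a short keyword like "ai" would also
    match "daily", so prefer longer keywords or word-boundary matching.
    """
    kept = []
    for item in items:
        text = f"{item.get('title', '')} {item.get('summary', '')}".lower()
        if any(k in text for k in keywords):
            kept.append(item)
    return kept
```

Running this before the LLM step means Gemini only sees items that already mention a watched topic, which is where the cost savings come from.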
One realistic scenario.
Scenario: the 9:00 AM daily run ingests 28 items from four sources. The Gemini Editor-in-Chief deduplicates them to 12 unique topics and scores each; 4 topics meet the Score 3+ threshold and are saved to Notion, and a Telegram alert is sent to the channel for the 2 topics scoring 4 or higher.
Need a concise daily digest to inform campaigns and move fast on trends.
Want to monitor related features and market signals to guide roadmaps.
Require a centralized source of validated topics for newsletters.
Need timely signals to craft experiments and messaging.
Prefer to maintain the curation backlog in Notion for collaboration.
Want automated briefs integrated with collaboration channels.
Dedupe and score topics with Editor-in-Chief; apply Score 3+ threshold.
Store curated topics into a Notion database with proper chunking and properties.
Dispatch urgent alerts to a chat or channel when items meet high-score criteria.
Ingest updates from Blog, Community, and Reddit via RSS.
Fetch latest releases relevant to the n8n ecosystem.
The AI agent ingests updates from four primary sources: Blog, Community, GitHub Releases, and Reddit, using RSS feeds and the GitHub API where available. The ingestion runs on a daily schedule or can be triggered manually for testing. Data from these sources is normalized and timestamped to enable reliable 24-hour filtering. You can customize which sources are included or excluded based on your needs, and you can adjust the frequency of the daily run. Security of source data is maintained through standard authentication methods provided by each service.
Gemini AI analyzes semantic similarity across topics, grouping related items and removing duplicates while preserving unique signals. It assigns a score based on predefined criteria, including relevance, recency, and potential impact. The system uses a conservative threshold (Score 3+) to pass items forward, balancing coverage with quality. You can adjust thresholds to fit your risk tolerance and workload.
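The two thresholds (Score 3+ passes, Score 4+ alerts) amount to a simple routing step after scoring; the function and field names below are illustrative:

```python
PASS_THRESHOLD = 3   # Score 3+ is saved to Notion
ALERT_THRESHOLD = 4  # Score 4+ additionally triggers a Telegram alert

def route(topics: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split scored topics into those that pass and the urgent subset."""
    passed = [t for t in topics if t.get("score", 0) >= PASS_THRESHOLD]
    urgent = [t for t in passed if t["score"] >= ALERT_THRESHOLD]
    return passed, urgent
```

Raising or lowering either constant is the whole "threshold tuning" knob: everything in `passed` lands in Notion, and only `urgent` reaches Telegram.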
Yes. The scoring threshold is configurable to match your organization’s risk appetite and desired signal quality. Lower thresholds increase coverage but may raise noise; higher thresholds reduce noise but may miss subtler signals. Changes apply to subsequent runs and can be tested in a sandbox mode before going live. Documentation and logs show how scores were assigned for traceability.
Notion is the primary destination for curated topics in this workflow, but you can adapt the agent to other databases if needed. Data is saved as Notion pages or database entries with chunking to ensure readable, page-sized content. Each entry includes relevant properties such as title, score, source, and timestamp for easy filtering. If Notion is unavailable, the agent can pause publication until connectivity is restored.
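A minimal chunking sketch, assuming Notion's documented limit of 2,000 characters per rich-text item (the helper name is hypothetical):

```python
NOTION_TEXT_LIMIT = 2000  # Notion's API caps one rich-text item at 2,000 chars

def chunk_for_notion(text: str, limit: int = NOTION_TEXT_LIMIT) -> list[str]:
    """Split long content into pieces that each fit one paragraph block."""
    return [text[i:i + limit] for i in range(0, len(text), limit)] or [""]
```

Each chunk then becomes its own paragraph block on the Notion page, which is what keeps long digests readable.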
Credentials are stored securely and accessed only by the AI agent during scheduled runs or manual triggers. Data in transit uses encrypted channels, and access is restricted by your platform’s authentication controls. You can configure IP allowlists and least-privilege permissions for each integration (Notion, Telegram, sources). Audit logs are maintained to monitor activity and changes.
Telegram alerts can be customized by chat/channel, notification rules, and minimum score thresholds. You can choose to receive alerts for all 4+ items or only the highest-priority signals. Alerts include a concise topic summary and a link to the corresponding Notion entry when available. You can disable alerts temporarily without stopping the data ingestion pipeline.
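Building such an alert for Telegram's `sendMessage` method could look like this sketch; the `chat_id` is a placeholder, and the topic fields mirror the properties stored in Notion:

```python
def build_alert(topic: dict, chat_id: str = "@your_channel") -> dict:
    """Build a sendMessage payload for a high-scoring topic.

    `chat_id` is a placeholder; `title`, `score`, and `notion_url` are
    assumed to come from the curated Notion entry.
    """
    text = f"🚨 {topic['title']} (score {topic['score']})"
    if topic.get("notion_url"):
        text += f"\n{topic['notion_url']}"
    return {"chat_id": chat_id, "text": text}
```

The returned dictionary is what would be POSTed to the Bot API; swapping the channel or minimum score only changes the inputs, not this step.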
Yes. Notion serves as the primary repository for curated topics, but you can export data to CSV or JSON for archival or integration with other tools. The agent preserves structured fields (title, score, source, timestamp) to facilitate clean exports. You can also connect downstream systems to pull data directly from Notion via API. Modifications to the workflow can be performed by administrators with appropriate permissions.