Market Research · Marketing Analysts

AI Agent for Daily News Radar

Daily, automated curation flow: ingest from multiple sources, filter by recency and keywords, dedupe with Gemini AI, publish to Notion, and alert via Telegram.

How it works

1. Trigger & Ingest
2. Normalize & Filter
3. Deduplicate, Score, Publish

A daily scheduled or manual trigger starts ingestion from Blog, Community, GitHub Releases, and Reddit via RSS and the GitHub API.

Overview


The Daily News Radar AI agent automatically collects updates from Blog, Community, GitHub Releases, and Reddit, then filters them for recency (last 24 hours) and relevance. It uses Gemini AI to deduplicate topics and assign a score, ensuring only high-signal items pass. Finally, it saves curated topics to Notion and notifies a Telegram channel when a top-score item is detected.


Capabilities

What Daily News Radar AI does

A concise, action-focused overview of the tasks this AI agent performs.

01

Ingests data from Blog, Community, GitHub Releases, and Reddit via RSS and the GitHub API.

02

Normalizes timestamps and filters to the last 24 hours.

03

Deduplicates topics semantically and scores them with the Gemini Editor-in-Chief (Score 3+ passes).

04

Saves curated topics to a Notion database with proper chunking.

05

Sends urgent alerts to Telegram for items with Score >= 4.

06

Supports daily scheduling or manual run triggers to control the workflow.

Why you should use the AI Agent for Daily News Radar

The Daily News Radar AI agent streamlines end-to-end daily news curation, removing manual bottlenecks while preserving signal quality.

Before
Manual curation is time-consuming and inconsistent in coverage.
Sources pile up and require manual filtering to identify relevance.
Duplicates and near-duplicates clutter the digest.
Relevant items may be missed because data sources are scattered.
Alerts are delayed or noisy, causing missed opportunities.
After
A daily, focused digest appears in Notion with clear scoring and dedupe.
High-signal items get sent to Telegram immediately for timely action.
Reduced manual effort frees time for analysis and decision-making.
Filter quality improves with 24h recency and keyword pre-filtering.
The workflow scales to many sources without a proportional increase in cost.
Process

How it works

A simple, 3-step system anyone can follow.

Step 01

Trigger & Ingest

A daily scheduled or manual trigger starts ingestion from Blog, Community, GitHub Releases, and Reddit via RSS and the GitHub API.
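The ingestion step can be sketched as a small normalization layer that maps every source onto one common item shape. The source names come from the workflow; the field mappings and the `Item` schema below are illustrative assumptions, not the template's actual code.

```python
from dataclasses import dataclass

# Hypothetical source registry; the names match the workflow,
# the "kind" values are illustrative.
SOURCES = [
    {"name": "Blog", "kind": "rss"},
    {"name": "Community", "kind": "rss"},
    {"name": "GitHub Releases", "kind": "github_api"},
    {"name": "Reddit", "kind": "rss"},
]

@dataclass
class Item:
    """Common shape every source is normalized into before filtering."""
    title: str
    url: str
    source: str
    published_at: str  # ISO 8601 timestamp

def normalize(raw: dict, source: str) -> Item:
    # Map source-specific fields onto the common schema.
    return Item(
        title=raw.get("title", "").strip(),
        url=raw.get("link", ""),
        source=source,
        published_at=raw.get("published", ""),
    )
```

A shared schema like this is what makes the later 24-hour filter and deduplication source-agnostic.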

Step 02

Normalize & Filter

Normalize timestamps, filter to the last 24 hours, and apply a keyword pre-filter to reduce noise and LLM costs.
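A minimal sketch of the recency and keyword pre-filter, assuming ISO 8601 timestamps on every item; the `KEYWORDS` set is a hypothetical placeholder for the configurable term list.

```python
from datetime import datetime, timedelta, timezone

# Illustrative keyword list; the real pre-filter terms are configurable.
KEYWORDS = {"release", "update", "ai", "workflow"}

def is_recent(published_at: str, now: datetime, window_hours: int = 24) -> bool:
    """Keep only items published within the last `window_hours`."""
    ts = datetime.fromisoformat(published_at.replace("Z", "+00:00"))
    return now - ts <= timedelta(hours=window_hours)

def matches_keywords(title: str) -> bool:
    """Cheap keyword check applied before any LLM call to cut costs."""
    words = title.lower().split()
    return any(w.strip(".,!?") in KEYWORDS for w in words)

def prefilter(items: list[dict], now: datetime) -> list[dict]:
    return [i for i in items
            if is_recent(i["published_at"], now) and matches_keywords(i["title"])]
```

Because this filter is pure string and date arithmetic, it removes noise for free before the (paid) Gemini step sees anything.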

Step 03

Deduplicate, Score, Publish

Items are batched and sent to Gemini AI for deduplication and scoring; passing items are safely chunked and saved to Notion, and high-scoring items trigger Telegram alerts.
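The gating logic after the Gemini call can be sketched as a simple routing function. The thresholds (publish at Score 3+, alert at Score 4+) come from the workflow description; the scored-item shape is an assumption.

```python
PUBLISH_THRESHOLD = 3  # Score 3+ is saved to Notion
ALERT_THRESHOLD = 4    # Score 4+ also triggers a Telegram alert

def route(scored_items: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split scored topics into publish and alert buckets.

    `scored_items` stands in for the parsed output of the Gemini
    "Editor-in-Chief" call: one dict per deduplicated topic with a
    numeric "score" field.
    """
    publish = [i for i in scored_items if i["score"] >= PUBLISH_THRESHOLD]
    alerts = [i for i in publish if i["score"] >= ALERT_THRESHOLD]
    return publish, alerts
```

Keeping the thresholds as named constants is what makes them easy to tune later, as the FAQ notes.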


Example

Example workflow

One realistic scenario.

Scenario: the 9:00 AM daily run ingests 28 items from four sources. The Gemini Editor-in-Chief deduplicates them to 12 unique topics and scores each; 4 topics pass the Score 3+ threshold and are saved to Notion, and a Telegram alert is sent to the channel for the 2 topics scoring 4+.

Market Research · Google Gemini · Notion · Telegram · RSS feeds · AI Agent flow

Audience

Who can benefit

Teams that need a fast, reliable daily signal from scattered sources.

✍️ Marketing Analysts

Need a concise daily digest to inform campaigns and move fast on trends.

💼 Product Managers

Want to monitor related features and market signals to guide roadmaps.

🧠 Editors/Content Teams

Require a centralized source of validated topics for newsletters.

📈 Growth Teams

Need timely signals to craft experiments and messaging.

🎯 Notion-using Researchers

Prefer to maintain the curation backlog in Notion for collaboration.

📋 Tech Startups

Want automated briefs integrated with collaboration channels.

Integrations

The services the agent connects to collect, score, store, and alert.

Google Gemini

Dedupe and score topics with Editor-in-Chief; apply Score 3+ threshold.

Notion

Store curated topics into a Notion database with proper chunking and properties.
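The "proper chunking" matters because Notion's API caps each rich text object's content at 2,000 characters, so long summaries must be split across blocks. A minimal sketch, assuming plain paragraph blocks:

```python
def chunk_text(text: str, limit: int = 2000) -> list[str]:
    """Split long content into pieces that fit a per-block size limit.

    2000 characters matches Notion's documented per-rich-text cap;
    adjust if the API limits change.
    """
    return [text[i:i + limit] for i in range(0, len(text), limit)] or [""]

def to_notion_children(text: str) -> list[dict]:
    # One paragraph block per chunk, in Notion's block JSON shape.
    return [
        {"object": "block",
         "type": "paragraph",
         "paragraph": {"rich_text": [{"type": "text",
                                      "text": {"content": chunk}}]}}
        for chunk in chunk_text(text)
    ]
```

Without this split, a page-create request carrying an oversized text field is rejected by the API.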

Telegram

Dispatch urgent alerts to a chat or channel when items meet high-score criteria.
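Composing the alert can be sketched as building the payload for Telegram's `sendMessage` Bot API method; the message format and the helper names below are illustrative, not the template's actual wording.

```python
def build_alert(topic: str, score: int, notion_url=None) -> str:
    """Compose the urgent-alert text for a high-scoring topic."""
    lines = [f"🚨 High-signal topic (Score {score})", topic]
    if notion_url:
        lines.append(f"Notion: {notion_url}")
    return "\n".join(lines)

def build_payload(chat_id: str, topic: str, score: int, notion_url=None) -> dict:
    # Shape of the JSON body posted to the Bot API's sendMessage method.
    return {"chat_id": chat_id,
            "text": build_alert(topic, score, notion_url)}
```

Separating message construction from the HTTP call keeps the alert format testable without network access.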

RSS feeds

Ingest updates from Blog, Community, Reddit.

GitHub API

Fetch latest releases relevant to the n8n ecosystem.
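A sketch of the release ingestion, using GitHub's public REST endpoint for listing a repository's releases; which fields the radar actually keeps is an assumption here.

```python
def releases_url(owner: str, repo: str) -> str:
    """GitHub REST endpoint that lists a repository's releases."""
    return f"https://api.github.com/repos/{owner}/{repo}/releases"

def parse_release(release: dict) -> dict:
    # Keep only the fields the radar needs from GitHub's release JSON.
    return {
        "title": release.get("name") or release.get("tag_name", ""),
        "url": release.get("html_url", ""),
        "published_at": release.get("published_at", ""),
        "source": "GitHub Releases",
    }
```

The `published_at` field feeds straight into the same 24-hour filter used for the RSS sources.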

Applications

Best use cases

Typical scenarios where the automated daily curation pays off.

Daily industry news digest for product and marketing teams.
Editorial briefs for newsletters or product updates.
Competitive intelligence snapshots for strategic reviews.
Notion-backed curation backlog for teams to collaborate on.
Urgent alerts for high-impact releases or vulnerabilities.
Topic briefs for content calendar planning and experiments.

FAQ

Common questions about sources, scoring, storage, and security.

Which sources does the agent ingest from?
The AI agent ingests updates from four primary sources: Blog, Community, GitHub Releases, and Reddit, using RSS feeds and the GitHub API where available. The ingestion runs on a daily schedule or can be triggered manually for testing. Data from these sources is normalized and timestamped to enable reliable 24-hour filtering. You can customize which sources are included or excluded based on your needs, and you can adjust the frequency of the daily run. Security of source data is maintained through standard authentication methods provided by each service.

How does deduplication and scoring work?
Gemini AI analyzes semantic similarity across topics, grouping related items and removing duplicates while preserving unique signals. It assigns a score based on predefined criteria, including relevance, recency, and potential impact. The system uses a conservative threshold (Score 3+) to pass items forward, balancing coverage with quality. You can adjust thresholds to fit your risk tolerance and workload.

Can I adjust the scoring threshold?
Yes. The scoring threshold is configurable to match your organization's risk appetite and desired signal quality. Lower thresholds increase coverage but may raise noise; higher thresholds reduce noise but may miss subtler signals. Changes apply to subsequent runs and can be tested in a sandbox mode before going live. Documentation and logs show how scores were assigned for traceability.

Where are curated topics stored?
Notion is the primary destination for curated topics in this workflow, but you can adapt the agent to other databases if needed. Data is saved as Notion pages or database entries with chunking to ensure readable, page-sized content. Each entry includes relevant properties such as title, score, source, and timestamp for easy filtering. If Notion is unavailable, the agent can pause publication until connectivity is restored.

How are credentials and data secured?
Credentials are stored securely and accessed only by the AI agent during scheduled runs or manual triggers. Data in transit uses encrypted channels, and access is restricted by your platform's authentication controls. You can configure IP allowlists and least-privilege permissions for each integration (Notion, Telegram, sources). Audit logs are maintained to monitor activity and changes.

Can I customize Telegram alerts?
Telegram alerts can be customized by chat/channel, notification rules, and minimum score thresholds. You can choose to receive alerts for all 4+ items or only the highest-priority signals. Alerts include a concise topic summary and a link to the corresponding Notion entry when available. You can disable alerts temporarily without stopping the data ingestion pipeline.

Can I export the curated data?
Yes. Notion serves as the primary repository for curated topics, but you can export data to CSV or JSON for archival or integration with other tools. The agent preserves structured fields (title, score, source, timestamp) to facilitate clean exports. You can also connect downstream systems to pull data directly from Notion via API. Modifications to the workflow can be performed by administrators with appropriate permissions.



Use this template → Read the docs