End-to-end automation from user input to inbox delivery.
This AI Agent collects user preferences from a form, scrapes MediaMarkt deals, and generates personalized deal recommendations. It creates a polished HTML email and delivers it to each subscriber's inbox. Every step runs automatically on self-hosted components, keeping data private and results reproducible.
Automates input capture, data extraction, and delivery of tailored deals.
Collects user preferences via a form (categories + email).
Scrapes MediaMarkt deals using Bright Data and returns HTML content.
Parses HTML to extract product name, price, and link.
Uses GPT-4o-mini to rank and tailor recommendations based on user preferences.
Formats results into a clean HTML email via Document Generator.
Sends the personalized deals via SMTP to the user.
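The parsing step above can be sketched as follows. This is a minimal illustration using a simplified, hypothetical deal markup; the real MediaMarkt page structure differs, and the pattern below would need to match the actual HTML returned by the scraper.

```python
import re

# Hypothetical, simplified deal markup; the real page structure differs.
SAMPLE_HTML = """
<div class="deal"><a href="https://example.com/laptop-x">Laptop X</a>
<span class="price">799.00</span></div>
<div class="deal"><a href="https://example.com/phone-y">Phone Y</a>
<span class="price">499.00</span></div>
"""

# One pattern per deal block: capture link, name, then price.
DEAL_RE = re.compile(
    r'<div class="deal"><a href="([^"]+)">([^<]+)</a>\s*'
    r'<span class="price">([^<]+)</span></div>'
)

def parse_deals(html: str) -> list[dict]:
    """Extract product name, price, and link from the scraped HTML."""
    return [
        {"name": name, "price": float(price), "link": link}
        for link, name, price in DEAL_RE.findall(html)
    ]

deals = parse_deals(SAMPLE_HTML)
print(deals[0])  # {'name': 'Laptop X', 'price': 799.0, 'link': 'https://example.com/laptop-x'}
```

In the workflow itself this logic lives in a parsing node; keeping it in one small function makes it easy to update when the source markup changes.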
This AI Agent replaces manual deal hunting and scattered data handling with an automated, end-to-end flow. It directly reduces time spent on curation and ensures subscribers receive up-to-date, relevant offers.
A simple 3-step flow that non-technical users can follow.
A user submits categories and email; this triggers the AI Agent to start the flow.
Bright Data scrapes the offers page and returns HTML content, which is parsed for product name, price, and links.
GPT-4o-mini generates ranked recommendations, Document Generator formats them into HTML, and SMTP delivers the email.
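The ranking step can be sketched as a single chat completion. The prompt wording and JSON shape below are illustrative assumptions, not the template's exact prompt; the commented-out call shows where the actual GPT-4o-mini request would go.

```python
import json

def build_ranking_prompt(deals: list[dict], categories: list[str], top_n: int = 5) -> list[dict]:
    """Assemble chat messages asking the model to rank deals by user preference."""
    system = (
        "You rank retail deals. Return a JSON array of the best matches, "
        "ordered by relevance to the user's preferred categories."
    )
    user = (
        f"Preferred categories: {', '.join(categories)}\n"
        f"Return the top {top_n} deals.\n"
        f"Deals:\n{json.dumps(deals, ensure_ascii=False)}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# The actual call (requires the `openai` package and an API key):
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-4o-mini",
#       messages=build_ranking_prompt(deals, ["laptops", "smartphones"]),
#   )

messages = build_ranking_prompt(
    [{"name": "Laptop X", "price": 799.0, "link": "https://example.com/laptop-x"}],
    ["laptops", "smartphones"],
)
print(messages[1]["content"].splitlines()[0])  # Preferred categories: laptops, smartphones
```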
One realistic scenario illustrating timing and outcome.
A user selects electronics categories (laptops and smartphones) and provides their email. The AI Agent scrapes current deals, ranks the top 5 matching the preferences, assembles a branded HTML email, and sends it to the user within about 20 minutes.
Need to regularly share relevant deals with customers.
Want to automate personalized campaigns at scale.
Need to drive clicks with targeted offers.
Aim to align offers with customer profiles.
Want to test deal attractiveness across segments.
Prefer receiving tailored deal emails instead of manual searches.
Bright Data: scrapes MediaMarkt deals via proxy-based scraping and returns HTML content.
Document Generator: generates a templated HTML email from deal data.
GPT-4o-mini (OpenAI): filters, ranks, and enhances deals based on user preferences.
SMTP: delivers the final email to the subscriber.
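Outside of n8n, the email-assembly and SMTP-delivery steps look roughly like the sketch below. The sender address, server hostname, and credentials are placeholders, and the real Document Generator template is richer than this minimal HTML body.

```python
import smtplib
from email.message import EmailMessage

def build_deal_email(to_addr: str, deals: list[dict]) -> EmailMessage:
    """Wrap ranked deals in a simple HTML body (real template is richer)."""
    rows = "".join(
        f'<li><a href="{d["link"]}">{d["name"]}</a> - {d["price"]:.2f} EUR</li>'
        for d in deals
    )
    msg = EmailMessage()
    msg["Subject"] = "Your personalized MediaMarkt deals"
    msg["From"] = "deals@example.com"  # placeholder sender address
    msg["To"] = to_addr
    msg.set_content("Your email client does not support HTML.")
    msg.add_alternative(f"<html><body><ul>{rows}</ul></body></html>", subtype="html")
    return msg

def send_email(msg: EmailMessage, host: str, port: int, user: str, password: str) -> None:
    """Deliver via your SMTP server (hostname and credentials are placeholders)."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        smtp.send_message(msg)

msg = build_deal_email(
    "subscriber@example.com",
    [{"name": "Laptop X", "price": 799.0, "link": "https://example.com/laptop-x"}],
)
print(msg["Subject"])  # Your personalized MediaMarkt deals
```

Using `add_alternative` alongside `set_content` produces a multipart/alternative message, so clients without HTML support still see a plain-text fallback.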
Yes. It requires a self-hosted environment with the Bright Data and Document Generator community nodes. The AI Agent orchestrates the end-to-end flow from form capture to email delivery. You must provide credentials for the Bright Data proxy, the OpenAI service, and your SMTP server. While it runs locally, you should monitor usage, data handling, and costs, and check that the community nodes remain compatible with your hosting stack as they are updated.
Yes. The AI Agent’s data source is configurable. You can point it to other deals pages or catalogs, provided you can extract structured deal data. You may need to adjust the parsing and template logic to accommodate different page structures. Ensure you have the rights to scrape and comply with site policies. In all cases, the flow remains the same: capture preferences, fetch data, generate a formatted email, and deliver.
The community nodes used here (Bright Data and Document Generator) are not available in n8n Cloud. You must host n8n yourself to run this AI Agent. If you rely on cloud hosting, you’ll need alternative scraping and document-generation methods that are allowed there. The core logic, however, remains the same: capture, fetch, generate, and send.
Setup time depends on your familiarity with hosting and the required nodes. Expect a few hours for a first-time install: install community nodes, configure credentials, build the form, and test end-to-end. Once configured, ongoing maintenance is minimal, mainly updating templates and sources. Documentation, if available, helps reduce setup time further.
Data protection is the responsibility of the hosting environment. Since the flow runs on self-hosted infrastructure, you control where data is stored and who can access it. Use secured connections for form submission, scraping, AI processing, and email delivery. Implement access control, logging, and data retention policies to meet compliance needs.
Yes. You can add scheduling to run the AI Agent daily or weekly using a cron-like trigger. This enables automated batch emails or seasonal digests. You can also combine user-triggered emails with scheduled campaigns. Ensure you monitor rate limits for scrapers and API usage to avoid interruptions.
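For example, n8n's Schedule Trigger accepts standard cron expressions; the times below are arbitrary illustrations:

```text
0 8 * * *      daily at 08:00  -> daily deal digest
0 9 * * MON    Mondays at 09:00 -> weekly roundup
```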
If the source page structure changes, you can adjust the parsing logic in the AI Agent. The system is designed to be modular so you can update selectors and templates without rebuilding the entire flow. Regular maintenance of parsing rules and template mapping keeps the delivery accurate. Consider implementing fallback selectors and validation checks to minimize breakage.