
We automated our client's news page to boost their SEO, then built a SaaS around it. Here's how we did it.
Hi, this is Denis from Snoika, and today I want to share our smart approach to improving a client’s SEO and GEO metrics without drawing the ire of search engine bots due to overly frequent or keyword-stuffed posts.
What this workflow is for
I built this workflow as part of a content strategy to boost SEO for one of our clients. Search engines reward websites that publish fresh, high-quality content consistently, but producing that content manually every day is time-consuming and expensive.
This automation solves that problem by acting as an AI-powered newsroom. It monitors financial RSS feeds around the clock, selects the most impactful stories, rewrites them as original articles in a professional editorial voice while adding new insights and commentary, and delivers them to you for direct publishing. Running three times a day, it keeps your website fed with timely market content without any manual effort.
How it works
- Trigger: The workflow fires automatically at 08:00, 13:00, and 18:00 every day. You can adjust these times to match your publishing schedule or timezone.
- RSS: Four RSS feeds are read simultaneously - Forex News, Stock Market, Economics, and Financial Advisor. These cover a broad range of financial topics so the AI always has diverse stories to choose from. You can swap these for any RSS feeds relevant to your niche.
- Merge: All four feeds are merged into a single stream of articles before filtering begins.
- Filter: Only articles published in the last 4 hours are kept. This ensures the workflow only works with fresh, timely content and never resurfaces old stories.
- Duplicates: Then, articles with identical titles are removed. This handles cases where the same story appears across multiple feeds.
- Memory Reader: Before picking articles, the workflow checks its memory of topics already covered today. This prevents the AI from selecting stories too similar to what was already published earlier in the day.
- News Picker: An AI agent reviews all remaining articles and selects the 2 best candidates based on market impact, presence of specific data points, and topic diversity. Promotional content, press releases, and video-only articles are filtered out automatically.
- Memory Recorder: The titles and article snippets of the selected articles are saved to workflow memory so future runs within the same day know what topics to avoid.
- Split Out: The 2 selected articles are split into individual items and processed one by one through a loop.
- Extract News: Jina AI scrapes the full article content from the original URL, giving the rewriter the complete text rather than just the RSS snippet.
- News Reporter: An AI agent rewrites the scraped article as a 400-600 word financial news piece. It adds market context, keeps a strong editorial voice, structures the article with a hook and an outlook, and adds new thoughts and insights to make the article original.
- Publishing: The finished article is published in a Telegram channel. You can change this to a webhook to publish directly on your website.
- Memory Cleaner: After every 3 runs (one full day), the topic memory is cleared automatically so the next day starts fresh.
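To make the Filter and Duplicates steps above concrete, here is a minimal sketch in Python of the same logic: keep only articles published in the last 4 hours, then drop repeated titles. The field names ("title", "published") are assumptions for illustration, not the exact shape of the n8n RSS items.

```python
from datetime import datetime, timedelta, timezone

def filter_fresh_unique(articles, now=None, max_age_hours=4):
    """Keep articles newer than max_age_hours and dedupe identical titles."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=max_age_hours)
    seen_titles = set()
    kept = []
    for item in articles:
        if item["published"] < cutoff:
            continue  # stale: published more than max_age_hours ago
        if item["title"] in seen_titles:
            continue  # same story syndicated by another feed
        seen_titles.add(item["title"])
        kept.append(item)
    return kept
```

Keeping the first occurrence of each title is enough here because the merged stream is processed once per run; the day-level "don't repeat topics" check is handled separately by the workflow memory.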
What you need to get started
- Google Gemini API key
- Anthropic API key (Claude)
- Jina AI API key (FREE)
- Telegram bot token (FREE)
- Your own RSS feed URLs from RSSapp or any RSS source
- A website with a news section to publish the output
- Optionally, a fact-checking step
What your website needs to make this SEO-effective
Publishing AI-written articles is only half the job. To get real SEO value from this content, your website needs the right technical foundation:
- News schema markup: Add NewsArticle structured data (JSON-LD) to every article page. This tells Google the content is news, enables rich results in Google News and Discover, and increases click-through rates significantly. At minimum include: headline, datePublished, dateModified, author, publisher, and image.
- A dedicated news page: Google needs a clear, crawlable archive of your articles. A paginated /news section with proper internal linking between articles helps Google understand your site publishes content regularly.
- Google News sitemap: A separate sitemap-news.xml that lists only articles published in the last 48 hours. This is required for Google News indexing and tells Google's news crawler exactly where to look. Standard XML sitemaps are not enough on their own.
- Fast page load speed: News content is time-sensitive, and Google deprioritizes slow pages in news results. Aim for under 2 seconds load time - use a CDN, compress images, and avoid render-blocking scripts on article pages.
- SEO-friendly URL slugs: Every article page should have a clean, keyword-rich URL slug rather than a generic ID or timestamp. For example, /news/fed-raises-interest-rates-2026 instead of /news/?p=1234. This helps Google understand what the page is about before even reading it, improves click-through rates in search results, and makes internal linking easier as your archive grows.
- Author and E-E-A-T signals: Google's quality guidelines reward Experience, Expertise, Authoritativeness, and Trustworthiness - especially for financial content, which falls under YMYL (Your Money or Your Life). Add a real or clearly defined author profile, an About page explaining your editorial process, and links to your social or professional profiles.
- Open Graph and Twitter Card meta tags: These don't directly affect rankings, but they control how your articles look when shared on social media, which drives traffic back to your site and builds indirect SEO authority through backlinks.
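As a concrete starting point for the schema markup item above, here is a minimal NewsArticle JSON-LD block with the required fields. Every value is a placeholder you would fill from your CMS; the URLs and names are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Fed Raises Interest Rates",
  "datePublished": "2026-01-15T08:00:00Z",
  "dateModified": "2026-01-15T08:00:00Z",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "publisher": {
    "@type": "Organization",
    "name": "Example News",
    "logo": { "@type": "ImageObject", "url": "https://example.com/logo.png" }
  },
  "image": ["https://example.com/news/fed-rates.jpg"]
}
</script>
```

You can validate the output with Google's Rich Results Test before wiring it into your article templates.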
Hope it was useful, wishing y'all good luck!