
TL;DR:
- Build a working RSS to WordPress pipeline with Make.com + AI (Claude + ChatGPT), powered by Google News—no coding needed.
- The flow scrapes news articles, rewrites them with AI, generates images, and publishes on autopilot every 180 minutes.
- Use a Google Sheet as a quick de-duplication database; upgrade to Make Data Stores later if you scale.
- Favor a janky MVP that ships over waiting for a perfect system—momentum beats polish.
I built an auto-blogging system that publishes AI-written news articles to my WordPress site while I sleep. It’s not pretty, I forgot what half the modules do, and I even fixed a bug on camera—but it just works and has been shipping posts for weeks.
If you’ve been wanting to set up your own automated news blog with Make.com and AI but keep stalling for perfection, take this as your permission to start ugly and improve later.
What This Thing Actually Looks Like in the Wild
On my site there’s a niche news section (skateboarding) constantly updating with short, punchy posts—featured image plus a few tight paragraphs—and they look pretty solid without heavy manual edits.

These aren’t long-form essays; they’re concise updates that keep the site fresh for users and Google, and I don’t have to write them three times a day.
“It’s very low optimized—but it just works.”
MVP mindset
Step 1: Getting Your Google News RSS Feed
The flow starts with an RSS feed: search your topic on Google News, grab that URL, and turn it into a clean RSS you can drop into Make.com.
I used RSS.app at first—paste the Google News URL, wait a minute, get an RSS link—slow for some reason, but it does the job for an MVP.
Google News has a free native RSS: https://news.google.com/rss/search?q=YOUR+KEYWORD —use it to skip third-party dependencies entirely.
In Make.com’s RSS module, I set the feed to return one item per run so every execution processes a single fresh article.
I schedule the scenario every 180 minutes—a blunt anti-duplication guard that’s good enough for MVPs until you add proper de-dupe logic.
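The whole step is just two Make modules, but if you want to sanity-check a feed before wiring it up, a minimal Python sketch with the feedparser package does the same thing; the keyword below is a placeholder, not part of my setup.

```python
# Minimal sketch: pull the newest item from a Google News RSS feed.
# Requires the feedparser package; "skateboarding" is a placeholder keyword.
import feedparser

FEED_URL = "https://news.google.com/rss/search?q=skateboarding"

feed = feedparser.parse(FEED_URL)
if feed.entries:
    latest = feed.entries[0]   # mirrors the "one item per run" setting
    print(latest.title)
    print(latest.link)         # the source URL used later for de-duplication
```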
Step 2: The Google Sheets De-Duplication Hack
After fetching an item, the flow logs it in Google Sheets, which I use as a simple de-dupe database and audit trail.
A filter checks if the URL already exists in column A; if it does, stop; if not, continue—a straightforward lookup that prevents reposts.
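In Make this is just a Sheets lookup plus a filter, but the logic is easy to picture in code; here's a rough sketch with the gspread library, where the sheet name and credential file are placeholders.

```python
# Rough sketch of the de-dupe check, assuming the gspread package and a
# service-account credential file; sheet name and path are placeholders.
import gspread

gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("Auto-Blog Log").sheet1

def is_duplicate(url: str) -> bool:
    """True if the article URL is already logged in column A."""
    return url in sheet.col_values(1)

def log_article(url: str, title: str) -> None:
    """Append a new row so future runs skip this URL."""
    sheet.append_row([url, title])
```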
Make’s Data Stores are faster and cheaper than Sheets for de-duplication at scale—consider switching once volume grows.
Yes, there’s an Array Aggregator in my build I barely remember; I left it in because the pipeline still works and shipping beats refactoring.
Step 3: Scraping the Full Article Content
Once an item passes de-dupe, a web scraper pulls the readable article text from the source URL for AI processing.
I use a custom scraper from Hassan (“Automate with Hassan” on YouTube)—installable in Make with an API key—and it consistently extracts content.
The scraped body is stored alongside the source URL in Sheets so I can audit inputs vs. outputs and trace any oddities.
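Hassan's scraper is a drop-in Make app, so there's nothing to code, but if you want to see what this step boils down to, here's a crude stand-in (not his implementation) using requests and BeautifulSoup.

```python
# Crude stand-in for the scraping step: fetch the page and keep the readable
# paragraph text. Real news sites vary, so a dedicated extractor will do better.
import requests
from bs4 import BeautifulSoup

def scrape_article_text(url: str) -> str:
    html = requests.get(url, timeout=30,
                        headers={"User-Agent": "Mozilla/5.0"}).text
    soup = BeautifulSoup(html, "html.parser")
    paragraphs = [p.get_text(strip=True) for p in soup.find_all("p")]
    return "\n\n".join(p for p in paragraphs if len(p) > 40)  # skip nav/footer crumbs
```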
Step 4: AI Content Generation with Anthropic Claude and ChatGPT
Claude handles the heavy rewrite because I find it higher-quality for news style and it reliably follows nuanced instructions.
I chain multiple AI steps instead of one giant prompt; this prompt chaining approach improves accuracy, tone, and structure.
Then I offload titles, slugs, and excerpts to GPT-3.5 to save money on easy tasks while keeping Claude for the creative heavy lifting.
| Task | AI Model Used | Why |
|---|---|---|
| Article body (rewrite + tone) | Anthropic Claude | Better content quality; follows complex instructions |
| Post title | ChatGPT (GPT-3.5) | Cheaper for short outputs |
| URL slug | ChatGPT (GPT-3.5) | Fast and inexpensive |
| Post excerpt | ChatGPT (GPT-3.5) | Lightweight text generation |
| Featured image | DALL-E | Unique image per article |
For context, GPT-3.5-turbo token rates are very low, so titles and slugs cost pennies per month even at steady volume.
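If you're curious what the chain looks like outside Make's visual modules, here's a hedged sketch with the official Anthropic and OpenAI Python SDKs; the model names and prompts are placeholders, not the exact ones in my scenario.

```python
# Sketch of the two-model split: Claude rewrites the body, GPT-3.5 handles
# the cheap short outputs. Model names and prompts are placeholders.
import anthropic
from openai import OpenAI

claude = anthropic.Anthropic()   # reads ANTHROPIC_API_KEY from the environment
openai_client = OpenAI()         # reads OPENAI_API_KEY from the environment

def rewrite_body(scraped_text: str) -> str:
    msg = claude.messages.create(
        model="claude-3-5-sonnet-latest",   # placeholder model name
        max_tokens=1024,
        messages=[{
            "role": "user",
            "content": f"Rewrite this as a short, punchy news post:\n\n{scraped_text}",
        }],
    )
    return msg.content[0].text

def short_output(instruction: str, body: str) -> str:
    resp = openai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"{instruction}\n\n{body}"}],
    )
    return resp.choices[0].message.content

body = rewrite_body("...scraped article text...")
title = short_output("Write a concise post title for this article.", body)
slug = short_output("Write a lowercase-hyphenated URL slug for this article.", body)
excerpt = short_output("Write a one-sentence excerpt for this article.", body)
```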
Step 5: Publishing to WordPress (Plus a Live Bug Fix)
After AI completes, I have variables for body, title, slug, excerpt, and a DALL-E image that I upload as media before creating the post.
Several Make module outputs share the generic name "result", so double-check which module each WordPress field is mapped from to avoid mix-ups.
Post creation is followed by a “Get Post” step, then I update Sheets with the editor URL and live link; during filming I fixed a field that was pulling the wrong variable and saved the scenario.
The last step emails me a confirmation so I know a new article shipped without opening Make.
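Make's WordPress modules hide the plumbing, but under the hood this step is the standard WordPress REST API: upload the image to the media endpoint, then create the post with the returned media ID. Here's a sketch with a placeholder site URL and credentials.

```python
# Sketch of the publish step via the WordPress REST API, using an
# application password. Site URL, credentials, and filename are placeholders.
import requests

SITE = "https://example.com"
AUTH = ("bot-user", "application-password-here")   # placeholder credentials

def upload_featured_image(image_bytes: bytes, filename: str) -> int:
    """Upload the DALL-E image to the media library and return its ID."""
    resp = requests.post(
        f"{SITE}/wp-json/wp/v2/media",
        auth=AUTH,
        headers={"Content-Disposition": f'attachment; filename="{filename}"',
                 "Content-Type": "image/png"},
        data=image_bytes,
    )
    resp.raise_for_status()
    return resp.json()["id"]

def create_post(title: str, slug: str, excerpt: str, body: str, media_id: int) -> str:
    """Create a published post and return its public link."""
    resp = requests.post(
        f"{SITE}/wp-json/wp/v2/posts",
        auth=AUTH,
        json={
            "title": title,
            "slug": slug,
            "excerpt": excerpt,
            "content": body,
            "featured_media": media_id,
            "status": "publish",
        },
    )
    resp.raise_for_status()
    return resp.json()["link"]
```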
The Honest Truth About This System’s Flaws
This MVP is under-optimized: loose filters, a dead-end router, uncertain Markdown parsing, and a coarse 180-minute duplicate guard.
AI can hallucinate facts; I keep source text in Sheets to compare outputs because even good models can be wrong 3–5% of the time on certain tasks.
Do not publish without editorial review; your brand owns the errors, not the AI.
Frequently Asked Questions
How much does this cost to run?
Expect the Make Core plan plus light API usage; many setups run for $15–$30/mo depending on volume and whether you use RSS.app.
Can I use this for feeds other than news?
Absolutely. It works for any reliable feed: product launches, job posts, competitor updates; just tune your prompts to the content type.
What happens when a module fails?
Use Make's error handlers, retries, and routes; in an MVP, expect failures and add notification emails to catch them early.
Do I need to know how to code?
Nope, it's all visual in Make; if you can write prompts and configure modules, you can ship this without code.
Will Google penalize AI-written posts?
Google evaluates content quality, not authorship; useful, original summaries with citations and review are fine.
Final Thoughts
I’m not a Make pro—I just shipped an MVP that publishes real posts and improves a bit each week.
Start with something imperfect, learn from real runs, and then iterate; momentum and feedback are your best optimizers.