
TL;DR
- You can automate RSS → AI → WordPress in Make.com with drag-and-drop modules. It doesn’t have to be pretty to publish reliably.
- Google News search pages can become a custom RSS pipeline with RSS.app, which you then watch inside Make.com on a schedule.
- Use Anthropic for natural-sounding news, then use ChatGPT for titles, slugs, and excerpts to control API spend without tanking quality.
- The system filters duplicates, generates images, publishes to WordPress, and sends you an email when it’s done.
I built an automated AI content manager with Make.com that pulls news, rewrites it with AI, generates images, and posts everything to WordPress while I’m doing literally anything else. 🤙 And I’m gonna be real with you, this thing is not optimized. It’s held together with duct tape and good intentions. But it works, and that’s the whole point. 👊
If you’ve been stuck in tutorial hell waiting to build the “perfect” automation before you ship anything, consider this your permission slip to just build the ugly version. I did. It publishes real articles to my real blog. Every 180 minutes, like clockwork. So let me walk you through exactly how.
Learn how to automate your blog’s news publishing process
What This System Actually Does (And What It Looks Like)
On my skateboarding blog, I have a news section, and fresh skateboarding articles land there constantly. Stuff from today that I didn’t manually write or curate.
I do give them a quick edit, but you don’t necessarily have to, because they come out looking pretty solid: an image plus a short news article.

Now notice they’re very short. That’s on purpose. These are supposed to be just news articles, not blog posts. Those are totally different. A news piece gives you the what-happened and gets out of the way.
A blog post is where you go deep and share opinions. This system handles the news side so I can spend my energy on the stuff that actually needs my brain.
The whole thing runs on Make.com, which is a visual automation platform where you connect modules together like Lego blocks. No code. You just drag, drop, and connect things, and it runs on a schedule.
Mine fires every 180 minutes, grabs one new article from an RSS feed, runs it through AI, and publishes it to WordPress. That’s the whole concept.
Step 1: Setting Up Your RSS Feed Pipeline
This is where your content actually comes from, so don’t skip this part.
First, go to Google News and search for your topic. In my case, skateboarding. Once you’ve got your search results page, grab that URL.
Then head over to RSS.app and create an account. When it asks what feed you want to make, you paste in that Google News URL.
It’ll take a minute to generate your feed. I don’t know why it takes so long. But it does. Eventually it spits out a custom RSS feed URL, and that’s what you’ll plug into Make.com.
Back in Make.com, your first module is an RSS feed watcher. Paste that URL from RSS.app into the module, and set it to return one item per run. Why only one?
Because every time the automation fires, I want it to process a single article. This keeps things clean and avoids the system trying to chew through ten articles at once and making a mess of everything.
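Make.com’s RSS watcher handles this step visually, but the logic is simple enough to sketch in plain Python with the standard library. The feed XML below is a made-up stand-in for whatever RSS.app generates for your Google News search; the point is the “one item per run” behavior.

```python
# Equivalent logic to Make.com's RSS watcher set to return 1 item per run.
# SAMPLE_FEED is a hypothetical stand-in for your RSS.app feed.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Skateboarding News</title>
  <item><title>New skatepark opens downtown</title>
        <link>https://example.com/skatepark</link></item>
  <item><title>Contest results are in</title>
        <link>https://example.com/contest</link></item>
</channel></rss>"""

def newest_item(feed_xml: str) -> dict:
    """Return only the first (newest) item, mirroring 'one item per run'."""
    root = ET.fromstring(feed_xml)
    item = root.find("./channel/item")  # first <item> only
    return {"title": item.findtext("title"), "link": item.findtext("link")}

print(newest_item(SAMPLE_FEED))
```

Processing one item per run is the same design choice described above: it keeps each scenario execution small and predictable.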
RSS.app offers both an automatic RSS Generator and a manual RSS Builder. The builder lets you hand-pick feed items, which is useful if Google News formatting gets weird or a source has a non-standard layout.
I have the whole scenario set to run every 180 minutes. That interval is mostly a crude safety margin: spacing the runs out makes it less likely the system grabs a story it just processed before the duplicate check has logged it. Is there a smarter way to handle this?
Absolutely. Am I going to fix it right now? No. It works.
Step 2: The Duplicate Filter (AKA My Janky Spreadsheet Check)
After the RSS module grabs an article, the next step is making sure we haven’t already processed it. My method? A Google Sheets spreadsheet.
The automation checks: if the URL of the feed is already in column A, then do not add it. If it’s not there, then continue. That’s it. That’s the whole filter.
It’s just searching the spreadsheet to see if this article’s URL already exists. If it finds a match, the whole thing stops. If it doesn’t, we move forward.
Again, not very optimized. There are better ways to do this.
But a spreadsheet-based duplicate check is a perfectly valid approach for an MVP. More advanced setups might use database lookups or Make.com’s built-in data stores.
I’m using a spreadsheet because I can see what’s happening, sort it, and check on things manually when something goes sideways. Which it does, because this is automation and things always go sideways eventually.
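In Make.com this duplicate check is a Search Rows module against the spreadsheet, but the underlying logic is just "is this URL already in column A?" Here is that logic sketched in Python, with an in-memory CSV standing in for the real Google Sheet:

```python
# The spreadsheet duplicate filter, sketched as plain Python.
# SHEET_CSV is a stand-in for the real Google Sheet (column A = url).
import csv, io

SHEET_CSV = "url,scraped_text\nhttps://example.com/old-article,some text\n"

def is_duplicate(url: str, sheet_csv: str) -> bool:
    """True if the article URL already exists in column A."""
    seen = {row["url"] for row in csv.DictReader(io.StringIO(sheet_csv))}
    return url in seen

print(is_duplicate("https://example.com/old-article", SHEET_CSV))  # True
print(is_duplicate("https://example.com/new-article", SHEET_CSV))  # False
```

If it returns True, the scenario stops; if False, the article moves on to scraping. That is the whole filter.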
Step 3: Scraping the Actual Content
So now we know the article is new and we haven’t processed it before. Next up is grabbing the actual content from that URL.
I use a custom web content scraper module for Make.com. It’s not a built-in module; you have to download and install it yourself. It comes from Hassan, a YouTuber who created it.
You install it into your Make.com account, plug in the API key, and it scrapes the article content from whatever URL the RSS feed gave you.
Once scraped, the content gets added as a new row in the spreadsheet. So to be clear about the order here: the first spreadsheet module searched for duplicates. Nothing was duplicated, so we continued.
The scraper grabbed the content. Now we add a new row with the URL and the scraped text. This way everything is logged and I have a record of what went in.
The custom scraper module isn’t a default Make.com feature; it’s a third-party community module. Only install modules you trust, and double-check the API key setup before you run your first scenario.
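This is not Hassan’s module, just the general idea of “scrape the article text,” sketched with the standard library so you can see what this step produces. Real scrapers do far more (JavaScript rendering, boilerplate removal, paywall handling), so treat this as a toy illustration only:

```python
# A minimal "strip the HTML, keep the text" sketch using only the stdlib.
# Real scraper modules are far more sophisticated; this just shows the idea.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # Collect visible text fragments, skipping whitespace-only runs.
        if data.strip():
            self.chunks.append(data.strip())

def extract_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

print(extract_text("<h1>Contest Results</h1><p>Skater X took first place.</p>"))
# Contest Results Skater X took first place.
```

The string this returns is what gets logged into the new spreadsheet row alongside the URL.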
Step 4: The AI Content Chain (Anthropic, Then More Anthropic)
Here’s where it gets fun. And where I have strong opinions.
I use Anthropic, not ChatGPT, because it creates better content for me. Full stop. For the main content writing, Anthropic just produces more natural-sounding articles. My prompt isn’t the longest in the world, and it’s not the shortest. It spells out my tone of voice and exactly what I want from a news article specifically, not a blog post.
The way this works in Make.com is through variables. The scraper creates a variable with all the content, and then the Anthropic module references that variable.
So it grabs content from one module and passes it to the next. How does the next module know what it’s receiving? Because it’s mapped as a variable.
Now here’s the part most tutorials won’t tell you: one AI pass isn’t enough.
Because there’s no good way right now to hold a back-and-forth conversation with a model inside Make.com for these kinds of systems, you just stack a bunch of AI modules on top of each other. So my first Anthropic module creates the initial content.
Then a second Anthropic module takes that output and refines it, adding things, removing things, making sure it reads well. Each module creates a new variable that the next one can reference.
And ideally, you’d add three or four more of these refinement passes. Each finished module produces a new variable, and the next Anthropic module picks it up with a prompt along the lines of “now, with all this polished content, do more.” That’s how you stack quality.
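The chaining pattern is worth making concrete. Each Make.com AI module takes the previous module’s output variable as its input, so the whole thing behaves like a fold over a list of passes. The toy passes below are stand-ins for real API calls; in practice each one would be an Anthropic module with its own prompt:

```python
# The "stacked modules" pattern: each pass's output becomes the next pass's
# input variable. The lambda passes here are toy stand-ins for AI calls.
def run_chain(passes, text):
    for ai_pass in passes:    # each pass = one Make.com AI module
        text = ai_pass(text)  # its output variable feeds the next module
    return text

draft_pass  = lambda source: f"DRAFT: {source}"          # write initial draft
refine_pass = lambda draft: draft.replace("DRAFT", "FINAL")  # polish it

article = run_chain([draft_pass, refine_pass], "scraped news text")
print(article)  # FINAL: scraped news text
```

Adding a third or fourth refinement pass is just appending another function to the list, which mirrors dropping another Anthropic module onto the canvas.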
Anthropic vs. ChatGPT (How I Split the Work)
Anthropic (Claude)
- Best for main content writing and longer news-style rewrites
- Stronger at maintaining a natural tone across paragraphs
- Worth the higher cost when output quality matters most
ChatGPT (OpenAI)
- Best for titles, slugs, excerpts, and short structured tasks
- Cheaper per call, so it’s good for high-frequency metadata steps
- I use it where I don’t need Claude’s writing horsepower
Step 5: Titles, Slugs, Excerpts, and Why I Use ChatGPT for the Small Stuff
After the content is finalized through the Anthropic chain, I switch to ChatGPT for the metadata: post title, URL slug, excerpt, all generated by separate ChatGPT modules.
Why the switch? ChatGPT is a little bit less expensive, so I use ChatGPT for the title and the smaller stuff. API calls add up fast, and for something like generating a 10-word title or a URL slug, you don’t need Anthropic’s horsepower. ChatGPT handles these smaller tasks just fine at a fraction of the cost.
Each of these modules creates its own variable. Pay attention: they’re all creating different variables, and later modules reference them by name. So when you see result in Make.com, it matters which module’s result you’re pulling.
Results are module-specific. Result 20 and result 2 are completely different responses from completely different modules. Getting these mixed up is exactly how you end up with your blog post title as your URL slug and your excerpt as your title. Ask me how I know.
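Even though ChatGPT generates my slugs, it helps to know what a well-formed slug looks like so you can sanity-check the output. This hypothetical helper derives one deterministically: lowercase, hyphen-separated, no punctuation.

```python
# A reference slugifier for sanity-checking AI-generated slugs.
# (Hypothetical helper; in the actual pipeline ChatGPT writes the slug.)
import re

def slugify(title: str) -> str:
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())  # non-alphanumerics -> hyphen
    return slug.strip("-")                            # no leading/trailing hyphens

print(slugify("Pro Skater Lands First-Ever 1080!"))
# pro-skater-lands-first-ever-1080
```

If the slug in your ChatGPT variable doesn’t roughly match this shape, that is usually the first sign you mapped the wrong module’s result.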
Step 6: DALL-E Image Generation and WordPress Publishing
For images, it’s a two-step media process: DALL-E generates the image, I upload it to WordPress as a media item, and then I attach that media item to the post I’m creating. Two steps, one featured image.
The WordPress module takes all those variables you’ve been building—the Anthropic content, the ChatGPT title, the slug, the excerpt, the DALL-E image—and assembles them into a complete post. Published. Live. On your blog.
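Under the hood, Make.com’s WordPress modules talk to the WordPress REST API, where this is POST /wp-json/wp/v2/media (which returns a media ID) followed by POST /wp-json/wp/v2/posts with that ID as featured_media. Those endpoints and fields are real; the helper functions below are hypothetical and just show the shape of each request:

```python
# Sketch of the two-step publish against the WordPress REST API.
# Endpoints/fields are real WP REST routes; these helpers are illustrative.
def media_upload_headers(filename: str) -> dict:
    """Headers for step 1: POST /wp-json/wp/v2/media with raw image bytes."""
    return {
        "Content-Disposition": f'attachment; filename="{filename}"',
        "Content-Type": "image/png",
    }

def post_payload(title, slug, excerpt, content, media_id) -> dict:
    """Body for step 2: POST /wp-json/wp/v2/posts. All real WP post fields."""
    return {
        "title": title, "slug": slug, "excerpt": excerpt,
        "content": content, "status": "publish",
        "featured_media": media_id,  # links the uploaded image to the post
    }

payload = post_payload("New Skatepark Opens", "new-skatepark-opens",
                       "A quick news hit.", "<p>Full article text.</p>", 42)
print(payload["featured_media"])  # 42
```

In Make.com you never write these requests yourself, but seeing them explains why the image step has to run before the post step: you need the media ID first.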
After publishing, I use a Get Post module to grab the post ID, which lets me update my spreadsheet with the editor URL and the live post URL. Then I mark the row as complete.
And then, this is my favorite part, it shoots me an email letting me know that a new article has been published with the post title. So I get a little notification, click through, and see my new article sitting there on the blog. That loop feels great.
The email notification is clutch. It takes two minutes to set up and gives you peace of mind that the automation actually ran, plus it’s your trigger to review for hallucinations before too many people see it.
The Honest Truth About Hallucinations and Editing
I’m not going to pretend this system is hands-off. I regularly compare what the AI produced against the source it was given, and sometimes you’ll find that, whoa, it actually had some hallucinations, which is really not good. Accuracy matters for news.
That’s why I keep the original scraped content in my spreadsheet. I can pull up the source material and compare it side-by-side with what the AI produced. Did it invent a skater’s name? Did it fabricate a competition result?
Did it just… make something up? It happens. And when you’re publishing news, accuracy matters way more than it does in a generic blog post. Keep the source text.
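Since the source text lives in the spreadsheet anyway, you could add a crude automated triage on top of the manual review. This is a hypothetical heuristic, not part of my actual workflow: flag any AI output whose character-level overlap with the scraped source is suspiciously low. Low overlap doesn’t prove hallucination, but it tells you which rows to eyeball first.

```python
# Crude triage heuristic (hypothetical, not in the real workflow): flag AI
# output with very low textual overlap against the scraped source.
from difflib import SequenceMatcher

def overlap_ratio(source: str, generated: str) -> float:
    """Rough similarity in [0, 1] between source and generated text."""
    return SequenceMatcher(None, source.lower(), generated.lower()).ratio()

def needs_review(source: str, generated: str, threshold: float = 0.3) -> bool:
    """True when overlap is low enough to warrant a manual look."""
    return overlap_ratio(source, generated) < threshold

print(needs_review("Skater X won the contest",
                   "Skater X won the contest today"))  # False
```

The 0.3 threshold is an arbitrary assumption; tune it against a few of your own articles before trusting it.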
As one automation expert put it pretty bluntly, automation without intention is just noise amplification. If you’re pumping out AI-generated articles with zero oversight, you’re not building a content system.
You’re building a misinformation machine. So edit your stuff. Even a five-minute scan catches the worst offenders. Do a quick review.
What I’d Do Differently (But Haven’t Because It Works)
Look, I know this system has gaps. I don’t have any filters for location or date or anything like that. I’m sure there are settings for that; I just haven’t dug into them yet. It’s a proof of concept, but it does work.
There’s a router in my workflow that isn’t doing anything useful right now. I added it in case I wanted to publish to multiple destinations, but I could technically just delete it altogether.
There’s a markdown-to-HTML parser I’m not sure works properly. There’s a regex filter that I added and honestly, if you don’t know what that is, don’t worry about it; it doesn’t matter. It’s a little messy.
And you know what? All of that is fine. The system publishes articles. The articles look good. The images are there. The news is mostly accurate after a quick review.
That’s an MVP. That’s a minimal viable product that I can improve over time instead of spending three months building something “perfect” that never launches. Ship the ugly version.
“This is not optimized… but it just works.” That MVP mindset beats perfection every time.

Frequently Asked Questions
Can I use this for a topic other than skateboarding?
Absolutely. The topic is determined entirely by your Google News search query and your AI prompts. Swap “skateboarding” for “indie game development” or “sourdough baking” and the whole pipeline works the same way. Your RSS feed is the source.
How much does this cost to run?
It depends on volume, but it’s surprisingly affordable. Make.com’s free tier gives you 1,000 operations per month. RSS.app has free plans too. Your main costs are Anthropic and OpenAI API calls, which for one article every three hours could run roughly $15–30/month depending on article length, the models you choose, and how many refinement passes you add. Check Anthropic and OpenAI for current pricing.
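The back-of-the-envelope math behind that range is simple. The per-article cost below is an assumed figure for illustration, not a quoted price; check current Anthropic and OpenAI pricing for real numbers.

```python
# Rough monthly cost estimate. Per-article cost is an ASSUMED range.
HOURS_BETWEEN_RUNS = 3
articles_per_day = 24 // HOURS_BETWEEN_RUNS   # 8 articles/day
articles_per_month = articles_per_day * 30    # 240 articles/month

for per_article in (0.0625, 0.125):           # assumed $/article range
    print(f"${per_article * articles_per_month:.0f}/month")  # $15 then $30
```

More refinement passes means more API calls per article, so the per-article figure scales roughly linearly with how many Anthropic modules you stack.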
What happens if an article is paywalled?
The scraper will grab whatever content it can access. If the article is fully paywalled, you’ll likely get a fragment or nothing useful.
Your AI module will then try to write an article from garbage input, and the result will be garbage output. Paywalls break the pipeline, which is why the email notification and manual review step matter.
Do I need to know how to code?
No. Zero coding. Make.com is entirely visual: you’re connecting modules and mapping variables by clicking, not writing code. The closest thing to “code” in my whole setup is a regex expression that I’m not even sure works. You can stay no-code.
Will Google penalize AI-generated content?
Google’s stance focuses on quality and helpfulness, not whether a human or AI wrote it. If your AI content is helpful, accurate, and adds value for readers, you’re fine.
If you’re pumping out 50 low-quality articles a day with no review, that’s a different story. Quality beats the label.
Final Thoughts
This whole workflow is proof that you don’t need to be an automation expert to build an automated AI content manager with Make.com. I built this thing while learning Make.com. I fixed a bug on camera during the walkthrough.
There are modules in there I don’t fully remember the purpose of. And it still publishes solid, image-rich news articles to my WordPress blog every three hours without me touching it. It runs without me.
So stop waiting for the perfect setup. Build the ugly version. Watch it break. Fix the thing that broke. Let it run. I don’t have too many answers though. I’m not a pro at Make.
It’s just me trying to figure this out as we go along. And honestly? That’s the best mindset for actually shipping something. Get your MVP live, improve it over time, and let the robot do the boring work while you focus on the content that actually needs you. That’s the whole game.
Sources and References
- Make.com RSS Integration Documentation
- RSS.app Guide: How to Use RSS Feeds with Make.com
- RSS.app Advanced RSS Builder Guide
- How to Build a No-Code AI Newsletter Curator Using Make.com and RSS Feeds
- Building an RSS Feed News Aggregator with AI
- Google Search Essentials: Spam Policies
- Anthropic Claude Model Pricing
- OpenAI API Pricing