How I Built an AI Blog Pipeline That Researches, Writes, and Publishes Itself
You're reading the proof.
This blog post wasn't drafted in Google Docs, pasted into a CMS, manually formatted, then published. It was researched against live site analytics, fact-checked against canonical business metrics, cross-linked to 69 existing articles, and published directly to this website — all from a single conversation with Claude.
No copy-paste. No switching tabs. No manual formatting.
Here's exactly how the workflow works, why it's different from what everyone else is doing with AI content, and how you could build the same thing for your business.
The Problem With "AI-Assisted" Content
Most people's AI content workflow looks like this: ask ChatGPT or Claude to write a blog post. Copy the output. Paste it into WordPress or their CMS. Manually add internal links. Manually check that stats are accurate. Manually add CTAs. Manually set meta titles, descriptions, tags, categories. Hit publish.
The AI handles maybe 30% of the actual work. The other 70% is still manual — and it's the tedious 70% that makes content creation feel like a chore.
The WordPress MCP ecosystem has started to address this. Tools like FlowHunt, OttoKit, and the official WordPress MCP Adapter let Claude create and publish posts directly. That's useful.
But they're solving the wrong bottleneck.
The hard part of publishing good content isn't clicking "publish." It's making sure the content is actually good — that it references real data, links to existing content, doesn't duplicate something you've already written, cites accurate metrics, and includes the right calls to action for the right audience.
That's the problem I solved.
What My MCP Server Actually Does
I built a custom MCP server on my Next.js site at hellocrossman.com that exposes 33 tools across five categories. When Claude is connected to this server, it doesn't just have the ability to publish posts. It has full visibility into my entire business.
Analytics (12 tools)
Claude can pull my site analytics in real time — pageviews, unique visitors, traffic sources, device breakdown, hourly patterns, engagement metrics, entry and exit pages, and content type performance. It can compare periods, check what's trending, and see which pages are driving the most engagement.
For example, in the last 30 days my site had 117 pageviews from 22 unique visitors, with an average time on page of over 14 seconds. The top-performing content page is my piece on turning service businesses into software. Claude knows this because it just checked.
SEO and Search Console (6 tools)
Claude has direct access to my Google Search Console data. It can see which queries drive impressions, which pages get clicks, where I rank, and — critically — where the gaps are. It identifies high-impression/low-CTR queries that need better titles, almost-top-3 queries that are close to prime positions, and quick wins where small improvements could drive meaningful traffic.
Before writing this post, Claude ran a keyword gap analysis and confirmed that no one is ranking for the intersection of "MCP server" and "blog publishing pipeline" with a custom implementation. The WordPress MCP content is everywhere. Custom MCP-powered content workflows? That's a gap.
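The query-level analysis described above is straightforward to express in code. This is a minimal sketch, not the live implementation: the row shape, thresholds, and function names are assumptions, and the real Search Console payload is richer than this.

```typescript
// Hypothetical shape of a Search Console row after aggregation.
interface QueryRow {
  query: string;
  impressions: number;
  clicks: number;
  position: number; // average ranking position
}

// Flag high-impression queries whose CTR lags badly — candidates
// for a better title or meta description.
function findTitleOpportunities(
  rows: QueryRow[],
  minImpressions = 100,
  maxCtr = 0.01
): QueryRow[] {
  return rows.filter(
    (r) => r.impressions >= minImpressions && r.clicks / r.impressions < maxCtr
  );
}

// "Almost top 3": queries ranking just outside the prime positions.
function findAlmostTop3(rows: QueryRow[]): QueryRow[] {
  return rows.filter((r) => r.position > 3 && r.position <= 6);
}
```

The thresholds are the interesting design choice: they encode editorial judgment (what counts as a "quick win") as code, which is exactly what makes the analysis repeatable before every post.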
Content Management (7 tools)
This is where it gets interesting. Claude can list all 69 published blog posts, search across all content types (blogs, case studies, tools), read any individual post's full content, and — as of today — create and update posts with every field populated.
That means before writing a single word of this post, Claude searched my existing content library to confirm I hadn't already covered this topic. It found four MCP-related posts — all focused on explaining MCP to service business owners and the market opportunity. Nothing on building a content publishing pipeline. Green light.
Case Studies and Canonical Stats (4 tools)
Every metric I cite in content needs to be accurate. Not "roughly 500 signups" — exactly 550+ signups in 48 hours for RiskPod. Not "we've built lots of products" — exactly 100 builds across 18 years.
Claude fetches these from a canonical stats endpoint that serves as the single source of truth. The same endpoint powers the stats on my homepage, my case study pages, and now my blog content. One source, zero drift.
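The "one source, zero drift" idea is simple enough to sketch. The field names below are hypothetical (the values come from the numbers cited in this post); in the real app this object would sit behind an API route that the homepage, case study pages, and MCP tool all read from.

```typescript
// Hypothetical canonical stats object — the single source of truth.
// Every consumer (homepage, case studies, the MCP stats tool) reads
// from here, so a metric can never drift between pages.
const CANONICAL_STATS = {
  buildsShipped: 100,
  yearsExperience: 18,
  riskpodSignups48h: 550,
  publishedPosts: 69,
} as const;

function getCanonicalStat(key: keyof typeof CANONICAL_STATS): number {
  return CANONICAL_STATS[key];
}
```

The `keyof typeof` constraint means a typo in a stat name is a compile-time error rather than a wrong number in a published post.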
Lead Tracking (3 tools)
Claude can see visitor engagement scores, lead activity, and full visitor journeys across sessions. This isn't used for content creation directly, but it informs content strategy — understanding which pages warm leads visit most helps me prioritise what to write next.
The Workflow: How This Post Was Made
Here's the actual sequence that produced the post you're reading.
Step 1: Research. Claude searched the web for existing content on MCP blog publishing workflows. Found plenty of WordPress MCP content (FlowHunt, OttoKit, Ghost CMS integrations). Nothing on custom MCP servers that combine analytics, SEO, search console, and content management in a single interface. Confirmed the angle is unique.
Step 2: Duplicate check. Claude searched my content library using the MCP search tool. Zero results for "blog workflow automation" or "MCP Claude content publishing." No overlap with existing posts.
Step 3: Data gathering. Claude pulled canonical stats (100 builds, 18 years experience, RiskPod's 550+ signups, FounderOS hitting £8K MRR month one), listed all 69 blog posts for cross-linking opportunities, retrieved all 5 case studies with metrics, and checked the 4 free tools for potential CTAs.
Step 4: SEO validation. Claude ran a GSC keyword gap analysis and confirmed the target keywords aren't cannibalising existing content. It checked site analytics to understand current traffic patterns and top-performing content.
Step 5: Writing. Claude wrote this post with real data woven throughout — not placeholder metrics, not approximations, but live numbers pulled from the MCP API minutes before writing.
Step 6: Publishing. Claude called the create_blog_post MCP tool with every field populated: title, slug, content, subtitle, category, tags, meta title, meta description, read time, key takeaways, CTA blocks, case study callouts, related resources, external sources, and related free tools. One API call. Published.
No copy-paste. No tab switching. No manual formatting.
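The six steps above can be sketched as an ordered plan of tool calls. Only create_blog_post is a tool name confirmed in this post; the other names are assumptions standing in for the real ones, and in practice Claude chooses the sequence itself rather than running a fixed script.

```typescript
// Hypothetical tool-call plan mirroring the workflow steps above.
type ToolCall = { tool: string; args: Record<string, unknown> };

function planPublishingRun(topic: string): ToolCall[] {
  return [
    { tool: "search_content", args: { query: topic } },        // duplicate check
    { tool: "get_canonical_stats", args: {} },                 // exact metrics
    { tool: "list_blog_posts", args: {} },                     // cross-link targets
    { tool: "gsc_keyword_gap_analysis", args: { topic } },     // SEO validation
    { tool: "create_blog_post", args: { topic, status: "published" } }, // one call
  ];
}
```

The point of writing it out: publishing is the last call in the plan, not the whole plan. Everything before it is what makes the published post trustworthy.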
Why This Is Different From WordPress MCP
The WordPress MCP integrations are useful, but they're essentially "create post" buttons with AI attached. They give Claude the ability to publish, but not the ability to make informed publishing decisions.
My setup gives Claude the full picture:
Before writing: What have I already published? What's performing well? What keywords am I ranking for? Where are the gaps? What case studies and metrics can I reference?
During writing: What's the exact RiskPod signup number? What's my actual years of experience? Which existing posts should I link to? What tools should I promote?
At publish time: What CTAs should I include? Which case studies should I call out? Which related posts should I link? What external sources should I cite?
It's the difference between an AI that can type into a box and an AI that understands your business. WordPress MCP gives you the former. A custom MCP server gives you the latter.
The Technical Architecture
The MCP server runs on the same Next.js application as the main website. It's not a separate service — it's an additional API layer that exposes the site's own data and functionality through the MCP protocol.
Each tool is defined with a name, description, and JSON schema for its parameters. When Claude connects to the server (via the MCP integration in Claude.ai), it can see all 33 tools and understands what each one does, what parameters it accepts, and what it returns.
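A tool definition can be shown standalone. This sketch uses a plain interface rather than the MCP SDK, and the example tool (get_pageviews, its fields, its description) is illustrative, not the live implementation — but the shape matches what the protocol expects: a name, a description, and a JSON Schema for parameters.

```typescript
// Minimal shape of an MCP tool definition.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string; description: string }>;
    required?: string[];
  };
}

// Illustrative analytics tool — names and fields are assumptions.
const getPageviews: ToolDefinition = {
  name: "get_pageviews",
  description: "Return pageviews and unique visitors for a date range.",
  inputSchema: {
    type: "object",
    properties: {
      days: { type: "number", description: "Look-back window in days, e.g. 30" },
    },
    required: ["days"],
  },
};
```

The description fields do double duty: they're documentation for humans and the interface contract Claude reads to decide when and how to call the tool.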
The tools hit the same database and services as the admin UI. When Claude creates a blog post via MCP, it goes through the same validation, the same database writes, and appears on the same pages as a post created through the admin interface. There's no separate "AI content" pipeline — it's the same pipeline, with a different entry point.
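The "same pipeline, different entry point" idea looks like this in miniature. Everything here is a sketch with hypothetical names; the real app writes to PostgreSQL via Drizzle rather than an in-memory array.

```typescript
// One service function, two entry points: the admin route and the
// MCP tool handler both call createPost(), so validation and writes
// are identical regardless of who created the post.
interface PostInput {
  title: string;
  slug: string;
  content: string;
}

const db: PostInput[] = []; // stand-in for the real database layer

function validatePost(input: PostInput): void {
  if (!input.title || !input.slug) throw new Error("title and slug are required");
}

function createPost(input: PostInput): PostInput {
  validatePost(input); // same validation for every caller
  db.push(input);      // same write path for every caller
  return input;
}

// Entry point 1: admin UI handler (sketch)
const adminCreate = (input: PostInput) => createPost(input);
// Entry point 2: MCP tool handler (sketch)
const mcpCreate = (input: PostInput) => createPost(input);
```

Because there's one write path, an invalid post is rejected identically whether a human or Claude submits it — there's no weaker "AI lane" around the validation.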
For the technically curious, the stack is React, TypeScript, Node.js, PostgreSQL, and Drizzle ORM — the same stack I use for client builds.
What This Means For Service Businesses
If you're a service business owner reading this and thinking "that's cool but I'm not a developer" — the point isn't the code. The point is the pattern.
I've written extensively about how MCP servers turn your expertise into infrastructure. How the MCP market is projected at over $10B. How your methodology is the one thing AI can't replicate.
This blog pipeline is me eating my own cooking. My domain expertise — 18 years of product development, 100 builds shipped — is encoded in my MCP server. My canonical stats endpoint ensures I never cite wrong numbers. My content search ensures I never duplicate topics. My analytics integration ensures I write about what's actually working.
Any service business can build this pattern. A recruitment firm could have an MCP server that feeds their contractor database into content creation — writing market reports populated with real placement data. A compliance consultancy could auto-generate regulatory updates cross-referenced against their client base. A training company could publish certification guides that link to their actual course catalogue.
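To make the recruitment example concrete, here's a sketch of the kind of tool such a firm might expose. Every name and number is hypothetical — the point is that the report draws on real placement records rather than generic claims.

```typescript
// Hypothetical placement record from a recruitment firm's database.
interface Placement {
  role: string;
  dayRate: number;
  filledInDays: number;
}

// Summarise real placement data for a market report section.
function marketReportStats(placements: Placement[], role: string) {
  const matched = placements.filter((p) => p.role === role);
  const avg = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
  return {
    role,
    placements: matched.length,
    avgDayRate: avg(matched.map((p) => p.dayRate)),
    avgTimeToFill: avg(matched.map((p) => p.filledInDays)),
  };
}
```

Wired up as an MCP tool, this turns "write a market report" into a request the AI answers with the firm's own numbers.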
The software product you build for your service business isn't just a client-facing tool. It's the foundation for an entire content and marketing infrastructure that runs on your real data.
The "Final 10%" Still Matters
I want to be honest about something. This workflow is powerful, but it's not fully autonomous. I wouldn't want it to be.
Before this post was published, I reviewed the outline. I made sure the angle was right. I confirmed the structure served the reader. I'll review the published version and make tweaks if the voice drifts.
This is the Final 10% in action. AI handles 90% of the research, data gathering, writing, formatting, cross-linking, and publishing mechanics. The human handles the strategic decisions: what to write, who it's for, what the angle should be, and whether the final product actually serves the reader.
That's the part that makes content worth reading. And it's the part that no amount of automation should replace.
Build Your Own Pipeline
If you're interested in building a similar system for your business — whether it's a content pipeline, a client-facing MCP server, or a full software product from your methodology — that's exactly what I do.
I've built 100+ products in 18 years. I've shipped MCP integrations, content platforms, compliance marketplaces, and analytics dashboards. The build you're looking at right now — this site, this MCP server, this publishing pipeline — is a working example of what's possible when you combine domain expertise with modern tooling.
The Discovery Sprint is the best starting point. One week, four deliverables, one clear decision on what to build.
Or if you already know what you want, book a call and let's talk about it.
Frequently Asked Questions
Can I build an MCP server without being a developer?
Not directly — you'll need technical help to build the server itself. But the specification and design are the valuable part, and they come from your domain expertise. A BuildKits spec combined with a 30-day build can have your MCP server live in a month.
How is this different from using the WordPress MCP server?
WordPress MCP servers give AI the ability to create and publish posts. My custom MCP gives AI full visibility into analytics, SEO, search console, content library, case studies, canonical metrics, and lead tracking — 33 tools vs a handful. The AI doesn't just publish; it publishes informed content.
Does this replace human editorial judgment?
No, and it shouldn't. The workflow handles the 90% that's mechanical — research, data gathering, cross-linking, formatting, publishing. The 10% that matters — deciding what to write, who it's for, and whether the output is actually good — stays human.
What does this cost to build?
The MCP server was built as part of the overall site infrastructure. As an isolated project, a content MCP with analytics integration, CMS, and search console would be a typical 30-day build in the £15K–£45K range depending on complexity.
Can I see the MCP tools in action?
You just did. Every metric cited in this post was pulled from the MCP API during this conversation. The cross-links reference real posts found via content search. The CTAs were added programmatically. This post is the demo.