I Audited an MSP Marketing Agency's Website. They Sell SEO but Their Own Site Has Zero Schema.
By Lesli Rose · April 5, 2026 · 10 min read
This is a B2B marketing agency that specializes in one thing: helping managed service providers grow. MSP SEO, MSP PPC, MSP content marketing, MSP web design. They've published 120+ blog posts, 14 detailed case studies, and a branded four-step methodology they position as proprietary. The niche focus is sharp. The content volume is impressive. The positioning is clear.
Then I looked at the source code.
Zero JSON-LD schema markup. On any page. No Organization schema, no Service schema, no Article schema on 120 blog posts, no FAQPage schema, no BreadcrumbList. Nothing. This is a marketing agency that sells SEO services to other businesses -- and their own website has zero structured data telling Google or AI systems what they do, who they serve, or why they're credible.
These are the gaps that even experienced marketers miss. And they're the same gaps that determine whether your agency gets recommended by ChatGPT, cited in AI Overviews, or surfaced in Perplexity -- or whether you're invisible to the systems your future clients are already using to find vendors.
The Scores
Technical SEO: 40
On-Page SEO: 58
Content: 75
Schema: 5
AI Discoverability: 8
Social SEO: 30
Earned Visibility: 35
Content at 75 reflects the genuine effort this agency has put into blog posts and case studies. But Schema at 5 and AI Discoverability at 8 tell the real story: all that content exists in a format that only humans who already found the site can use. Machines -- the systems that increasingly decide who gets recommended -- can't make sense of any of it.
Finding #1: 120 Blog Posts, Zero Article Schema
This agency has published over 120 blog posts about MSP marketing. Topics like "how to generate leads for your MSP," "SEO strategies for managed service providers," and "why MSPs need content marketing." The content is topically relevant, well-organized, and clearly written for their target buyer.
But none of it has Article schema. No author markup. No datePublished. No dateModified. No headline structured data. Yoast SEO generates the sitemap and basic meta tags, but it doesn't automatically add the JSON-LD Article schema that Google and AI systems use to understand, categorize, and cite blog content.
When someone asks ChatGPT "what are the best MSP marketing agencies?" or "how should I market my managed service provider business?" -- the AI pulls from sources it can structurally identify as authoritative articles. Without Article schema, these 120 posts are just HTML text. They might rank in traditional Google results, but they're nearly invisible to the AI recommendation layer.
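The fix is mechanical. Here's a minimal sketch of the Article schema each post could carry -- the URL, agency name, and dates are placeholders, not values from the actual site:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Generate Leads for Your MSP",
  "author": {
    "@type": "Organization",
    "name": "Example MSP Agency"
  },
  "datePublished": "2025-06-12",
  "dateModified": "2026-01-08",
  "mainEntityOfPage": {
    "@type": "WebPage",
    "@id": "https://example.com/blog/msp-lead-generation"
  }
}
```

Dropped into each post's head inside a `<script type="application/ld+json">` tag, this gives Google and AI crawlers the headline, authorship, and freshness signals that plain HTML doesn't expose.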
Finding #2: 14 Case Studies with No Structured Data
Case studies are the most powerful conversion asset a B2B agency has. This agency has 14 of them -- real client results, real numbers, real outcomes. These are the pages that close deals. A prospect who reads a relevant case study is significantly more likely to book a call.
None of them have Article schema, CaseStudy schema, or any structured data at all. When an MSP owner asks an AI system "show me case studies from MSP marketing agencies," these 14 pages are invisible to that query. The content exists, the results are real, but the structured data layer that AI systems use to find and cite this kind of evidence doesn't know they exist.
The business cost. Every case study without schema is a missed chance to be cited by AI when a prospect is actively evaluating agencies.
The fix. Article schema on each case study with headline, author, datePublished, and a description that includes the client industry and results achieved.
Time to implement. About 2 hours for all 14, if done with a template.
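A templated version for the case studies could look like this -- the client industry, numbers, and dates below are invented placeholders to show the shape, not the agency's real results:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Case Study: 3x Qualified Leads for a Managed IT Provider",
  "author": {
    "@type": "Organization",
    "name": "Example MSP Agency"
  },
  "datePublished": "2025-09-01",
  "description": "How a mid-market managed IT provider tripled qualified leads in six months through MSP-focused SEO and content marketing."
}
```

The description field is the key: it's where the client industry and the result live, which is exactly what an AI system needs to match a case study to a prospect's query.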
Finding #3: A Branded Methodology Nobody Can Find
This agency has a branded four-step methodology they position throughout the site. It's their differentiator -- the thing that's supposed to make a prospect think "this agency has a system, not just freelancers doing random tasks." Smart positioning.
But there's no HowTo schema describing the steps. No structured data of any kind wrapping this methodology. When someone asks an AI system "what should I look for in an MSP marketing agency?" or "which agencies have a proven process for MSP marketing?" -- this branded methodology is invisible. AI systems can't find it, can't parse it, and can't cite it.
A proprietary methodology is only a differentiator if the systems that recommend agencies can find it. Right now, it's a nice graphic on a page that only existing visitors see.
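HowTo schema would fix this. Since the actual step names aren't public in this audit, the four steps below are hypothetical stand-ins -- the structure is what matters:

```json
{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "Example Four-Step MSP Growth Method",
  "step": [
    { "@type": "HowToStep", "position": 1, "name": "Audit", "text": "Assess the MSP's current visibility and pipeline." },
    { "@type": "HowToStep", "position": 2, "name": "Strategy", "text": "Build a channel plan for the MSP's market." },
    { "@type": "HowToStep", "position": 3, "name": "Execute", "text": "Ship SEO, content, and paid campaigns." },
    { "@type": "HowToStep", "position": 4, "name": "Measure", "text": "Report on leads and refine the plan." }
  ]
}
```

With this in place, the branded methodology becomes something an AI system can parse, name, and cite when a prospect asks which agencies have a documented process.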
Finding #4: AI Crawlers Blocked by Default
The robots.txt file has a 10-second crawl delay and zero AI crawler directives. No rules for GPTBot, ClaudeBot, PerplexityBot, or any of the AI systems that are increasingly driving discovery and recommendation. There's no llms.txt file either -- no structured summary of the site for large language models.
The robots.txt also blocks /*/page* -- which prevents crawling of paginated blog archives. With 120+ blog posts, paginated archives are how crawlers discover older content. Blocking pagination means a significant chunk of their content library may never get crawled.
No AI crawler directives. GPTBot, ClaudeBot, and PerplexityBot have no guidance on what to crawl or cite.
10-second crawl delay. Throttles every crawler that honors the directive. Googlebot ignores crawl-delay, but Bing and many AI crawlers respect it, so this slows the bots the agency most needs to reach.
Pagination blocked. Older blog posts may be inaccessible to search engine crawlers.
No llms.txt. AI systems have no structured summary of what this agency does or why it's credible.
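A cleaned-up robots.txt addressing all four issues might look like this -- the directives and user-agent tokens are real, but the sitemap URL is a placeholder:

```text
# Explicitly welcome the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default rules: no crawl-delay, no pagination block.
# (The old /*/page* disallow hid paginated blog archives,
# cutting off crawler paths to older posts.)
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap_index.xml
```

An llms.txt file at the site root would then add the piece robots.txt can't: a plain-language summary of who the agency is, who it serves, and where its best content lives.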
Finding #5: Service Pages Without Service Schema
This agency offers MSP SEO, PPC management, content marketing, web design, and lead generation. Each has its own service page. None have Service schema telling search engines or AI what the service is, who it's for, or what it costs.
There's also no FAQ schema on service pages -- despite several pages having FAQ-style content already on the page. The questions and answers exist in the HTML, but they're not wrapped in FAQPage schema, so Google can't show them as rich results and AI can't cite them as direct answers.
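Both gaps can be closed with one JSON-LD graph per service page. The service name is taken from the agency's actual offerings; the question and answer text below are illustrative placeholders -- on a real page, the FAQ entries should mirror the copy already visible in the HTML, since Google requires the markup to match on-page content:

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Service",
      "name": "MSP SEO",
      "serviceType": "Search engine optimization for managed service providers",
      "provider": { "@type": "Organization", "name": "Example MSP Agency" },
      "audience": { "@type": "Audience", "audienceType": "Managed service providers" }
    },
    {
      "@type": "FAQPage",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "How long does MSP SEO take to show results?",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "Most MSPs see measurable ranking movement within three to six months."
          }
        }
      ]
    }
  ]
}
```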
Internal linking between service pages is also weak. Each service page exists in its own silo without strong cross-links to related services. When a crawler lands on the SEO page, there's no clear path to the content marketing or web design pages. This limits the topical authority signals that help both traditional search and AI systems understand the full scope of what this agency offers.
The Tech Stack Problem
This agency runs WordPress with Elementor as the page builder and NitroPack as the performance optimization plugin. This is a common stack, but it creates compounding problems for both SEO and AI visibility.
Elementor generates heavily nested, bloated HTML. A simple section that should be 10 lines of clean markup becomes 50+ lines of nested divs with inline styles and data attributes. This makes it harder for crawlers to identify the actual content on the page.
NitroPack compounds the problem. It's supposed to improve page speed, but it does so by obfuscating the HTML -- lazy-loading content, deferring scripts, and restructuring the DOM in ways that make the source code nearly unreadable to AI crawlers. Their reviews and testimonials page is entirely client-side rendered, meaning search engines and AI systems see an empty container instead of the actual testimonials.
The real cost of this stack. WordPress + Elementor + NitroPack requires constant plugin updates, compatibility testing, and performance monitoring. Every plugin is a dependency. Every dependency is a potential point of failure.
What a modern stack gives you. A framework like Next.js provides server-side rendering by default (no NitroPack needed), clean HTML output (no Elementor bloat), built-in image optimization, and direct control over schema, structured data, and AI directives -- without plugins.
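To make "direct control over schema" concrete, here's a minimal sketch of how a Next.js-style codebase might build Organization schema as a typed object instead of relying on a plugin. The agency name, URL, and profile links are placeholders:

```typescript
// Sketch: typed JSON-LD construction, assuming a Next.js-style setup.
// All values below are placeholders, not the audited site's data.

interface OrganizationSchema {
  "@context": string;
  "@type": string;
  name: string;
  url: string;
  sameAs: string[];
}

function buildOrganizationSchema(
  name: string,
  url: string,
  profiles: string[]
): OrganizationSchema {
  return {
    "@context": "https://schema.org",
    "@type": "Organization",
    name,
    url,
    sameAs: profiles, // social/profile URLs that corroborate the entity
  };
}

// In a layout component, this object would be serialized into
// <script type="application/ld+json"> via JSON.stringify(schema).
const schema = buildOrganizationSchema(
  "Example MSP Agency",
  "https://example.com",
  ["https://www.linkedin.com/company/example-msp-agency"]
);
console.log(JSON.stringify(schema));
```

The point isn't the framework; it's that schema becomes version-controlled code the team owns, rather than output from a plugin nobody audits.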
The business case. Faster pages mean more leads. Less maintenance means more profit. Better SEO control means actually practicing what you sell to clients. For an agency that positions itself as an SEO authority, the tech stack is part of the credibility story.
Sitemap Hygiene Issues
The sitemap includes 7 thank-you pages that should be noindexed. These are post-form-submission confirmation pages with no SEO value. Submitting them to Google wastes crawl budget and tells search engines to index pages that provide zero value to searchers.
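The fix takes two steps: remove the thank-you URLs from the XML sitemap, and add a robots meta tag to each page. The tag itself is standard; the comment's path example is a placeholder:

```html
<!-- On each post-submission confirmation page (e.g. /thank-you/):
     keep it out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```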
The sitemap also shows lastmod dates that suggest bulk updates -- many posts updated on the same day rather than individually revised. Search engines use lastmod as a freshness signal, and when 30 posts all show the same update date, it looks like a bulk metadata change rather than genuine content improvement. This can dilute the freshness signal that would otherwise help individual posts rank.
What's Actually Working
Sharp niche focus. MSP marketing is a specific, well-defined niche. This agency isn't trying to be everything to everyone. The topical focus is exactly what AI systems reward when they can identify it through structured data.
120+ blog posts. Real content volume in a single niche. This is the kind of topical authority that ranks -- when the technical infrastructure supports it.
14 case studies. Concrete proof of results. Case studies are the content type that most directly influences B2B buying decisions and AI recommendations.
Branded methodology. A four-step proprietary process gives prospects a framework to understand how the agency works. This is strong positioning that differentiates in a crowded market.
Clear service pages. Dedicated pages for each service offering, with content tailored to the MSP audience. The structure is there -- it just needs the structured data layer on top.
The Irony, and Why It Matters
This agency sells SEO to managed service providers. Their pitch is essentially: "You're great at IT services but you don't know how to market yourself -- let us handle that." The irony is that the same gap exists on their own site. They're great at content production but they haven't addressed the structural layer that makes content visible to AI systems.
I'm not saying this to mock anyone. These are gaps that even experienced marketers miss because the landscape has shifted. Two years ago, schema was a nice-to-have. Today, it's the difference between showing up in ChatGPT and being invisible to the fastest-growing discovery channel in search history.
The content is there. The expertise is real. The niche authority is genuine. What's missing is the machine-readable layer that lets AI systems find, understand, and recommend this agency to the MSP owners who are already asking AI for vendor recommendations.
Frequently Asked Questions
Why would a marketing agency's own site have no schema?
It happens more often than you'd think. Agencies focus on client deliverables and neglect their own site. WordPress plugins like Yoast generate basic meta tags, so teams assume schema is handled -- but Yoast doesn't automatically add Organization, Service, or Article schema. Without someone specifically auditing and implementing structured data, agency sites end up with the same gaps they'd flag on a client's site.
Does the tech stack really matter for SEO?
Yes. WordPress with Elementor and NitroPack creates layers of bloated HTML, client-side rendering, and JavaScript obfuscation that make it harder for search engines and AI crawlers to parse content. A modern framework like Next.js gives you server-side rendering by default, clean HTML output, built-in image optimization, and direct control over structured data -- all of which translate to faster load times, better crawlability, and stronger AI discoverability.
Can WordPress sites compete with modern frameworks for SEO?
WordPress can rank well when configured properly. But page builders like Elementor add significant HTML bloat, and optimization plugins like NitroPack can actually hurt AI crawlability by obfuscating the page source. For agencies that need full control over schema, structured data, rendering, and AI directives, a modern framework removes the plugin dependency layer and gives you direct control over everything that matters for visibility.
How much does zero schema cost a B2B agency in lost leads?
Without schema, your services don't appear in Google rich results, AI systems can't confidently recommend you, and your case studies are invisible to the structured data layer that powers AI Overviews. For a B2B agency selling $5,000-$10,000/month retainers, even one lost deal per quarter from poor AI visibility is $20,000-$40,000 in annual revenue. The cost of implementing schema is a fraction of what zero schema costs in missed opportunities.
Is Your Agency Invisible to AI Too?
I'll audit your site the same way -- technical SEO, schema, AI discoverability, earned visibility, and a clear roadmap. Free, no commitment.
Get Your AI Visibility Audit