Why Most SEO Fails – and Why I Built the MX Framework

I used to see it over and over.
A company would come to the agency I worked at, often months after launching a shiny new website. The brief was always the same:
“We’ve just rebuilt the site – now we need you to do some SEO on it.”
They thought SEO was a layer you sprinkled on after the fact. A few audits. Some content calendars. A handful of backlinks. Box ticked.
Except the new site was already broken.
Not visually. Visually it was flawless – a slick UI, dynamic interactions – but that wasn’t the problem.
The problem was structural: every page sat “nothing more than one click away”, because UX best practice said so – and Google says “build for the user”.
That’s what Google (the business) says, but what does Google (the system) actually see? What does it do?
Google doesn’t see beauty. It doesn’t admire animations or layouts. It parses text. It follows links. It visits pages with a different agenda: to determine relevance and assign authority.
After almost every redesign, the same problem emerged: authority – the thing that actually drives rankings – was spread so thin that business-critical pages would no longer attract meaningful click-through. The homepage was usually marginally stronger than the rest, but the entire system was starved.
Left unchecked, the outcome was predictable: traffic would tank, panic would set in, and the SEO team would be left holding the bag.
Fixing it meant something nobody expected: re‑architecting the site they’d just paid for. I had to map out entirely new internal structures, prototyping machine‑aware navigation using their existing UI components so development could follow a like‑for‑like blueprint.
Every time, the pushback was the same: “We already paid for this design – why do we need to rebuild it?”
Because the problem wasn’t the user journey.
The problem was the machine journey.
UX described how humans navigated the site.
The way Google crawled and allocated authority was invisible – and nobody was accountable.
That accountability gap needed a name – I called it Machine Experience (MX).
From Observation to Framework
At first, Machine Experience (MX) was narrowly focused on one thing: authority flow.
It was the parallel “experience” that machines like Googlebot had when crawling a site – following internal links, redirects, and navigation in the same way users followed visual journeys.
MX meant designing those machine journeys deliberately – in sync with user journeys – so authority flowed where it was needed without undermining the human experience.
Over time, it became clear that this was only part of the picture.
Search evolved. Google’s algorithms added new layers: RankBrain, which weighted content relevance and engagement; and now the emerging LLM layer (increasingly labelled as ‘GEO’ – ‘Generative Engine Optimisation’), where AI-first search is reshaping information discovery entirely.
What started as a focus on authority flow has grown into a broader execution framework with three touch points:
- Authority Flow (site-level) – Routing PageRank internally via links, redirects, and indexation controls – this site‑level layer is what I call the MX Engine: the core foundation of the MX framework.
- RankBrain Alignment (page-level) – Optimising content and page layouts to match query intent and amplify engagement signals.
- Emerging LLM Layer – Structuring data and content so it is machine-readable for AI-first retrieval.
Together, these form the MX Framework – the execution layer of the Sovereign SEO strategy:
- Sovereign SEO (Strategy) – View your website like an investment portfolio: treat pages as assets or liabilities, and authority as capital to be allocated with intent.
- Machine Experience (Execution Framework) – Design websites for machines and humans, hitting all three touch points:
  - Authority Flow – MX Engines
  - RankBrain Alignment – User Engagement Signals
  - Emerging LLM Layer – Generative Engine Retrieval Stacks
MX has evolved into a three‑layer execution framework, but its foundation is the MX Engine – and the neglect of that foundation is why most SEO fails.
Without it, nothing else compounds.
The Invisible Engine Driving (or Killing) Your SEO
Every site has an MX Engine – whether you realise it or not.
It’s the sum of your internal links, redirects, canonicals, and navigation structures. It decides how authority flows through your site – which pages become strong enough to rank, and which sink into obscurity.
Think of authority like capital. Backlinks are capital injections – money into the system. Internal links are allocation decisions – they determine where that capital flows.
The MX Engine is your portfolio manager – deciding where that capital goes. If it routes to your key service and product pages (your assets), performance compounds. If it leaks into compliance pages, blog bloat, or forgotten category pages (your liabilities), the system bleeds dry.
Most websites bleed.
Why?
Because the people building them don’t even know this engine exists – let alone how to design it. They treat every page as equal, every link as harmless – but the MX Engine is zero‑sum: every link is a trade-off.
Every page you link to takes authority from somewhere else.
Ignore that – and you starve the very pages that should be driving your business.
The Zero-Sum Nature of Authority
Most people don’t understand how authority actually works – or what they’re giving away when they link to everything.
Let’s be clear on one thing: authority is finite. That’s what zero-sum means.
When a backlink lands on your site, it injects authority – capital – but that capital doesn’t stay where it lands. Only a small portion remains on the page.
The rest is distributed via the MX Engine.
Here’s a simplified model of what this means in practice:
Imagine a flat 10-page website. Every page links to the other 9.
A backlink worth $100 lands on Page-1.
Page-1 keeps 15% of that capital ($15.00) and redistributes the remaining 85% ($85.00) equally to the other 9 pages – each receiving $9.44.
Each of those pages keeps 15% of the capital injected by Page-1’s link ($1.42) and redistributes the rest ($8.03) across its own 9 links – including one back to Page-1, which receives roughly $0.89 from each of them. Multiply that by 9, and Page-1 gets about $8.03 recycled back.
So Page-1 ends up holding $23.03 ($15.00 directly from the inbound link + $8.03 returned by the MX Engine), while every other page ends up with $8.55 (no inbound links; all of their authority arrives passively via the MX Engine).
Here’s how that looks:
10-Page Flat MX Engine (Equal Linking)
Page | Backlinks | Value of Backlinks | Authority Kept (15%) | MX Engine Return | Total Authority Held |
Page-1 | 1 | $100.00 | $15.00 | $8.03 | $23.03 |
Other Pages (each) | 0 | $0.00 | $0.00 | $8.55 | $8.55 |
Now increase the page count to 100. Every page links to every other – flat structure, just scaled up.
The same $100 lands on Page-1 – but now it distributes across 99 links instead of 9. Each page receives just $0.86 from Page-1.
From that, they keep $0.13, and pass on the remaining $0.73 – only $0.007 of which comes back to Page-1.
Multiply that by 99 and Page-1 receives just $0.73 back – compared to the $8.03 it received in the 10-page setup.
100-Page Flat MX Engine (Equal Linking)
Page | Backlinks | Value of Backlinks | Authority Kept (15%) | MX Engine Return | Total Authority Held |
Page-1 | 1 | $100.00 | $15.00 | $0.73 | $15.73 |
Other Pages (each) | 0 | $0.00 | $0.00 | $0.85 | $0.85 |
The recycled capital didn’t disappear – it just got spread so thin it lost all impact.
Even though the structure is flat in both models, the performance outcome is drastically different. That’s because authority is zero-sum.
In the 10-page setup, a single $100 backlink brings Page‑1 to $23.03 of total authority. In the 100-page version, that same link only takes Page‑1 to $15.73.
For the pages with no backlinks of their own to hold the same valuation under the 100-page structure ($8.55 each instead of $0.85), the site would need 10x the number of backlinks – or 10x the budget.
Put simply: measured by the authority reaching each supporting page, a $5,000/month link budget under the 100-page structure performs roughly 50% worse than a $1,000/month budget under the 10-page model.
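The two-round arithmetic above can be sketched in a few lines of Python. This is an illustrative model of the simplified example only – the 15% retention rate and the two-round cut-off come from the worked example, and `flat_model` is a hypothetical helper name, not how Google actually computes PageRank:

```python
# Two-round sketch of the flat-linking authority model described above.
# Assumption: each page keeps 15% of incoming authority and splits the
# remaining 85% equally across its links to every other page; we stop
# after two rounds of redistribution, as in the worked example.

KEEP = 0.15  # share of incoming authority a page retains

def flat_model(n_pages: int, backlink_value: float = 100.0):
    """Return (page1_total, other_page_total) after two redistribution rounds."""
    links = n_pages - 1  # flat structure: every page links to every other page
    # Round 1: Page-1 receives the backlink, keeps its cut, passes on the rest.
    page1_kept = backlink_value * KEEP
    passed_per_page = backlink_value * (1 - KEEP) / links
    # Round 2: every other page keeps its cut and redistributes the remainder.
    other_kept = passed_per_page * KEEP
    redistributed = passed_per_page * (1 - KEEP) / links  # per outbound link
    page1_return = redistributed * links                  # recycled to Page-1
    other_return = other_kept + redistributed * (links - 1)  # from siblings
    return page1_kept + page1_return, other_return

for n in (10, 100):
    p1, other = flat_model(n)
    print(f"{n}-page flat engine: Page-1 ${p1:.2f}, each other page ${other:.2f}")
```

Running it reproduces both tables: roughly $23.03 vs $8.55 per page at 10 pages, and $15.73 vs $0.85 at 100 pages – the same capital, spread until it loses impact.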
Structure isn’t hygiene – it’s leverage. The MX Engine doesn’t just distribute capital. It determines your return on investment.
This is where most ‘best practice’ SEO collapses:
Content bloat – More pages, no more performance. Even if you increase authority through backlinks, every new page takes a cut. The result is that important pages get an ever-decreasing share. Agencies push more content to signal EEAT, target snippets, or “support” AI visibility – but structurally, they’re shrinking the slice that matters.
Linking everything to everything – Sold as UX best practice (“nothing more than one click away”) and SEO hygiene (“every page needs links to rank”) – but these links aren’t free: every link is a redistribution. Every link is a trade-off.
No prioritisation – Authority gets diluted across the site, and nothing compounds.
That’s the zero-sum nature of authority. The more you dilute it, the less impact it has.
The Zero-Sum Nature of Click-Through
Being indexed isn’t enough.
That’s the next hard truth that so many SEO teams miss.
Just because a page appears in the SERPs doesn’t mean it gets seen – let alone clicked.
Only a handful of ranking positions earn meaningful traffic. The rest are routinely ignored by users.
The top three positions capture the majority of click-through. By the time you reach position 10, you’re chasing crumbs. Beyond that, you’re functionally invisible.
Here’s how that reality plays out:
2025 Organic CTR by Google Ranking Position
Source: First Page Sage (May 28, 2025): Google Click-Through Rates (CTRs) by Ranking Position in 2025
Rank | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
CTR | 39.8% | 18.7% | 10.2% | 7.2% | 5.1% | 4.4% | 3.0% | 2.1% | 1.9% | 1.6% |
Note: Ads in positions 1–4 generate between 1% and 2% CTR each. That means even bottom-of-page organic results outperform most paid placements.
The CTR game is played out on the first page of Google – in the top 10 results – the only parts of the SERPs that matter.
For one result to gain clicks, another has to lose them. That’s what makes it zero-sum. If you move up, someone else gets pushed down.
Bump a competitor from position 1 to 2, and you’ve cost them over half their traffic from that keyword – a drop from 39.8% to 18.7% CTR.
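The cost of losing a position can be checked directly against the table’s figures. A minimal sketch – `traffic_change` is a hypothetical helper, and the percentages are the First Page Sage values quoted above:

```python
# Zero-sum CTR check using the First Page Sage 2025 CTR-by-position figures.
CTR = {1: 39.8, 2: 18.7, 3: 10.2, 4: 7.2, 5: 5.1,
       6: 4.4, 7: 3.0, 8: 2.1, 9: 1.9, 10: 1.6}

def traffic_change(from_pos: int, to_pos: int) -> float:
    """Percentage change in clicks when a result moves between positions."""
    return (CTR[to_pos] - CTR[from_pos]) / CTR[from_pos] * 100

# Dropping from position 1 to 2 loses roughly half the clicks for that keyword.
print(f"Position 1 -> 2: {traffic_change(1, 2):.0f}% change in clicks")
```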
Asset Classes – Mature, Potential, and Liabilities
In Sovereign SEO, we classify pages as either assets – the ones that drive revenue – or liabilities, which absorb authority without return.
That binary is a useful starting point, but it isn’t the full picture: there’s a third class – potential assets.
A page isn’t an asset just because you want it to be. It only becomes one once it clears the performance threshold: enough authority to break into positions that earn meaningful click-through for its high-intent keywords.
Until then, it’s a potential asset: technically sound, but functionally invisible because it lacks the capital to compete.
At the top of the hierarchy are mature assets – the handful of service, product, or category pages that have cleared the required authority thresholds to achieve meaningful click-through.
Mature assets aren’t just getting traffic, they’re compounding. Each visit and conversion they generate justifies more investment: off-site authority work to inject more capital into the system, improved landing page experiences, MX Engine reconfiguration sprints. That reinforcement drives even greater returns.
At the bottom of the hierarchy are liabilities – pages that should never compete for authority in the first place: login forms, legal disclaimers, careers pages. Some are necessary for compliance reasons, others may exist for credibility purposes – but they don’t generate search performance. If the MX Engine routes capital into them, it’s wasted.
Together, these three classes form the investment lens of Sovereign SEO – the framework for deciding which pages to protect, which to isolate, and which to mature.
Asset Classes in Sovereign SEO
Class | Definition | Threshold Status |
Mature Assets | Revenue-driving service, product, or category pages that generate leads and sales. | Have cleared the authority threshold → ranking in positions with meaningful click-through. |
Potential Assets | The same as mature assets, but lacking the requisite authority to break into viable positions. | Below the authority threshold → technically viable, but trapped outside CTR range. |
Liabilities | Compliance or credibility pages (e.g. policy pages, disclaimers, login). | No threshold relevance → never intended to compete for click-through. |
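The triage above reduces to a simple rule: liabilities are excluded by role, and the authority threshold separates mature from potential assets. A hypothetical sketch – the threshold value, the `Page` fields, and the URLs are all illustrative assumptions, not real metrics:

```python
# Illustrative sketch of the Sovereign SEO asset-class triage.
from dataclasses import dataclass

CTR_THRESHOLD = 10.0  # assumed authority needed to reach meaningful CTR range

@dataclass
class Page:
    url: str
    authority: float
    competes_for_search: bool  # False for login, legal, careers pages

def classify(page: Page) -> str:
    """Assign a page to one of the three Sovereign SEO asset classes."""
    if not page.competes_for_search:
        return "liability"  # never intended to compete for click-through
    if page.authority >= CTR_THRESHOLD:
        return "mature asset"  # cleared the threshold, compounding
    return "potential asset"   # technically sound, functionally invisible

pages = [
    Page("/services/audit", 23.0, True),
    Page("/blog/some-post", 0.9, True),
    Page("/login", 0.4, False),
]
for p in pages:
    print(p.url, "->", classify(p))
```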
Most sites sit heavily weighted toward potential assets. Hundreds of ‘optimised’ pages that technically serve as SEO landing pages, but receive such a small fraction of the site’s overall authority that they never cross the line into viability. They remain potential assets, trapped below the threshold. Without sufficient authority in the system, they might as well be liabilities.
There are only two ways to deal with this imbalance:
Massively increase off-site authority activity to keep pace with the volume of potential assets – a linear and costly approach – or structure creatively, like a tax advisor would structure an investment portfolio:
- Identify and protect mature assets. Concentrate authority so they stay dominant.
- Ringfence liabilities. They still need to exist, but isolate them from the flow of capital.
- Defer potential assets. They only mature when the system has enough authority to support them.
This is the compounding loop at the core of Sovereign SEO.
Treat authority as capital. Allocate it deliberately.
Mature as many potential assets as the site’s authority can sustain – starting with the highest revenue-driving pages and working down.
Use the revenue and authority they generate to bring the next cohort into maturity.
Repeat the cycle.
It’s controlled compounding – not blind publishing.
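The compounding loop reads as an iterative allocation: concentrate capital on the highest-priority potential asset until it matures, then let the returns fund the next cohort. A toy sketch under assumed numbers – the threshold, pool sizes, and per-cycle returns are all illustrative:

```python
# Toy model of the Sovereign SEO compounding loop: mature assets in sequence,
# starting with the highest-priority page, as system authority can sustain them.
THRESHOLD = 10.0  # assumed authority needed for meaningful click-through
pages = {"/service-a": 0.0, "/service-b": 0.0, "/service-c": 0.0}  # by priority
authority_pool = 12.0  # capital available in the first cycle

for cycle in range(1, 4):
    # Defer lower-priority pages: fund at most one maturation per cycle.
    for url, authority in pages.items():
        if authority < THRESHOLD and authority_pool >= THRESHOLD:
            pages[url] = THRESHOLD       # concentrate capital until it matures
            authority_pool -= THRESHOLD
            break
    authority_pool += 11.0  # matured assets generate new capital for the next cycle
    mature = [u for u, a in pages.items() if a >= THRESHOLD]
    print(f"cycle {cycle}: pool={authority_pool:.1f}, mature={mature}")
```

Each cycle one page clears the threshold, and the returns it generates fund the next – controlled compounding rather than spreading the pool across all three at once.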
A Systemic Failure to Prioritise
Most websites never make this shift. They just keep adding more pages – landing pages, blog posts, policy pages – and keep adding links.
Every addition dilutes the system. Instead of ten strong pages (mature assets), they end up with zero strong pages and a hundred weak ones (potential assets).
On the surface, prioritisation failure looks like a self-reinforcing loop:
- The “more is better” trap – Human psychology favours expansion: more content, more navigation items, more compliance policies, more boxes ticked. Tangible ‘growth’ looks and feels like progress, even when ROI is flatlining.
- UX & CMS defaults – Platforms reinforce this instinct. Mega menus, advanced footers, AI-generated content – all features that encourage bloat. Sold as premium add-ons, creating the illusion that they’re ‘barriers to entry’ – and therefore “must be good for SEO”.
- Soft KPIs – Search Console impressions and third-party dashboards actually validate this illusion, counting ‘visibility’ at position 70 as ‘success’. “Graph goes up – and to the right”! In reality, those pages are invisible to users and may as well be liabilities.
In reality this loop is just the tip of the iceberg – the deeper reasons behind prioritisation failure are systemic.
Which brings us to Charlie Munger.
Few people understood systemic dysfunction better than him – and his mental models explain why these failure patterns are a repeating problem in SEO.
Mental Models – Charlie Munger
Charlie Munger – Warren Buffett’s long-time business partner – wasn’t an SEO, but his mental models for decision-making are some of the sharpest tools for understanding why industries succeed or fail.
Three in particular map directly onto the dysfunction we’ve just outlined:
- Skin in the Game – Who actually bears the consequences of their decisions?
- Show Me the Incentives, I’ll Show You the Outcomes – People follow their incentives, not ideals.
- Invert, Always Invert – To solve a problem, flip it and ask: how would you guarantee failure?
Applied to SEO, these models make the dysfunction obvious, and the first – skin in the game – explains why the wrong people end up configuring the MX Engine.
No Skin in the Game: The Wrong People Configure the MX Engine
When you hire a UX or dev firm to rebuild your site, you’re unknowingly putting them in charge of your MX Engine.
They’re the ones defining navigation. Deciding what links where. Choosing whether authority flows through a clean hierarchy – or gets scattered across hundreds of dead ends.
The problem? They’re not measured on rankings.
Their KPI is aesthetics and delivery:
- Does it look modern?
- Does it load quickly?
- Did we ship on time?
They probably used the same template they used for a dozen other websites. Just different colours, different fonts, different images.
If the site launches and organic traffic collapses three months later? That’s not their problem.
This is pure Munger: no skin in the game.
Here’s the equivalent:
Would you ask your technical SEO person to script and produce your TV ad campaign – then hand it back to your creative director with instructions to “pick the outfits for the actors, and make it work”?
Of course not.
Yet this is what happens in reverse on almost every SEO campaign: aesthetic designers configure the engine that drives organic performance.
Show Me the Incentives, I’ll Show You the Outcomes
Charlie Munger’s line – “Show me the incentives, I’ll show you the outcomes” – is one of the sharpest lenses for understanding why most SEO fails.
The dysfunction you see in the industry isn’t illogical. It’s the result of rational actors doing exactly what their incentives tell them to do.
Every layer of the SEO ecosystem has its own agenda – and none of them are aligned with optimal performance.
Let’s break down how incentives shape outcomes across the ecosystem – starting with the dominant player: Google.
Google’s Incentives
As a publicly traded company, Google – or more accurately, Alphabet – is incentivised to do one thing above all else: protect and grow revenue. Advertising makes up the overwhelming majority of that revenue, and it shapes every decision the company makes – from product design to the guidance it issues to the industry. Every shift in narrative, every adjustment to the rules, is downstream of that single incentive.
Incentives cause and effect:
- Maximise ad revenue – paid clicks are the business model.
- Maintain opacity – complexity and secrecy keep businesses dependent on ads.
- Control the narrative – be selective with the truth and propagate misdirection.
  - Provide just enough information so that webpages can get indexed.
  - Promote ‘build for the user’ and content bloat as best practice.
  - Make authority “just one of hundreds of signals.”
  - Retire PageRank data.
  - Retire organic keyword data.
  - Retire Google Cache data.
  - Retire ccTLDs and third-party tracking capabilities.
Agency Incentives
On paper, agencies should be incentivised to deliver results, build case studies, and grow through reputation.
In practice, most agencies aren’t run by technical experts or channel specialists. They’re run by business people optimising for margin and predictability.
That reality channels effort into scalable deliverables, while the deeper levers of SEO are left untouched.
Incentives cause and effect:
- Protect margins – predictable revenue beats unpredictable performance.
  - Under-resource accounts and overwork junior staff to lower costs.
  - Prioritise outputs that scale cheaply: fast audits, blog content, low-value links.
  - Set soft KPIs (impressions) that create the illusion of performance.
  - Avoid structural authority work – it’s complex, requires a diverse skillset, it’s harder to price, and it doesn’t scale neatly.
- Hedge against churn – it’s easier to replace clients than retain them.
  - Prioritise sales and marketing investment over advanced delivery skill sets.
  - Structure contracts around one-year terms, anticipating many won’t renew.
  - Use volume to hedge churn: sign two new clients for every existing client that is expected to leave.
- Overperformance risk – success can reduce revenue.
  - If a client reaches #1 for all their revenue-driving terms, the agency is made redundant.
  - Encourages ongoing production without ever making the client self-sufficient.
- Internal budget politics – SEO profits fund fledgling departments.
  - Senior managers without SEO expertise (often from paid media backgrounds) see SEO as a back-office cash cow.
  - Profits from SEO teams are siphoned into ‘sexier’ divisions – historically social media, today it’s AI – where losses are subsidised by SEO margins.
SEO Platforms’ Incentives
Platforms like SEMrush, Conductor, and Ahrefs sit in a different corner of the ecosystem – but their incentives are no less distorting.
These tools can’t build authority – the one thing that actually moves the competitive needle – so their value proposition leans elsewhere.
They focus on page-level signals and outputs that scale: keyword density, headers, content length, on-page optimisation scores.
The product is built to keep users busy, not to solve the hard problems. Users get scorecards to ‘optimise’ what already exists, then get pushed into expanding into new topic clusters, creating more pages, and repeating the cycle.
The result: prioritisation failure and content bloat disguised as progress.
Incentives cause and effect:
- Subscription growth – must always sell the “next big thing”.
  - Constantly roll out dashboards for whatever’s trending (AI visibility scores, etc.).
  - Package ‘ahead of the curve’ metrics even if the data is thin or the outputs aren’t actionable (e.g. Moz’s ‘brand authority’).
- Retention – users must never feel ‘done’ with content creation.
  - Push scorecards that show room for improvement, even when marginal.
  - Encourage expansion into new topic clusters → more pages → more optimisation cycles.
- Scalability – advice must be easy to follow at volume.
  - Lean on page-level signals (keyword density, headers, content length).
  - Avoid guidance on harder truths: consolidating assets, ringfencing liabilities, managing authority flow. These detract from the core product: content expansion.
- Perception of progress – impact metrics must look positive.
  - Soften CTR curves in visibility dashboards, ignoring the zero-sum nature of click-through.
  - Make performance feel tangible via content creation, even when ‘100% optimised’ pages remain stuck on page five.
Industry Press & Influencers’ Incentives
The line between SEO platforms, industry press, and influencers is blurry.
The same companies selling tools are often sponsoring conferences and amplifying trends through their own blogs and social channels.
Visibility in the industry comes not from challenging the narrative, but from being aligned with Google’s messaging, ‘big tech’, and hyping whatever can be packaged as ‘new’.
In turn, proven strategies that don’t serve those alignments get sidelined, while narratives that generate clicks – not outcomes – dominate the conversation.
Incentives cause and effect:
- Prestige through proximity – clout comes from being close to Google or riding the coattails of other big platforms.
  - ‘Insider’ status is built on access, not first-principles thinking.
  - Event speaker slots go to those who echo the event sponsors’ interests.
- Novelty sells – the industry rewards ‘new’ over ‘true’.
  - Proven fundamentals are sidelined as yesterday’s news, despite being the primary drivers of outcome.
  - Trends like ‘brand authority’ or ‘AI visibility’ get elevated as gospel regardless of accuracy or impact.
A Convergence of Misaligned Incentives
Every player in the ecosystem is optimising – just not in your best interests:
- Google optimises for ads.
- Agencies optimise for margin and churn.
- Platforms optimise for endless content cycles.
- The industry press optimises for platform hype.
Collectively, they drive the same systemic failure: content bloat and a chronic inability to prioritise.
Agencies avoid structural authority work because it’s complex and unscalable. Platforms keep users busy creating more pages to feed their dashboards. Press amplifies trends that flatter Google and the vendors who are bankrolling them.
All of these incentives create a drag-effect on SEO performance – but Google’s sit in a different category.
Agencies and platforms might misallocate effort, but their survival still depends on delivering something of value.
Google, on the other hand, is directly incentivised against organic success. The more companies depend on their ads, the more money they make.
This is why Munger’s third model – inversion – is the sharpest tool for understanding how Google frames the narrative that systematises prioritisation failure.
Invert, Always Invert: How Google Frames the Narrative
Munger’s principle of inversion is simple: to solve a problem, flip it and ask how you’d guarantee failure.
So let’s invert the SEO question. If you were Google, and your revenue depended on keeping brands reliant on ads, what would you do?
Would you teach businesses how to consistently get their revenue-driving pages into position #1 for all their keywords? Of course not. That would reduce paid clicks, and paid clicks are the business model.
Instead, you’d run what I call the Inversion Narrative.
The playbook looks like this:
- Downplay authority – the one input with infinite upside. Frame it as “just one of hundreds of signals”.
- Retire actionable data metrics – retire PageRank data, retire organic keyword data, retire cache views, retire ccTLDs. Over time, erase the practitioner’s ability to measure or manage performance.
- Elevate content – the input with the lowest barrier-to-entry and lowest ceiling. Tell everyone “great content rises to the top.” Push the narrative that publishing more is progress, even if authority gets diluted.
- Use truth to create the lie – what they tell you works (indexable content, crawlable sites) is true, but it’s only half the picture. They omit the capital layer: authority.
The result is systemic prioritisation failure.
Marketing teams follow the script:
- Publish more content.
- Link everything to everything.
They measure success by impressions or ‘visibility’, convincing themselves that rising impressions guarantee rising traffic.
They don’t, and the clicks don’t follow. Their sites swell with ‘optimised’ pages that remain stuck outside click-through range.
The result is a production line of ‘potential assets’: impressions rise, CTR collapses, and authority bleeds away from the few mature assets that once drove performance, compounding the decline.
That failure isn’t random – it’s by design.
The narrative has evolved over time. In the early days, Google was transparent. They needed market share, so they explained how the algorithm worked. Authority was openly discussed, and PageRank was published.
As their dominance compounded, the transparency was stripped away piece by piece. The narrative shifted from authority to content.
Industry actors with their own incentives were happy to amplify it:
- Agencies could churn out blogs and content audits.
- Platforms could sell dashboards and scores.
- The press could echo Google’s gospel without risking access.
Apply Munger’s inversion – replace “how do you succeed?” with “how would you guarantee failure?” – and you get exactly the system we have today: one where authority is diluted, content is inflated, and marketing teams stay on the treadmill.
That’s the Inversion Narrative. It explains why systemic failure to prioritise is the norm, and why most SEO keeps failing no matter how closely it follows ‘best practice’.
Most SEO Fails – Sovereign SEO is How You Succeed
Most SEO fails because of misaligned incentives:
- Google optimises for ad revenue.
- Agencies optimise for churn.
- Platforms optimise for content creation.
Recognise the Inversion Narrative – and stop listening to it:
- Stop relying on Google’s partial truths.
- Stop outsourcing your performance levers to teams with no skin in the game.
- Don’t settle for soft KPIs.
- Don’t be misled by another content-bloat strategy.
Sovereign SEO is about taking back control:
- Adopt the MX framework.
- Treat authority like capital.
- Prioritise assets.
- Ring-fence liabilities.
- Mature assets in sequence.
That’s the difference between treading water and building a system that compounds.
This is what Sovereign SEO was built for.
Get in touch to see how a Sovereign SEO strategy can turn your most important pages into market leaders.

Mike Simpson
With nearly 15 years of experience in SEO and digital marketing, Mike has built a reputation for driving growth and innovation. His journey began at Havas Media, where he developed expertise in client management, technical auditing, and strategic planning for top brands like Tesco Bank and Domino’s Pizza. He progressed to leading teams at Forward Internet Group and IPG Media-Brands, before taking on the role of Commercial Director & Chief Product Strategist at Barracuda Digital, where he delivered significant results for high-profile clients.
Now working as a consultant, Mike leverages his extensive experience to help businesses enhance their digital strategies, delivering bespoke solutions and measurable success. His strategic insights and dedication have made him a sought-after expert in the industry.