From obscurity to the Map Pack in 11 months.
Built 40+ postcode-level landing pages, cleaned up a messy schema stack, deployed a WhatsApp AI dispatch agent, earned local press across east London recovery services.
Senior technical SEO for B2B SaaS, ecommerce, marketplaces and editorial sites where the ranking gap is structural, not a content problem. SSR & ISR rendering, Core Web Vitals at p75 field data, schema architecture, log-file analysis, hreflang, migration protection. Fixes delivered as pull requests straight to your engineering team. Month-to-month, senior-led, worldwide.
Receipts available on request; happy to show live Search Console on a call.
Rebuilt an ageing site, added product & review schema, rewrote category pages in plain English.
180-page city-service template that reads human, plus a WhatsApp agent handling 60% of intake.
Four verified reviews from active engagements. Every review ships as schema.org Review markup alongside the visible quote: the same claim on screen and in the structured data.
Three years in and still the best SEO money I have ever spent. Map Pack visibility across 40+ London postcodes, zero nonsense in the reporting, and I can text Syed directly when something breaks.
Organic revenue up 185% in 14 months. Product schema rebuild alone lifted rich-result capture by ~40%. No 12-month lock-in, month-to-month, which meant I could judge the work on results rather than on contract friction.
Moved from an NYC agency that billed $9k/month for junior-delivered work. Two years later, 23 practice-area terms on page one and qualified demos up 180%. Senior time, in USD, month-to-month, what US SaaS SEO should be.
Four-clinic group across Sydney. GBP work, postcode landing pages, review pipeline that actually complies with Google's rules. Patient bookings from organic up 3x in the first year. Remote but genuinely responsive.
SEO is the foundation. AI and custom web builds are how I ship outcomes in 2026, all connected, all from the same hand.
Crawl audits, schema that validates, internal linking, postcode-level landing pages, GBP, Map Pack, the foundation that makes everything compound.
Custom WhatsApp and web agents handling enquiries, quoting, booking, and dispatch. N8N, OpenAI, Gemini.
Custom sites on WordPress, Next.js, or hand-written HTML. Fast, SEO-ready, Core Web Vitals green from day one.
Topical maps that close ranking gaps. Editorial briefs your writers can follow. Digital PR that survives core updates.
Reporting, lead routing, content pipelines. If a task is repetitive and mechanical, I'll automate it with N8N.
Written SEO diagnostic with a ranked fix list. Two-week turnaround. Often the right starting point.
Replatforms, redesigns, rebrands. I protect rankings through the change, the riskiest work in SEO, done right.
Four tiers. Every tier is hand-coded: no Wix, no Elementor, no copy-paste from a template marketplace. Schema, sitemap, Search Console and Analytics configured on every project. 90+ Lighthouse speed target where technically possible. Express turnaround on sites up to 10 pages: 2 to 3 working days for an extra £500, or same-day launch for £1,000, subject to all content and brand assets supplied on day one. Pricing comes in lower than traditional UK agencies because we don't carry London agency overhead.
Hand-coded 5-page site for founders validating a new business or single-service local operators.
Most common tier for growing SMEs. Full sitemap, services, about, blog shell, custom UI/UX in Figma.
Full UI/UX system plus hand-coded Next.js or WordPress build for businesses with multiple service lines.
Shopify / Saleor headless, multi-language hreflang, CRM / CMS / ERP API integrations.
The difference between a pitch deck and the people who actually ship your work is the difference between “scalable” and delivered.
A short introduction, your site URL, and what you’re trying to achieve. If it’s a fit, we’ll book a 30-minute call.
Technical SEO is the part of search ranking that engineers, not marketers, actually control. It is rendering strategy, internal link graph topology, schema architecture, Core Web Vitals at p75 field data, crawl budget allocation, hreflang correctness, and whether your migration kept its 301 chain intact.
Below is the working consultant's view of what moves the needle on technical SEO in 2026, written for the people who will actually implement the work.
Technical SEO is whatever your engineering team has to ship before content and links are allowed to compound. On a clean codebase with a competent rendering strategy, technical SEO is a maintenance discipline — schema validation, Core Web Vitals tuning, sitemap hygiene, occasional migration work. On a broken codebase — and most modern marketing sites built on React-flavoured frameworks are at least partially broken — technical SEO is the largest available rankings lever, often by a wide margin.
The reason is simple. Content and links compound on top of an indexable, crawl-budgeted, schema-valid foundation. Without that foundation, additional content and links push against an indexability ceiling and the ROI tails off before it should. We have rebuilt rendering on Series-A SaaS sites and watched 23 commercial terms move from page three to page one in nine weeks with zero new content shipped. That is not a content win. That is the existing content becoming visible to Google for the first time.
A crawl simulator (Screaming Frog, Sitebulb, OnCrawl) shows what could be crawled. A log file shows what actually was crawled. The two answers are usually different, and the difference is the audit. We pull thirty days of access logs, parse only Googlebot-verified hits via reverse-DNS, and produce three deliverables.
First, the crawl-allocation by template — what proportion of Googlebot's time is spent on the homepage, category pages, product pages, blog, filter variants, pagination, internal search, and the long tail of low-value URLs. On most ecommerce sites we audit, 30–50% of Googlebot's budget is spent on filter parameter combinations that should have been canonicalised or noindexed. Recovering that budget and pointing it at commercially relevant URLs is the highest-leverage technical fix on a site over 5,000 URLs.
Second, the crawl-but-not-indexed gap. URLs Googlebot visits but does not include in the index. The gap usually points at thin content, near-duplicates, weak canonical signals, or low-authority orphan pages. Triaging those URLs into fix / consolidate / noindex categories typically restores 10–25% of indexed-page count within a quarter.
Third, orphan-page discovery. Googlebot regularly finds URLs through external backlinks or sitemap submissions that have no internal links pointing to them. Those orphans accumulate into a surprising amount of latent ranking authority that is wasted because the internal link graph never tells Google they matter. Fixing the orphan map is unsexy, takes a sprint, and usually produces a visible ranking bump on adjacent pages.
On a Series-C marketplace last year, a single Excel file mapping 4,200 orphan URLs to their topical hub-page parent recovered 38% of indexed-page count and a 19% organic-revenue lift inside the same quarter. No new content. No new links. Just making the existing pages legible to Google's crawl prioritisation logic.
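For the Googlebot-verification step described above, here is a minimal Node/TypeScript sketch, assuming log lines already parsed into objects with an `ip` field. The two-step check (reverse DNS, then forward confirmation) follows Google's documented verification procedure; the `LogEntry` shape and IPv4-only resolution are simplifying assumptions.

```typescript
// Verify a log line's client IP really belongs to Googlebot: reverse-DNS
// the IP, check the hostname sits under googlebot.com or google.com, then
// forward-resolve that hostname and confirm it maps back to the same IP.
import { reverse, resolve4 } from "node:dns/promises";

async function isVerifiedGooglebot(ip: string): Promise<boolean> {
  try {
    const hostnames = await reverse(ip);
    for (const host of hostnames) {
      if (!/\.(googlebot|google)\.com$/.test(host)) continue;
      const forward = await resolve4(host); // IPv4 sketch; IPv6 would use resolve6
      if (forward.includes(ip)) return true;
    }
  } catch {
    // NXDOMAIN or timeout: treat as unverified rather than guessing.
  }
  return false;
}

// Illustrative parsed-log shape; a real parser would carry more fields.
interface LogEntry { ip: string; path: string; status: number; }

async function filterGooglebot(entries: LogEntry[]): Promise<LogEntry[]> {
  const verified: LogEntry[] = [];
  const cache = new Map<string, boolean>(); // avoid re-resolving repeat IPs
  for (const entry of entries) {
    if (!cache.has(entry.ip)) cache.set(entry.ip, await isVerifiedGooglebot(entry.ip));
    if (cache.get(entry.ip)) verified.push(entry);
  }
  return verified;
}
```

Everything downstream (crawl-allocation by template, the crawl-but-not-indexed gap, the orphan map) works from the verified subset this filter produces.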
Most modern marketing sites are built on React, Vue, Svelte, or a flavour of those — Next.js, Remix, Astro, Nuxt, SvelteKit, Gatsby. Every one of those frameworks lets engineers ship marketing pages as client-side rendered (CSR), and left to the defaults, most teams do exactly that.
Google does render JavaScript. The problem is when. The first crawl pass is HTML-only: Googlebot ingests whatever the server returned. JavaScript rendering happens in a deferred second pass, days or weeks later, and only on a subset of the pages crawled in pass one. On deep templates — programmatic city pages, product variant pages, long-tail comparison pages — the second pass often fails to fire at all, and the page never indexes. We have rescued five-figure commercial-page-count sites where this exact failure pattern was suppressing 20–40% of total organic traffic.
Diagnostic toolkit: Google's Rich Results Test for a live render-pass diff (the retired Mobile-Friendly Test used to fill this role), URL Inspection in Search Console for ingestion timing, Chrome DevTools' JavaScript-disabled mode for the HTML-only baseline, and Screaming Frog's JavaScript rendering toggle for a crawl-time diff. The four together let you see every flavour of CSR-by-accident on a site within a working session, and the HTML-only baseline is easy to script across a URL list, as sketched below.
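A minimal sketch of that scripted baseline, assuming Node 18+ for the built-in fetch; the URLs and marker strings are hypothetical stand-ins for pages and the content they are supposed to rank on.

```typescript
// Fetch the raw server response (no JavaScript execution) and check
// whether content you expect to rank on is present in the initial HTML.
// If it only appears after hydration, the page is CSR-by-accident.
const checks: Array<{ url: string; marker: string }> = [
  // Hypothetical examples: a programmatic city page and its H1 text.
  { url: "https://example.com/emergency-recovery-e14", marker: "Recovery in E14" },
  { url: "https://example.com/pricing", marker: "per month" },
];

async function auditServerHtml(): Promise<void> {
  for (const { url, marker } of checks) {
    const res = await fetch(url, {
      headers: { "User-Agent": "Mozilla/5.0 (compatible; html-baseline-check)" },
    });
    const html = await res.text();
    const present = html.includes(marker);
    console.log(`${present ? "OK  " : "MISS"} ${res.status} ${url}`);
  }
}

auditServerHtml();
```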
Core Web Vitals has been a ranking signal since June 2021 (LCP, CLS, and FID, which INP replaced in March 2024), and Google ranks on field data via the Chrome User Experience Report (CrUX), not on Lighthouse lab scores. The single most common Core Web Vitals consulting failure we see in the wild is teams optimising for the lab number and ignoring the field number. They are different.
Common fixes by metric: LCP — server response time, image formats (AVIF/WebP), priority-hint preloading, font subsetting, eliminating render-blocking JS. INP — JavaScript bundle size, third-party script audit, hydration cost, long-task profiling. CLS — explicit width/height on images, font-loading strategy (font-display and preloads to avoid swap shift), reserved space for lazy-loaded elements, late-injected ad slots.
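To pull the field numbers directly rather than eyeballing PageSpeed Insights, a minimal sketch against the public CrUX API; `CRUX_API_KEY` and the origin are placeholders, and a per-URL query would pass `url` instead of `origin`.

```typescript
// Query the CrUX API for an origin's p75 field metrics: the numbers
// Google actually ranks on, as opposed to Lighthouse lab scores.
const CRUX_ENDPOINT =
  "https://chromeuxreport.googleapis.com/v1/records:queryRecord";

async function fetchFieldVitals(origin: string, apiKey: string): Promise<void> {
  const res = await fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      origin,
      formFactor: "PHONE", // segment by device, as described above
      metrics: [
        "largest_contentful_paint",
        "interaction_to_next_paint",
        "cumulative_layout_shift",
      ],
    }),
  });
  const data = await res.json();
  for (const [name, metric] of Object.entries<any>(data.record.metrics)) {
    console.log(`${name} p75:`, metric.percentiles.p75);
  }
}

fetchFieldVitals("https://example.com", process.env.CRUX_API_KEY ?? "");
```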
Schema markup is one of the highest-leverage technical SEO investments because it earns SERP feature placements (FAQ snippets, How-to steps, Product rich cards, Review stars, Breadcrumb links) and it disambiguates your entity for Google's knowledge graph. The mistake we see most often is teams marking everything up and hoping for the best — schema sprawl that fails Google's content rules and gets quietly ignored or, worse, gets the site flagged for structured-data spam in a manual action.
The discipline: pick the schema types that map to your actual content, implement them at the template level (not page-by-page), validate against the Rich Results Test AND Google's structured-data content rules, and monitor the SERP-feature capture rate as your scoreboard. We typically ship Organization + WebSite + BreadcrumbList sitewide, then layer Article on the blog, Product on ecommerce templates, FAQPage on relevant pages, HowTo where applicable, Service on commercial pages, LocalBusiness on franchise sites, Review and AggregateRating where genuine, and Person markup for authors and team members to support E-E-A-T.
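As one way to implement that template-level discipline, a sketch of sitewide JSON-LD in a Next.js App Router root layout; the organisation name and URLs are placeholders, and per-template types (Article, Product, BreadcrumbList) would follow the same pattern in their own layouts.

```tsx
// Template-level JSON-LD in a Next.js root layout: Organization and
// WebSite ship sitewide, exactly once, instead of page-by-page.
import type { ReactNode } from "react";

export default function RootLayout({ children }: { children: ReactNode }) {
  const jsonLd = {
    "@context": "https://schema.org",
    "@graph": [
      {
        "@type": "Organization",
        "@id": "https://example.com/#org",
        name: "Example Consultancy",
        url: "https://example.com",
      },
      {
        "@type": "WebSite",
        "@id": "https://example.com/#website",
        url: "https://example.com",
        publisher: { "@id": "https://example.com/#org" },
      },
    ],
  };
  return (
    <html lang="en">
      <body>
        <script
          type="application/ld+json"
          // Safe here: the stringified object contains no untrusted input.
          dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
        />
        {children}
      </body>
    </html>
  );
}
```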
Site migrations and replatforms are the single most common cause of permanent organic-traffic loss in B2B SaaS, ecommerce and editorial sites. Industry averages put unmanaged-migration traffic loss at 20–60%, and a meaningful fraction of that loss is permanent because the recovery window closes once Google has finished re-evaluating the new domain or template.
The work that prevents that loss is unglamorous and almost entirely pre-launch. We map every URL on the source site to its destination on the target site one-to-one, design a 301 redirect map that preserves canonical signals (no chains, no loops, no 302s in production), validate that schema markup migrates intact and re-validates against Rich Results, and produce a written launch-day runbook the engineering team can actually execute.
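A minimal sketch of how that one-to-one URL map can drive the redirect layer on a Next.js target site; the paths are illustrative, and note that Next.js emits a 308 for `permanent: true`, which Google treats as a permanent redirect like a 301.

```typescript
// next.config.ts sketch (recent Next.js versions accept a TS config;
// for next.config.js, drop the type annotations).
import type { NextConfig } from "next";

// One-to-one source/destination map exported from the migration
// spreadsheet; illustrative paths only.
const urlMap: Record<string, string> = {
  "/old-category/widgets": "/shop/widgets",
  "/blog/2019/some-post": "/insights/some-post",
};

const nextConfig: NextConfig = {
  async redirects() {
    // Every source points at its final destination: no chains, no loops.
    return Object.entries(urlMap).map(([source, destination]) => ({
      source,
      destination,
      permanent: true, // Next.js emits a 308; Google treats it as permanent
    }));
  },
};

export default nextConfig;
```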
Migration projects scope at a fixed £4,500–£15,000 depending on URL count and integration complexity. The cost is small relative to the traffic loss it prevents on a serious commercial site.
Multi-region sites that target multiple English-speaking markets (UK, US, Canada, Australia) plus EU locales (Germany, France, Spain, Italy, Netherlands) almost always have at least one hreflang error. The errors silently suppress 10–25% of international rankings, and most teams never notice: Search Console retired its dedicated International Targeting report, so hreflang errors no longer surface in any default monitoring.
The four classic mistakes: (1) hreflang tags pointing at URLs that don't return a 200, (2) missing self-reference tags, (3) language-region mismatches (en-CA pointing at the US site), (4) inconsistent implementation across sitemap, HTML link tag, and HTTP header — Google picks one and ignores the others.
Implementation choice depends on stack. For a marketing site under 1,000 URLs, HTML link-tag implementation is cleanest and self-documenting. For sites over 10,000 URLs, sitemap implementation is the only sane option. For sites behind a CDN with edge-rule capability, HTTP header implementation gives the cleanest separation of concerns. The right answer depends on what your engineering team can maintain, not what is theoretically optimal.
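For the link-tag option on a Next.js site, a sketch using the framework's Metadata API; the locale URLs are placeholders, and each localised page would emit its own canonical plus the full alternate set, self-reference included, addressing mistake (2) above.

```typescript
// Per-page hreflang via the Next.js App Router Metadata API, which
// renders <link rel="alternate" hreflang="..."> tags into the head.
import type { Metadata } from "next";

export const metadata: Metadata = {
  alternates: {
    canonical: "https://example.com/en-gb/pricing",
    languages: {
      "en-GB": "https://example.com/en-gb/pricing", // self-reference
      "en-US": "https://example.com/en-us/pricing",
      "en-AU": "https://example.com/en-au/pricing",
      "de-DE": "https://example.com/de-de/pricing",
      "x-default": "https://example.com/pricing",
    },
  },
};
```

Past roughly 10,000 URLs, the same mapping belongs in the XML sitemap instead, as noted above; the data structure is identical, only the delivery channel changes.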
Vanity rankings-per-keyword dashboards are theatre. Every technical SEO retainer we run ships with a custom dashboard connecting GA4 + Google Search Console + field CrUX + the client's CRM (HubSpot, Salesforce, Pipedrive, Attio). Five metrics run the scoreboard.
Generative Engine Optimisation (GEO) is the discipline of making your content citable and accurately summarised by Google AI Overviews, ChatGPT Search, Perplexity, and Gemini. Most of GEO is technical: structured, citation-friendly chunking, clean heading hierarchy, schema discipline (HowTo, FAQ, Article with proper author and dateModified), and avoiding the JS-rendering failures that keep your content invisible to AI crawlers as well as Googlebot. We surface GEO readiness as part of every technical audit.
If you are running an engineering-led team and want a senior technical SEO consultant who will write specs your developers can actually implement, send the brief. First calls are 30 minutes, free, always with the person who will run your account.
Every placement is negotiated and published by hand through a six-year network of editors and journalists. We never use AI bots or PBNs: they get detected, they get demoted, and your domain pays the price.
Ten contextual do-follow links from real UK and international sites with Domain Rating 50 and above. Topically relevant. Placed inside genuine editorial content, not link-farm footers. Index report delivered within 4 weeks.
Ten earned placements on national UK and US media with Domain Rating 70 and above, the kind of coverage that shifts rankings in competitive verticals and doesn't disappear in the next core update. Written, pitched, and placed by our PR team.
Google's last five core updates have all sharpened link-spam detection. Bulk-placed links from AI-generated host sites and public blog networks are being flagged faster than they can be bought. Our model is slower and costs more per link, but the placements survive every update and compound in value the longer they stay live.
Most agency SEO deliverables end at a recommendations document the client's developer never gets around to implementing. We write the schema, ship the SSR refactor, and merge the internal-link rebuild ourselves. The SEO work that needs code ships in the same sprint the audit flagged it.
Every client gets the same senior operator from first call to monthly review. Continuity is the product.
Two weeks. Crawl, keyword gap, backlink profile, on-page health. Written report, ranked fix list.
Schema, technical debt, site build or repair, internal linking. The work that makes everything compound.
Close topical gaps. Earn links honestly. Deploy AI agents where they save real hours, not just look clever.
Monthly call. Plain-English report. What moved, what didn't, what's next. Leave any time.
Syed leads the strategy and writes the monthly notes. Behind him is a tight network of expert developers and manual link-earning partners built over six years. Everything ships fast, nothing is outsourced to an AI bot that will earn your domain a penalty in the next core update.
Three things, in roughly equal share. First, audit and triage: log-file analysis, crawl simulation in Screaming Frog or Sitebulb, rendering-pass diff (Googlebot vs Chrome), schema validation against Rich Results Test and Google's content rules, Core Web Vitals against CrUX field data not Lighthouse lab. Second, written specs handed to your engineering team via GitHub PRs or Linear tickets, with acceptance criteria a non-SEO engineer can implement. Third, post-deploy verification — re-crawl, re-render, check the changes actually shipped and Google ingested them. The work is unglamorous, mostly invisible from outside, and where most of the actual ranking movement comes from on a serious site.
For senior independent work, retainers run in three tiers. UK: £950/month Starter, £1,800/month Growth, £4,000/month Franchise. US: $1,200 / $2,300 / $5,100 per month. Canada: CAD $1,650 / $3,100 / $6,900 per month. Pre-engagement, a fixed £500 / $700 / CAD $850 audit (refunded against the first retainer invoice if you sign within 30 days) is the right starting point for any site over 50 pages. Migration projects are scoped separately at £4,500–£15,000 depending on URL count and integration complexity.
Yes — modern JavaScript frameworks are the most common stack we work in. Next.js is a daily driver. We audit rendering strategy template-by-template (SSR vs ISR vs SSG vs CSR), surface CSR-by-accident pages that Googlebot is failing to render in the first pass, and either ship the SSR refactor ourselves via PR or hand a written spec to your engineers. Same applies to Remix, Astro, Nuxt, SvelteKit, and Gatsby. SPA frameworks without a rendering strategy (Create React App, Vite SPA marketing pages) typically need a rebuild, not a fix.
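As a sketch of the ISR end-state for a programmatic template in the App Router, assuming a hypothetical `getCityData` fetcher; the point is that the full HTML ships on the first, JavaScript-free crawl pass.

```tsx
// App Router page for a programmatic city template. With no dynamic
// APIs in use this renders statically; `revalidate` regenerates it in
// the background, so Googlebot's HTML-only pass sees the full content.
export const revalidate = 3600; // seconds between background regenerations

// Hypothetical fetcher; in production this would hit a CMS or database.
async function getCityData(city: string) {
  return {
    heading: `Emergency recovery in ${city.toUpperCase()}`,
    intro: `Coverage, response times and pricing for ${city}.`,
  };
}

// Note: in Next.js 15+, `params` arrives as a Promise and must be awaited.
export default async function CityPage(
  { params }: { params: { city: string } }
) {
  const data = await getCityData(params.city);
  return (
    <main>
      <h1>{data.heading}</h1>
      <p>{data.intro}</p>
    </main>
  );
}
```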
CrUX field data is what Google ranks on. Lighthouse lab scores are useful for debugging but are not the scoreboard. We pull the CrUX dataset by template (homepage, category, product, blog, service), surface the 75th-percentile LCP, INP, and CLS, segment by device and country, and prioritise fixes by ranking impact, not lab-score cosmetics. A 2.4s LCP at p75 on mobile is usually a more pressing fix than polishing a 92 Lighthouse score on desktop, even though the lab number is what looks good in a deck.
Depending on the site: Organization, WebSite, Person, Article, Product, SoftwareApplication, Service, OfferCatalog, FAQPage, HowTo, BreadcrumbList, ItemList, Review, AggregateRating, LocalBusiness, Place, Course, JobPosting, Recipe. The point is not coverage for its own sake — it is rich-result eligibility (FAQ, How-to, Product, Review snippets) and entity disambiguation so Google understands what your site is about. We validate every schema against Rich Results Test and Google's actual content rules, not just JSON-LD syntax.
Yes. Hreflang implementation is one of the most commonly broken parts of multi-region sites and one of the highest-leverage to fix. We audit the current implementation (sitemap-based vs link-tag-based vs HTTP-header), surface mismatches and self-reference errors, design a clean architecture that scales (per-region subfolder, ccTLD, or subdomain depending on your CDN and editorial setup), and validate ingestion with crawl-based hreflang checks, since Search Console no longer offers a dedicated International Targeting report. Most multi-region sites have hreflang issues that suppress 10–25% of their international rankings.
Yes — migration protection is one of our core engagements. Replatforms (Magento → Shopify, WordPress → headless, custom → Next.js, etc.) typically lose between 20% and 60% of organic traffic when run without SEO supervision, and that loss is often permanent. We run pre-migration crawls, build a 1:1 URL mapping, design the 301 redirect map, preserve canonical signals, validate schema migrates intact, and run weekly post-launch ranking checks until traffic is fully recovered. Migration projects scope at £4,500–£15,000 fixed-fee depending on URL count.
Three things a normal crawl misses. First: which pages Googlebot actually visits versus which pages it ignores — your crawl budget allocation in reality, not theory. Second: pages crawled but not indexed, the gap between discovery and inclusion that often points to thin-content or canonical issues. Third: orphan pages Googlebot found from external signals (backlinks, sitemaps) that your internal link graph does not surface. We pull 30 days of access logs, parse Googlebot-verified hits only, and produce a written report on crawl-waste and crawl-priority gaps. On sites over 5,000 URLs the findings usually justify the audit cost three times over.
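Once the logs are parsed down to verified Googlebot hits, the comparison itself is a set diff. A minimal sketch, with both input lists standing in for a crawler export and the parsed logs:

```typescript
// URLs the crawler found that Googlebot never visits are crawl-priority
// gaps; URLs Googlebot hits that the internal link graph never surfaces
// are orphans. Inputs would come from a Screaming Frog export and the
// Googlebot-verified log entries.
function diffCrawlAgainstLogs(crawledUrls: string[], loggedUrls: string[]) {
  const crawled = new Set(crawledUrls);
  const logged = new Set(loggedUrls);
  const neverVisited = crawledUrls.filter((u) => !logged.has(u));
  const orphans = loggedUrls.filter((u) => !crawled.has(u));
  return { neverVisited, orphans };
}

const { neverVisited, orphans } = diffCrawlAgainstLogs(
  ["/", "/services", "/blog/post-a"],               // from the site crawl
  ["/", "/blog/post-a", "/old-page-from-backlink"], // from the logs
);
console.log({ neverVisited, orphans });
// -> neverVisited: ["/services"], orphans: ["/old-page-from-backlink"]
```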
Faster than content or links. Typical timelines: rendering and indexability fixes (4–8 weeks to full recrawl and ranking re-evaluation), schema and rich-results (2–6 weeks for SERP feature capture), Core Web Vitals improvements (4–12 weeks as field data updates and Google ranks on the new p75), internal-linking and canonical fixes (4–10 weeks). Migrations and large architectural changes can take 8–16 weeks to fully recover. Anyone promising faster is selling.
Five metrics, all surfaced in a custom dashboard connecting GA4 + Google Search Console + your CRM. (1) Indexed-page count over time, segmented by template — the canonical health signal. (2) Crawl-frequency from log files, by template — Googlebot prioritisation. (3) Rich-results capture rate by template — schema effectiveness. (4) Field Core Web Vitals at p75 — Google's actual ranking signal. (5) Organic-sourced revenue or qualified pipeline — the commercial scoreboard. Vanity rankings-per-keyword dashboards are not on the list.
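As a hedged sketch of the Search Console feed behind such a dashboard, using the googleapis Node client; the property, date range, and credential setup are placeholders, and the GA4 and CRM joins would happen downstream of this query.

```typescript
// Pull organic clicks and impressions by page from the Search Console
// API; rows feed the dashboard's template-level segmentation.
import { google } from "googleapis";

const auth = new google.auth.GoogleAuth({
  scopes: ["https://www.googleapis.com/auth/webmasters.readonly"],
});

async function organicClicksByPage(siteUrl: string): Promise<void> {
  const searchconsole = google.searchconsole({ version: "v1", auth });
  const res = await searchconsole.searchanalytics.query({
    siteUrl,
    requestBody: {
      startDate: "2026-01-01", // placeholder range
      endDate: "2026-01-31",
      dimensions: ["page"],
      rowLimit: 100,
    },
  });
  for (const row of res.data.rows ?? []) {
    console.log(row.keys?.[0], row.clicks, row.impressions);
  }
}

// Domain properties use the sc-domain: prefix.
organicClicksByPage("sc-domain:example.com");
```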
A short introduction, your site URL, and what you’re trying to achieve. If it’s a fit, we’ll book a 30-minute call.