What a Technical SEO Audit Actually Checks (And What You Can Fix Today)
A technical SEO audit isn't a black box. It checks specific, measurable things — and most of the highest-impact fixes take less than a day.
Technical SEO audits have a mystique they don't deserve. Agencies charge $5,000-$15,000 for audits that check the same things every time, produce a 50-page PDF with screenshots from Lighthouse, and recommend the same fixes that haven't changed in five years.
The checks are not complicated. The fixes are usually not expensive. The value of the audit is in prioritization — knowing which fixes produce the most ranking improvement for the least effort. That's what expertise adds, and it doesn't require a 50-page document to communicate.
Here's what a technical SEO audit actually evaluates, in order of impact.
Crawlability: can search engines find your pages?
Before content quality, backlinks, or any other signal matters, search engines have to be able to find and read your pages. Crawlability issues are binary — either the page is accessible or it isn't — and they're the most common source of invisible ranking problems.
Robots.txt. Is your robots.txt blocking pages you want indexed? A single line (Disallow: /blog/) can make your entire content library invisible to search engines. Check yours at yourdomain.com/robots.txt. Every page you want indexed should be accessible. Internal pages (admin, account, API routes) should be blocked.
In 2026, robots.txt also controls AI crawler access. PerplexityBot, GPTBot, ClaudeBot, and Google-Extended each check robots.txt independently. If you're blocking them, your content can't be cited in AI answers. Cloudflare's default bot management settings have been blocking AI crawlers for some users without explicit configuration.
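You can verify this directly with Python's standard-library robots.txt parser. A minimal sketch, where the example.com domain, test URL, and crawler list are placeholders you'd swap for your own:

```python
# Minimal sketch: check whether robots.txt blocks search and AI crawlers
# for a URL you want indexed. Domain, URL, and crawler list are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"
TEST_URL = f"{SITE}/blog/some-post/"  # a page you expect to be crawlable
CRAWLERS = ["Googlebot", "Bingbot", "GPTBot", "ClaudeBot",
            "PerplexityBot", "Google-Extended"]

rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for agent in CRAWLERS:
    allowed = rp.can_fetch(agent, TEST_URL)
    print(f"{agent:16} {'allowed' if allowed else 'BLOCKED'}")
```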
XML Sitemap. Does your sitemap exist at /sitemap.xml? Does it include all your important pages? Is it referenced in robots.txt? Does it return a 200 status? Does it contain only URLs that return 200 (not 404s, redirects, or error pages)?
A sitemap with broken URLs wastes crawl budget and signals poor site maintenance. We audit sitemaps by checking every listed URL against the actual server response — a script that takes five minutes to write and surfaces problems that affect crawl efficiency.
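Here's a minimal sketch of that check, assuming a flat sitemap (not a sitemap index), the requests library, and a placeholder domain:

```python
# Minimal sketch: fetch /sitemap.xml and flag any listed URL that doesn't
# return a clean 200. Redirects are flagged too, since they waste crawl budget.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get(SITEMAP_URL, timeout=10)
resp.raise_for_status()
urls = [loc.text.strip()
        for loc in ET.fromstring(resp.content).findall(".//sm:loc", NS)]

for url in urls:
    r = requests.get(url, timeout=10, allow_redirects=False)
    if r.status_code != 200:
        print(f"{r.status_code}  {url}")  # 404s, 301s, 500s all need fixing
```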
Canonical tags. Every page should have a self-referencing canonical tag (<link rel="canonical" href="...">) that tells search engines which URL is the authoritative version. Without canonicals, search engines may index multiple versions of the same page (www vs. non-www, HTTP vs. HTTPS, with and without trailing slashes) and split the authority between them.
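A small script can confirm the canonical exists and points at the page itself. A rough sketch, assuming the requests library and a placeholder URL; trailing-slash handling here is deliberately loose:

```python
# Minimal sketch: confirm a page declares a self-referencing canonical tag.
from html.parser import HTMLParser
import requests

class CanonicalFinder(HTMLParser):
    canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

url = "https://example.com/services/plumbing/"  # placeholder URL
finder = CanonicalFinder()
finder.feed(requests.get(url, timeout=10).text)

if finder.canonical is None:
    print("No canonical tag found")
elif finder.canonical.rstrip("/") != url.rstrip("/"):
    print(f"Canonical points elsewhere: {finder.canonical}")
else:
    print("Self-referencing canonical OK")
```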
Internal link depth. Can search engines reach every important page within 3 clicks from the homepage? Pages that are 5+ clicks deep get crawled less frequently and rank lower. Orphan pages (no internal links pointing to them) may never get crawled at all.
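Click depth can be measured with a simple breadth-first crawl from the homepage. A minimal sketch, assuming the requests library, a small site, and a placeholder domain; a production crawler would also respect robots.txt, skip non-HTML responses, and handle query strings:

```python
# Minimal sketch: breadth-first crawl from the homepage, recording how many
# clicks each internal page takes to reach. Pages deeper than 3 clicks are
# candidates for better internal linking.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag
import requests

START = "https://example.com/"  # placeholder domain

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

depth = {START: 0}
queue = deque([START])
while queue:
    page = queue.popleft()
    if depth[page] >= 5:          # stop expanding very deep branches
        continue
    parser = LinkExtractor()
    parser.feed(requests.get(page, timeout=10).text)
    for href in parser.links:
        url = urldefrag(urljoin(page, href)).url
        if url.startswith(START) and url not in depth:
            depth[url] = depth[page] + 1
            queue.append(url)

for url, d in sorted(depth.items(), key=lambda kv: -kv[1]):
    if d > 3:
        print(f"{d} clicks  {url}")
```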
Indexing: are your pages in the search index?
A page that's crawlable but not indexed doesn't appear in search results. Google Search Console's "Pages" report shows which pages are indexed and which aren't, along with reasons for non-indexing.
Check for noindex directives. A <meta name="robots" content="noindex"> tag or an X-Robots-Tag: noindex HTTP header tells search engines not to index the page. This is intentional for account pages, thank-you pages, and admin routes. It's a problem when it accidentally appears on content pages — usually from a staging environment configuration that wasn't removed before launch.
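Both places can be checked in one pass. A minimal sketch, assuming the requests library and a placeholder URL; the regex assumes the name attribute appears before content:

```python
# Minimal sketch: flag a page that carries a noindex directive in either
# the X-Robots-Tag header or a robots meta tag.
import re
import requests

url = "https://example.com/blog/some-post/"  # placeholder URL
resp = requests.get(url, timeout=10)

header = resp.headers.get("X-Robots-Tag", "")
meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    resp.text, re.IGNORECASE)

if "noindex" in header.lower():
    print(f"noindex via X-Robots-Tag: {header}")
if meta and "noindex" in meta.group(1).lower():
    print(f"noindex via meta tag: {meta.group(1)}")
```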
Check for duplicate content. Pages with substantially similar content may be consolidated by Google into a single indexed version, with the others excluded. If your service pages share 80% of their content with only the city name changed, Google will index one and ignore the rest. Each page needs genuinely unique content to earn its own index entry.
Check for thin content. Pages with very little content (under 200 words) are increasingly excluded from indexing. A service page with just a heading and a contact form is thin content. Adding a substantive description, FAQ, and relevant details increases the page's indexing probability and ranking potential.
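Both the thin-content and near-duplicate checks reduce to counting and comparing words. A rough sketch, assuming the requests library and placeholder URLs; Jaccard similarity over word sets is a crude proxy for the shared-content problem, not how Google measures it:

```python
# Minimal sketch: strip tags, count words on each page (thin-content check),
# then compare word sets between two pages (near-duplicate check).
import re
import requests

def visible_words(url):
    html = requests.get(url, timeout=10).text
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop scripts/css
    text = re.sub(r"<[^>]+>", " ", html)                       # drop remaining tags
    return re.findall(r"[a-z']+", text.lower())

a = visible_words("https://example.com/plumber-austin/")   # placeholder URLs
b = visible_words("https://example.com/plumber-dallas/")

for name, words in (("page A", a), ("page B", b)):
    if len(words) < 200:
        print(f"{name}: thin content ({len(words)} words)")

overlap = len(set(a) & set(b)) / max(1, len(set(a) | set(b)))
print(f"Word-set similarity: {overlap:.0%}")  # very high overlap suggests duplicates
```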
Page speed and Core Web Vitals
Google uses three Core Web Vitals as ranking signals. The audit checks:
LCP (Largest Contentful Paint): Target under 2.5 seconds. Usually fixable with image optimization and render-blocking script removal.
INP (Interaction to Next Paint): Target under 200ms. Usually fixable with JavaScript deferral and DOM size reduction.
CLS (Cumulative Layout Shift): Target under 0.1. Usually fixable with image dimensions and reserved space for dynamic elements.
Google Search Console reports field data (real user measurements) for these metrics. PageSpeed Insights provides both field and lab data with specific recommendations.
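The PageSpeed Insights API exposes the same field data programmatically. A minimal sketch, assuming the requests library and a placeholder URL; the metric key names reflect the v5 response format as we understand it, and heavier use requires an API key:

```python
# Minimal sketch: pull field (real-user) Core Web Vitals for a URL from the
# PageSpeed Insights API and print the 75th-percentile values.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(API, params={"url": "https://example.com/",  # placeholder
                                 "strategy": "mobile"}, timeout=60)
metrics = resp.json().get("loadingExperience", {}).get("metrics", {})

for key in ("LARGEST_CONTENTFUL_PAINT_MS",      # assumed key names
            "INTERACTION_TO_NEXT_PAINT",
            "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
    m = metrics.get(key)
    if m:
        print(f"{key}: p75={m['percentile']}  ({m['category']})")
    else:
        print(f"{key}: no field data")
```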
The fixes that produce the most improvement per hour of effort: image format conversion (WebP), preconnect hints for external resources, script deferral, and explicit image dimensions.
Structured data
The audit checks:
Presence: Does the site implement JSON-LD structured data? (87.6% of sites don't.)
Types: Which schema types are implemented? At minimum: Organization or Person, Article/BlogPosting on content pages, BreadcrumbList on deeper pages, FAQPage where applicable.
Validity: Does the structured data pass Google's Rich Results Test without errors?
Consistency: Does the schema match the visible page content? Mismatches get penalized.
Completeness: Are optional but valuable properties filled in? (sameAs, hasCredential, knowsAbout, speakable)
Structured data is the single most actionable finding in most audits. Implementation takes hours, and pages with structured data are 3.2x more likely to be cited in AI answers.
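Generating the markup is straightforward. A minimal sketch that emits an Organization block, with placeholder values you'd replace and then validate in the Rich Results Test:

```python
# Minimal sketch: emit an Organization JSON-LD block ready to drop into a
# <script type="application/ld+json"> tag. All values are placeholders.
import json

schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://example.com/",
    "logo": "https://example.com/logo.png",
    "sameAs": [
        "https://www.linkedin.com/company/example-co",
        "https://github.com/example-co",
    ],
}

print(json.dumps(schema, indent=2))
```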
Mobile usability
Google uses mobile-first indexing — the mobile version of your site is what gets crawled and ranked. The audit checks:
Responsive design: Does the layout adapt to mobile viewport widths without horizontal scrolling?
Touch targets: Are buttons and links large enough to tap accurately? (Minimum 48x48px)
Font size: Is body text readable without pinch-zooming? (Minimum 16px)
Viewport configuration: Is the viewport meta tag set correctly?
Content parity: Does the mobile version contain the same content as desktop? Hidden content (behind tabs, accordions, or "read more" that only appears on desktop) may not be indexed.
HTTPS and security
Every page should load over HTTPS with a valid certificate. HTTP pages are flagged as "Not Secure" in Chrome and receive a ranking penalty. Mixed content (HTTPS page loading HTTP resources) generates browser warnings and erodes trust signals.
The audit also checks for security headers: Content-Security-Policy, X-Frame-Options, Strict-Transport-Security. These don't directly affect rankings, but they signal site quality and prevent common attack vectors — which matters for sites in regulated industries where a security incident has compliance consequences.
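A minimal sketch of the HTTPS and header check, assuming the requests library and a placeholder domain:

```python
# Minimal sketch: confirm a site redirects to HTTPS and sends the security
# headers mentioned above.
import requests

resp = requests.get("http://example.com/", timeout=10, allow_redirects=True)

print("Final URL:", resp.url)                     # should start with https://
print("HTTPS:", resp.url.startswith("https://"))

for header in ("Strict-Transport-Security",
               "Content-Security-Policy",
               "X-Frame-Options"):
    print(f"{header}: {resp.headers.get(header, 'MISSING')}")
```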
The priority matrix
Not every finding needs immediate attention. The audit's value is in prioritization. Critical fixes come first: crawlability blockers and accidental noindex directives often take an hour each and produce the largest ranking improvements. Next come the highest-return optimizations, such as structured data and the Core Web Vitals image fixes above. The low-priority findings are nice to have but won't meaningfully change your search performance.
Frequently asked questions about what a technical SEO audit actually checks
How often should I run a technical SEO audit?
Quarterly for a comprehensive audit. Monthly for a quick check of Search Console for new crawl errors, indexing issues, and Core Web Vitals changes. After any major site update (redesign, CMS migration, new section launch), run an immediate audit — migrations are the most common source of technical SEO regressions.
Can I run a technical SEO audit myself?
Yes, with Google Search Console (free), PageSpeed Insights (free), and Google's Rich Results Test (free). These three tools cover crawlability, indexing, page speed, and structured data validation. For deeper analysis, Screaming Frog (free for up to 500 URLs) adds crawl-level detail. You don't need expensive tools to identify the highest-impact issues.
What's the most common technical SEO problem?
Missing or misconfigured structured data. 87.6% of websites don't implement schema markup, which means they're invisible to the systems that power rich results and AI citations. The second most common: unoptimized images causing slow LCP scores.
How much does a technical SEO audit cost?
Agency audits range from $2,000 to $15,000 depending on site complexity. The checks themselves aren't complex — the value of a paid audit is in the expertise that prioritizes fixes by business impact and catches issues that automated tools miss. For most small to mid-market businesses, a quarterly self-audit using free tools plus an annual professional review is a cost-effective approach.