SEO Site Auditing: How to Fix Your Site and Rank Higher in 2026
Last updated: March 2026
Most small business websites have a handful of technical problems quietly dragging down their search rankings. Not catastrophic failures — just the accumulated debris of a site that's grown over time without anyone systematically checking it: a few broken links, some missing title tags, a page or two blocked from indexing by accident, images that were never given alt text. None of these things announce themselves. Your site just performs worse than it should, and the cause is invisible until you look.
An SEO site audit is the process of systematically checking your site for these problems and fixing them in order of impact. It doesn't require an agency, and it doesn't require expensive software. What it requires is a clear process and a crawler tool that can surface issues across your whole site in a single pass. This article walks through that process from start to finish.
Why run a technical SEO audit at all?
The argument for auditing is simple: you can't fix problems you don't know about. Sites that haven't been audited recently tend to accumulate issues gradually — a redirect that was supposed to be temporary and never got cleaned up, a noindex tag left on a page from when it was in draft, a title tag that got overwritten when a theme was updated. Each issue is minor in isolation. Together they add up.
The right time to run a technical SEO audit is after any significant site change — a redesign, a CMS migration, a content restructure — and then quarterly as a maintenance check. Catching a broken redirect chain or an orphaned page a week after it's introduced takes five minutes to fix. Finding it twelve months later, after Google has been encountering it on every crawl, takes the same five minutes to fix but you've lost the traffic in the meantime.
For new sites or sites returning to active SEO after a long break, a full audit before doing anything else is the highest-leverage starting point. There's no point building new content on top of a foundation with crawlability problems.
Start with crawlability — can Google reach your pages?
Before checking titles, content, or speed, confirm that Google can actually access your pages. A misconfigured robots.txt or a stray noindex tag can silently exclude entire sections of your site from search results, making everything else irrelevant.
Check your robots.txt file at yourdomain.com/robots.txt. It should allow Google to crawl your important pages. A Disallow: / rule left over from a staging environment is one of the more painful SEO mistakes to diagnose because the site looks completely normal in a browser — it's only crawlers that are blocked.
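If you'd rather verify this programmatically, Python's standard library can parse a robots.txt file and tell you whether a given path is blocked for a given crawler. A minimal sketch, using a made-up set of leftover staging rules as the example:

```python
from urllib.robotparser import RobotFileParser

def is_blocked_for_googlebot(robots_txt: str, path: str) -> bool:
    """Return True if the robots.txt rules disallow this path for Googlebot."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch("Googlebot", path)

# A Disallow: / rule left over from a staging environment blocks everything:
staging_rules = "User-agent: *\nDisallow: /\n"
print(is_blocked_for_googlebot(staging_rules, "/services/"))  # True
```

In practice you'd fetch the live file from yourdomain.com/robots.txt first and pass its text in; the point is that the check takes seconds once scripted.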
Check that you have a sitemap.xml, that it lists your current pages, and that it's submitted in Google Search Console. While you're in Search Console, check the Page indexing report (formerly called Coverage) for any pages marked as excluded, blocked, or returning errors. This is first-party data from Google about what it can and can't access on your site.
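Comparing your sitemap against what's indexed starts with pulling the URL list out of the sitemap file. A rough sketch using an invented sample sitemap; in practice you'd fetch yourdomain.com/sitemap.xml first and pass its contents in:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services/</loc></url>
</urlset>"""
print(sitemap_urls(sample))
```

Diffing this list against the pages Search Console reports as indexed quickly shows which listed pages Google is skipping.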
Orphan pages — pages with no internal links pointing to them — are also worth finding. Google discovers pages by following links, so a page that nothing links to may never be crawled regardless of how good its content is. These often show up as campaign landing pages or old content that was forgotten rather than formally removed.
Quick check: type site:yourdomain.com into Google. The number of results gives a rough picture of how many pages are indexed. If it's significantly lower than the number of pages you know exist, something is blocking Google from reaching them.
Technical SEO checks: the foundation layer
Once you've confirmed Google can reach your pages, work through the technical layer. These are the issues that affect how efficiently Google can crawl and understand your site, independent of the content on it.
HTTPS should be standard at this point, but check that your site is fully served over HTTPS with no mixed content warnings — HTTP resources loading on HTTPS pages. These often appear after a migration and can trigger browser security warnings that damage user trust. Check your browser's developer tools on key pages, or use a crawler that flags mixed content automatically.
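If you want to spot-check a page's source yourself, a rough heuristic is to scan the HTML for http:// URLs inside resource-loading tags. A sketch on invented sample HTML; a real crawler would use a proper HTML parser rather than a regex, and this deliberately ignores plain anchor links, which are not mixed content:

```python
import re

# Only tags that load subresources count as mixed content on an HTTPS page.
MIXED_RE = re.compile(
    r'<(?:img|script|link|iframe)\b[^>]*?(?:src|href)\s*=\s*["\'](http://[^"\']+)["\']',
    re.IGNORECASE,
)

def find_mixed_content(html: str) -> list[str]:
    """Return insecure http:// resource URLs referenced in the page HTML."""
    return MIXED_RE.findall(html)

page = '<img src="http://example.com/logo.png"><script src="https://cdn.example.com/app.js"></script>'
print(find_mixed_content(page))  # ['http://example.com/logo.png']
```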
For site speed, run your important pages through Google's PageSpeed Insights. The tool gives you a performance score and a prioritised list of specific improvements, with estimates of how much impact each fix would have. Focus on the "opportunities" section first — these are the changes with measurable performance gains. Large unoptimised images are the most common culprit on small business sites and usually the quickest to address.
Core Web Vitals — Google's page experience signals covering loading performance, interactivity, and visual stability — are now a confirmed ranking factor. PageSpeed Insights shows your scores for each. Cumulative Layout Shift (the visual instability metric) is often caused by images without explicit width and height attributes, which is a straightforward fix once you know which pages are affected.
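Finding those images is mechanical enough to script. A rough sketch that flags img tags without explicit dimensions, on invented sample HTML; a production crawler would parse the DOM properly rather than pattern-match:

```python
import re

IMG_RE = re.compile(r"<img\b[^>]*>", re.IGNORECASE)

def imgs_missing_dimensions(html: str) -> list[str]:
    """Return <img> tags lacking an explicit width or height attribute,
    a common cause of layout shift while the page loads."""
    return [
        tag for tag in IMG_RE.findall(html)
        if "width=" not in tag.lower() or "height=" not in tag.lower()
    ]

html = '<img src="/hero.jpg"><img src="/logo.png" width="120" height="40">'
print(imgs_missing_dimensions(html))  # ['<img src="/hero.jpg">']
```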
Redirect chains are worth checking if your site has gone through any structural changes or URL reorganisation. A chain of two or three redirects before reaching the final destination adds latency and dilutes link equity. The fix is simple — update the chain so the original URL redirects directly to the final destination in one hop.
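The hop-counting logic behind a redirect-chain check is simple to sketch. Here the redirect targets are simulated with a plain dictionary standing in for Location headers; a real check would issue HEAD requests and read each 3xx response instead:

```python
from urllib.parse import urljoin

def trace_redirects(start_url: str, redirect_map: dict[str, str], max_hops: int = 10) -> list[str]:
    """Follow a chain of redirects and return every URL visited, in order.
    redirect_map simulates Location headers for this offline sketch."""
    chain = [start_url]
    url = start_url
    while url in redirect_map and len(chain) <= max_hops:
        url = urljoin(url, redirect_map[url])  # resolve relative Locations too
        chain.append(url)
    return chain

# A three-hop chain that should be collapsed into one direct redirect:
hops = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/new",
    "https://example.com/new": "https://example.com/final/",
}
print(trace_redirects("http://example.com/old", hops))
```

Any chain longer than two entries (origin plus destination) is a candidate for flattening into a single 301.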
Canonical tags tell Google which version of a page is authoritative. If your site is accessible at both www and non-www, or if URL parameters create near-duplicate versions of pages, canonical tags prevent Google from treating them as competing pages.
On-page SEO audit: titles, headings, and meta data
The on-page SEO audit covers the elements that directly affect how Google reads and categorises each page. These are also the issues most likely to have accumulated without anyone noticing.
Every page needs a unique, descriptive title tag. Missing titles let Google substitute whatever it decides is most relevant — which is rarely as good as a title you'd write yourself. Duplicate titles across multiple pages confuse Google about which page should rank for a given query. Length matters too: titles beyond roughly 50–60 characters tend to be truncated in search results. A crawler will find all three problems across your entire site in a single pass; checking manually page by page isn't practical on anything larger than a handful of pages.
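The core of that crawler check is straightforward once pages have been collected. A sketch of the three title checks over a small invented set of crawled pages, mapping URL to raw HTML:

```python
import re
from collections import defaultdict

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def audit_titles(pages: dict[str, str], max_len: int = 60) -> dict[str, list]:
    """Report missing, over-length, and duplicate title tags across pages."""
    report = {"missing": [], "too_long": [], "duplicate": []}
    seen = defaultdict(list)
    for url, html in pages.items():
        m = TITLE_RE.search(html)
        if not m or not m.group(1).strip():
            report["missing"].append(url)
            continue
        title = m.group(1).strip()
        if len(title) > max_len:
            report["too_long"].append(url)
        seen[title].append(url)
    # Any title shared by two or more URLs is a duplicate group.
    report["duplicate"] = [urls for urls in seen.values() if len(urls) > 1]
    return report

pages = {
    "/a": "<title>Plumbing Services in Leeds</title>",
    "/b": "<title>Plumbing Services in Leeds</title>",
    "/c": "<html><body>No title here</body></html>",
}
print(audit_titles(pages))
```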
Meta descriptions don't directly affect rankings but they affect click-through rates. A well-written meta description tells a searcher exactly what they'll find. A missing one means Google pulls arbitrary text from your content, often something useless. Check for both missing and duplicate meta descriptions.
Every page should have exactly one H1 tag — the main declared topic of the page. Multiple H1s on a single page, or a missing H1, are both common and both worth fixing. Check also for skipped heading levels (jumping from H1 to H3 with no H2 in between) — these affect accessibility and indicate loose page structure.
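Both heading checks reduce to reading the sequence of heading levels on a page. A minimal sketch, regex-based for brevity; a real crawler would parse the DOM:

```python
import re

H_RE = re.compile(r"<h([1-6])\b", re.IGNORECASE)

def heading_issues(html: str) -> list[str]:
    """Flag missing or multiple H1s and skipped heading levels on one page."""
    levels = [int(n) for n in H_RE.findall(html)]
    issues = []
    h1_count = levels.count(1)
    if h1_count == 0:
        issues.append("no H1")
    elif h1_count > 1:
        issues.append(f"{h1_count} H1 tags")
    # A jump of more than one level (e.g. H1 straight to H3) is a skip.
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            issues.append(f"skipped level: H{prev} to H{cur}")
    return issues

print(heading_issues("<h1>Main</h1><h3>Sub</h3>"))  # ['skipped level: H1 to H3']
```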
Image alt text is one of the most consistently neglected on-page elements. Every meaningful image should have a descriptive alt attribute — it helps search engines understand image content and it's required for basic accessibility compliance. A site crawl will give you a complete list of images missing alt text across all pages.
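The check itself is simple to sketch. This flags img tags with no alt attribute, on invented sample HTML; note that an intentionally empty alt="" is acceptable for purely decorative images, though this rough version flags it too:

```python
import re

IMG_RE = re.compile(r"<img\b[^>]*>", re.IGNORECASE)
# Requires a non-empty alt value; alt="" is also flagged, which is fine to
# review manually since empty alt is legitimate for decorative images.
ALT_RE = re.compile(r'\balt\s*=\s*["\'][^"\']+["\']', re.IGNORECASE)

def imgs_missing_alt(html: str) -> list[str]:
    """Return <img> tags with no alt attribute or an empty one."""
    return [tag for tag in IMG_RE.findall(html) if not ALT_RE.search(tag)]

html = '<img src="/team.jpg"><img src="/van.jpg" alt="Company van outside the office">'
print(imgs_missing_alt(html))  # ['<img src="/team.jpg">']
```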
On internal links: check not just that internal links exist, but that important pages are linked from multiple relevant pages with descriptive anchor text. A page linked from only one other page, or consistently linked with anchor text like "click here", is underserving its potential. Internal links are the easiest lever you control for distributing authority across your site.
Content audit: thin pages and duplicate content
Content issues are slower to fix than technical ones but often have more long-term impact on rankings. Two things to focus on.
Thin content — pages with very little substantive copy — rarely rank well for competitive queries. A contact page or a simple utility page can be short by nature. But a service page or product page with 80 words and nothing else is unlikely to satisfy a searcher's intent well enough to rank. Identify your thin pages and either build them out with genuinely useful content, consolidate them into a better page, or noindex them if they serve no organic search purpose.
Duplicate content — multiple pages covering essentially the same topic with the same or very similar copy — dilutes your site's authority across competing versions rather than concentrating it on one strong page. This often happens with product variants, location pages generated from a template, or old content that got republished without the original being removed. The fix is usually canonical tags pointing to the preferred version, or a 301 redirect consolidating traffic and link equity onto the authoritative page.
Tools for running a free SEO site audit
Google Search Console is free, essential, and the single most important tool for understanding how Google actually sees your site. The Page indexing, Performance, and Core Web Vitals reports cover a significant portion of a technical audit by themselves. Set it up before anything else if it isn't already running.
Screaming Frog SEO Spider is the industry-standard desktop crawler. The free tier crawls up to 500 URLs — enough for a complete audit of most small sites. It's powerful and thorough, with a complex interface that takes some time to learn. Worth using if you're comfortable with a technical tool.
Tom's Site Auditor is the free website audit tool I built for this exact workflow — a Windows desktop crawler that runs entirely offline, checks 30+ issue types, and generates a self-contained HTML report you can open in any browser or share with a client or developer. The free trial crawls up to 10,000 pages with no time limit, which covers most small business sites completely. It's a one-time purchase rather than a subscription. The user guide covers interpreting every issue type the tool flags.
PageSpeed Insights (pagespeed.web.dev) for speed and Core Web Vitals, Bing Webmaster Tools for a second data source on indexing and keywords, and Google's Rich Results Test for structured data validation round out the free toolkit. Between these and Search Console, you can run a thorough free SEO site audit without spending anything.
What to fix first — prioritising your findings
A full audit on a site that hasn't been checked in a while will surface more issues than you can fix in an afternoon. That's normal. The discipline is working through them in the right order rather than trying to address everything at once.
Fix crawlability blockers first — anything preventing Google from reaching or indexing pages that should be indexed. These have the highest impact and are usually quick to resolve once you know they exist. Broken links, missing titles, and duplicate page issues come next. These affect rankings directly and tend to be straightforward to fix. Work through alt text, thin content, and meta description issues after that — important, but not urgent.
The HTML report generated by Tom's Site Auditor organises findings by issue type and severity, which makes this triage straightforward. If you have the paid bundle, the included Site Fixer tool can apply bulk fixes for common issues — missing meta descriptions, alt text, canonical tags — directly on your server files.
Tracking results after an audit
After working through your fixes, give Google a few weeks to re-crawl the affected pages before drawing conclusions. Search Console's Performance report is the right place to track organic clicks, impressions, and average position over time. Set a baseline from the week before you started fixing, then compare at the 30 and 60-day marks.
The metrics worth watching: organic clicks and impressions (are more pages appearing in results?), average position for your target keywords, and the Page indexing report (are previously excluded pages now indexed?). Don't expect overnight changes — Google re-crawls at its own pace and ranking shifts take time to reflect in the data.
The more important habit is making audits a regular part of your workflow rather than a one-time task. A quarterly crawl catches regressions before they compound. Running one after any significant site change — a redesign, a content restructure, adding a new section — catches problems while they're still fresh and easy to fix.
If you use the keyword mining feature in Tom's Site Auditor, you can track position changes for specific keywords over time alongside your audit data — useful for connecting technical fixes to ranking movements.
Run your first audit today
The process outlined here covers everything that matters for a small business site: crawlability, technical signals, on-page elements, content quality, and ongoing tracking. None of it requires paid subscriptions or agency involvement. What it requires is running the tools and working through the findings methodically.
If you haven't audited your site recently, the most useful thing you can do right now is run a crawl and see what comes back. Tom's Site Auditor is free to try on sites up to 10,000 pages — download it, point it at your domain, and you'll have a complete issue list in a few minutes.