A technical SEO audit analyzes the technical aspects of a website related to search engine optimization. It ensures search engines like Google can crawl, index, and rank pages on your site.
In a technical SEO audit, you'll look at (and fix) things that could:
- Slow down your site
- Make it hard for search engines to understand your content
- Make it hard for your pages to appear in search results
- Affect how users interact with your site on different devices
- Impact your site's security
- Create duplicate content issues
- Cause navigation problems for users and search engines
- Prevent important pages from being found
Identifying and fixing such technical issues helps search engines better understand and rank your content. That can mean improved organic search visibility and traffic over time.
How to Perform a Technical SEO Audit
You’ll need two main tools for a technical site audit:
- Google Search Console
- A crawl-based tool, like Semrush’s Site Audit
If you haven't used Search Console before, check out our beginner's guide. We’ll discuss the tool’s various reports below.
And if you’re new to Site Audit, sign up for a free account to follow along with this guide.
The Site Audit tool scans your website and provides data about every page it crawls. The report it generates shows you a variety of technical SEO issues.
In a dashboard like this:

To set up your first crawl, create a project.

Next, head to the Site Audit tool and select your domain.

The “Site Audit Settings” window will pop up. Here, configure the basics of your first crawl. Follow this detailed setup guide for help.

Finally, click “Start Site Audit.”

After the tool crawls your site, it generates an overview of your site's health.

This metric grades your website health on a scale from 0 to 100. And shows how you compare with other sites in your industry.
Your site issues are ordered by severity through the “Errors,” “Warnings,” and “Notices” categories. Or focus on specific areas of technical SEO with “Thematic Reports.”

Toggle to the “Issues” tab to see a complete list of all site issues. Along with the number of affected pages.

Each issue includes a “Why and how to fix it” link.

The issues you find here will fit into one of two categories, depending on your skill level:
- Issues you can fix on your own
- Issues a developer or system administrator might need to help you fix
Conduct a technical SEO audit on any new site you work with. Then, audit your site at least once per quarter (ideally monthly). Or whenever you see a decline in rankings.
1. Spot and Fix Crawlability and Indexability Issues
Crawlability and indexability are a crucial aspect of SEO. Because Google and other search engines must be able to crawl and index your webpages in order to rank them.
Google's bots crawl your site by following links to find pages. They read your content and code to understand each page.
Google then stores this information in its index, a massive database of web content.
When someone performs a Google search, Google checks its index to return relevant results.

To check if your site has any crawlability or indexability issues, go to the “Issues” tab in Site Audit.
Then, click “Category” and select “Crawlability.”

Repeat this process with the “Indexability” category.
Issues connected to crawlability and indexability will often sit at the top of the results in the “Errors” section, because they tend to be more serious. We'll cover several of these issues.

Now, let’s look at two important website files, robots.txt and sitemap.xml, which have a huge impact on how search engines discover your site.
Spot and Fix Robots.txt Issues
Robots.txt is a website text file that tells search engines which pages they should or shouldn’t crawl. It can usually be found in the root folder of the site: https://domain.com/robots.txt.
A robots.txt file helps you:
- Point search engine bots away from private folders
- Keep bots from overwhelming server resources
- Specify the location of your sitemap
A single line of code in robots.txt can prevent search engines from crawling your entire site. Make sure your robots.txt file doesn't disallow any folder or page you want to appear in search results.
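For reference, here's a minimal robots.txt sketch (the folder name and sitemap URL are placeholders):

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://domain.com/sitemap.xml

Be careful with broad rules: a lone "Disallow: /" would block crawlers from your entire site.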
To check your robots.txt file, open Site Audit and scroll down to the “Robots.txt Updates” box at the bottom.

Here, you'll see if the crawler has detected the robots.txt file on your website.
If the file status is “Available,” review your robots.txt file by clicking the link icon next to it.
Or, focus only on the robots.txt file changes since the last crawl by clicking the “View changes” button.

Further reading: Reviewing and fixing the robots.txt file requires technical knowledge. Always follow Google's robots.txt guidelines. Read our guide to robots.txt to learn about its syntax and best practices.
To find further issues, open the “Issues” tab and search for “robots.txt.”

Some issues include:
- Robots.txt file has format errors: Your robots.txt file might have mistakes in its setup. This could accidentally block important pages from search engines or allow access to private content you don't want shown.
- Sitemap.xml not indicated in robots.txt: Your robots.txt file doesn't mention where to find your sitemap. Adding this information helps search engines find and understand your site structure more easily.
- Blocked internal resources in robots.txt: You might be blocking important files (like CSS or JavaScript) that search engines need to properly view and understand your pages. This can hurt your search rankings.
- Blocked external resources in robots.txt: Resources from other websites that your site uses (like CSS, JavaScript, and image files) might be blocked. This can prevent search engines from fully understanding your content.
Click the link highlighting the found issues.

Inspect them in detail to learn how to fix them.

Further reading: Besides the robots.txt file, there are two other ways to provide instructions for search engine crawlers: the robots meta tag and the x-robots tag. Site Audit will alert you to issues related to these tags. Learn how to use them in our guide to robots meta tags.
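For reference, the robots meta tag sits in a page's <head>, while the x-robots tag is sent as an HTTP response header. A noindex directive looks like this in each form:

<meta name="robots" content="noindex, nofollow">
X-Robots-Tag: noindex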
Spot and Fix XML Sitemap Issues
An XML sitemap is a file that lists all the pages you want search engines to index and rank.
Review your XML sitemap during every technical SEO audit to ensure it includes all pages you want to rank.
Also check that the sitemap doesn’t include pages you don’t want in the SERPs. Like login pages, customer account pages, or gated content.
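For reference, a minimal sitemap.xml file looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/children/girls/footwear/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>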
Next, check whether your sitemap works correctly.
The Site Audit tool can detect common sitemap-related issues, such as:
- Format errors: Your sitemap has mistakes in its setup. This could confuse search engines, causing them to ignore your sitemap entirely.
- Incorrect pages found: You've included pages in your sitemap that shouldn't be there, like duplicate content or error pages. This can waste your crawl budget and confuse search engines.
- File is too large: Your sitemap is bigger than search engines prefer. This might lead to incomplete crawling of your site.
- HTTP URLs in sitemap.xml for HTTPS site: Your sitemap lists unsecure versions of your pages on a secure site. This mismatch could mislead search engines.
- Orphaned pages: You've included pages in your sitemap that aren't linked from anywhere else on your site. This could waste the crawl budget on potentially outdated or unimportant pages.
To find and fix these issues, go to the “Issues” tab and type “sitemap” in the search field:

You can also use Google Search Console to identify sitemap issues.
Visit the “Sitemaps” report to submit your sitemap to Google, view your submission history, and review any errors.
Find it by clicking “Sitemaps” under the “Indexing” section.

If you see “Success” listed next to your sitemap, there are no errors. But the other two statuses, “Has errors” and “Couldn’t fetch,” indicate a problem.

If there are issues, the report will flag them individually. Follow Google's troubleshooting guide to fix them.
Further reading: If your site doesn't have a sitemap.xml file, read our guide on how to create an XML sitemap.
2. Audit Your Site Architecture
Site architecture refers to the hierarchy of your webpages and how they are connected through links. Organize your website so it’s logical for users and easy to maintain as your website grows.
Good site architecture is important for two reasons:
- It helps search engines crawl and understand the relationships between your pages
- It helps users navigate your site
Let's consider three key aspects of site architecture. And how to analyze them with the technical SEO audit tool.
Site Hierarchy
Site hierarchy (or site structure) is how your pages are organized into subfolders.
To understand your site's hierarchy, navigate to the “Crawled Pages” tab in Site Audit.

Then, switch the view to “Site Structure.”

You’ll see your website’s subdomains and subfolders. Review them to make sure the hierarchy is organized and logical.
Aim for a flat site architecture, which looks like this:

Ideally, it should only take a user three clicks to find the page they want from your homepage.
When it takes more than three clicks to navigate your site, its hierarchy is too deep. Search engines consider pages deep in the hierarchy to be less important or relevant to a search query.
To ensure all your pages fulfill this requirement, stay within the “Crawled Pages” tab and switch back to the “Pages” view.

Then, click “More filters” and select the following parameters: “Crawl Depth” is “4+ clicks.”

To fix this issue, add internal links to pages that are too deep in the site’s structure.
Navigation
Your site's navigation (like menus, footer links, and breadcrumbs) should make it easier for users to navigate your site.
This is an important pillar of good website architecture.
Your navigation should be:
- Simple. Try to avoid mega menus or non-standard names for menu items (like “Idea Lab” instead of “Blog”)
- Logical. It should reflect the hierarchy of your pages. A great way to achieve this is to use breadcrumbs.
Breadcrumbs are a secondary navigation that shows users their current location on your site. They often appear as a row of links at the top of a page. Like this:

Breadcrumbs help users understand your site structure and easily move between levels. This improves both user experience and SEO.
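If you build breadcrumbs by hand, a minimal HTML sketch could look like this (the labels and URLs are illustrative):

<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/children/">Children</a> &gt;
  <a href="/children/girls/">Girls</a> &gt;
  <span aria-current="page">Footwear</span>
</nav>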
No tool can help you create user-friendly menus. You need to review your website manually and follow UX best practices for navigation.
URL Structure
Like a website’s hierarchy, a site’s URL structure should be consistent and easy to follow.
Let's say a website visitor follows the menu navigation for girls’ shoes:
Homepage > Children > Girls > Footwear
The URL should mirror the architecture: domain.com/children/girls/footwear
Some sites should also consider using a URL structure that shows a page or website is relevant to a specific country. For example, a website for Canadian users of a product may use either “domain.com/ca” or “domain.ca.”
Lastly, make sure your URL slugs are user-friendly and follow best practices.
Site Audit identifies common issues with URLs, such as:
- Use of underscores in URLs: Using underscores (_) instead of hyphens (-) in your URLs can confuse search engines. They might see words connected by underscores as a single word, potentially affecting your rankings. For example, "blue_shoes" could be read as "blueshoes" instead of "blue shoes".
- Too many parameters in URLs: Parameters are URL elements that come after a question mark, like "?color=blue&size=large". They help with tracking. Having too many can make your URLs long and confusing, both for users and search engines.
- URLs that are too long: Some browsers might have trouble processing URLs that exceed 2,000 characters. Short URLs are also easier for users to remember and share.

3. Fix Internal Linking Issues
Internal links point from one page to another within your domain.
Internal links are an essential part of good website architecture. They distribute link equity (also known as “link juice” or “authority”) across your site, which helps search engines identify important pages.
As you improve your site’s structure, check the health and status of its internal links.
Refer back to the Site Audit report and click “View details” under your “Internal Linking” score.

In this report, you’ll see a breakdown of your site's internal link issues.

Broken internal links (links that point to pages that no longer exist) are a common internal linking mistake. And they're fairly easy to fix.
Click the number of issues in the “Broken internal links” error on your “Internal Link Issues” report. And manually update the broken links in the list.

Another easy fix is orphaned pages. These are pages with no links pointing to them, which means you can't reach them via any other page on the same website.
Check the “Internal Links” bar graph to look for pages with zero links.

Add at least one internal link to each of these pages.
Use the “Internal Link Distribution” graph to see the distribution of your pages according to their Internal LinkRank (ILR).
ILR shows how strong a page is in terms of internal linking. The closer to 100, the stronger the page.

Use this metric to learn which pages could benefit from additional internal links. And which pages you can use to distribute more link equity across your domain.
But don’t keep fixing issues that could have been avoided. Follow these internal linking best practices to avoid issues in the future:
- Make internal linking part of your content creation strategy
- Every time you create a new page, link to it from existing pages
- Don’t link to URLs that have redirects (link to the redirect destination instead)
- Link to relevant pages and use relevant anchor text
- Use internal links to show search engines which pages are important
- Don't use too many internal links (use common sense here; a standard blog post likely doesn't need 50 internal links)
- Learn about nofollow attributes and use them correctly (see the sketch below)
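As a quick illustration of the anchor text and nofollow points, here's how such links might look in HTML (the URLs and anchor text are made up):

<a href="/technical-seo-audit/">technical SEO audit checklist</a>
<a href="https://example.com/partner-offer/" rel="nofollow">partner offer</a>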
4. Spot and Fix Duplicate Content Issues
Duplicate content means multiple webpages contain identical or nearly identical content.
It can lead to several problems, including:
- SERPs displaying an incorrect version of your page
- The most relevant pages not performing well in SERPs
- Indexing problems on your site
- Splitting your page authority between duplicate versions
- Increased difficulty in tracking your content's performance
Site Audit flags pages as duplicate content if their content is at least 85% identical.

Duplicate content can happen for two common reasons:
- There are multiple versions of URLs
- There are pages with different URL parameters
Multiple Versions of URLs
For example, a site may have:
- An HTTP version
- An HTTPS version
- A www version
- A non-www version
For Google, these are different versions of the site. So if your page runs on more than one of these URLs, Google considers it a duplicate.
To fix this issue, select a preferred version of your site and set up a sitewide 301 redirect. This will ensure only one version of each page is accessible.
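As a sketch, here's how a sitewide 301 redirect to the HTTPS non-www version might look in an Apache .htaccess file (assuming your server runs Apache with mod_rewrite enabled; the domain is a placeholder, and nginx or other servers use different syntax):

RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://domain.com%{REQUEST_URI} [L,R=301]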
URL Parameters
URL parameters are extra elements of a URL used to filter or sort website content. They're commonly used for product pages with slight changes (e.g., different color variations of the same product).
You can identify them by the question mark and equals sign.

Because URLs with parameters have almost the same content as their counterparts without parameters, they can often be identified as duplicates.
Google usually groups these pages and tries to pick the best one to show in search results, consolidating ranking signals from the duplicate versions.
Nevertheless, Google recommends these actions to reduce potential problems:
- Reduce unnecessary parameters
- Use canonical tags pointing to the URLs with no parameters (see the example below)
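For example, a parameterized product page could declare its clean counterpart as canonical. On https://domain.com/shoes/?color=blue (a placeholder URL), the <head> would include:

<link rel="canonical" href="https://domain.com/shoes/" />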
Avoid crawling pages with URL parameters when setting up your SEO audit. This ensures the Site Audit tool only crawls pages you want to analyze, not their versions with parameters.
Customize the “Remove URL parameters” section by listing all the parameters you want to ignore:

To access these settings later, click the settings (gear) icon in the top-right corner, then click “Crawl sources: Website” under the Site Audit settings.

5. Audit Your Site Performance
Site speed is a crucial aspect of the overall page experience and has long been a Google ranking factor.
When you audit a site for speed, consider two data points:
- Page speed: How long it takes one webpage to load
- Site speed: The average page speed for a sample set of page views on a site
Improve page speed, and your site speed improves.
This is such an important task that Google has a tool specifically made to address it: PageSpeed Insights.

A handful of metrics influence PageSpeed scores. The three most important ones are called Core Web Vitals.
They include:
- Largest Contentful Paint (LCP): measures how fast the main content of your page loads
- Interaction to Next Paint (INP): measures how quickly your page responds to user interactions
- Cumulative Layout Shift (CLS): measures how visually stable your page is

PageSpeed Insights provides details and opportunities to improve your page in four main areas:
- Performance
- Accessibility
- Best Practices
- SEO

But PageSpeed Insights can only analyze one URL at a time. To get the sitewide view, use Semrush's Site Audit.
Head to the “Issues” tab and select the “Site Performance” category.
Here, you can see all the pages a specific issue affects, like slow load speed.

There are also two detailed reports dedicated to performance: the “Site Performance” report and the “Core Web Vitals” report.
Access both from the Site Audit Overview.

The “Site Performance” report provides an additional “Site Performance Score.” Plus a breakdown of your pages by their load speed and other useful insights.

The Core Web Vitals report will break down your Core Web Vitals metrics based on 10 URLs. Track your performance over time with the “Historical Data” graph.
Or edit your list of analyzed pages so the report covers various types of pages on your site (e.g., a blog post, a landing page, and a product page).
Click “Edit list” in the “Analyzed Pages” section.

Further reading: Site performance is a broad topic and one of the most important aspects of technical SEO. To learn more about it, check out our page speed guide, as well as our detailed guide to Core Web Vitals.
6. Discover Mobile-Friendliness Issues
As of January 2024, more than half (60.08%) of web traffic happens on mobile devices.
And Google primarily indexes the mobile version of all websites over the desktop version. (Known as mobile-first indexing.)
So make sure your website works perfectly on mobile devices.
Use Google’s Mobile-Friendly Test to quickly check mobile usability for specific URLs.
And use Semrush to check two important aspects of mobile SEO: the viewport meta tag and AMPs.
Just select the “Mobile SEO” category in the “Issues” tab of the Site Audit tool.

A viewport meta tag is an HTML tag that helps you scale your page to different screen sizes. It automatically adjusts the page size based on the user’s device when you have a responsive design.
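The tag sits in the page's <head> and typically looks like this:

<meta name="viewport" content="width=device-width, initial-scale=1">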
Another way to improve site performance on mobile devices is to use Accelerated Mobile Pages (AMPs), which are stripped-down versions of your pages.
AMPs load quickly on mobile devices because Google runs them from its cache rather than sending requests to your server.
If you use AMPs, audit them regularly to make sure you’ve implemented them correctly to boost your mobile visibility.
Site Audit will test your AMPs for various issues divided into three categories:
- AMP HTML issues
- AMP style and layout issues
- AMP templating issues
7. Spot and Fix Code Issues
Regardless of what a webpage looks like to human eyes, search engines only see it as a bunch of code.
So, it’s important to use proper syntax. And relevant tags and attributes that help search engines understand your site.
During your technical SEO audit, monitor different parts of your website's code and markup. Including HTML (which includes various tags and attributes), JavaScript, and structured data.
Let’s dig into these.
Meta Tag Issues
Meta tags are text snippets that provide search engine bots with additional data about a page’s content. These tags are present in your page’s header as a piece of HTML code.
We've already covered the robots meta tag (related to crawlability and indexability) and the viewport meta tag (related to mobile-friendliness).
You should understand two other types of meta tags:
- Title tag: Indicates the title of a page. Search engines use title tags to form the clickable blue link in the search results. Read our guide to title tags to learn more.
- Meta description: A brief description of a page. Search engines use it to form the snippet of a page in the search results. Although not directly tied to Google’s ranking algorithm, a well-optimized meta description has other potential SEO benefits, like improving click-through rates and making your search result stand out from competitors.
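In a page's HTML, the two tags look like this (the text is illustrative):

<head>
  <title>Girls' Footwear | Domain</title>
  <meta name="description" content="Shop our range of girls' shoes, from school wear to trainers, with free shipping on all orders.">
</head>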

To see issues related to meta tags in your Site Audit report, select the “Meta tags” category in the “Issues” tab.

Here are some common meta tag issues you might find:
- Missing title tags: A page without a title tag may be seen as low quality by search engines. You're also missing an opportunity to tell users and search engines what your page is about.
- Duplicate title tags: When multiple pages have the same title, it's hard for search engines to determine which page is most relevant for a search query. This can hurt your rankings.
- Title tags that are too long: If your title exceeds 70 characters, it might get cut off in search results. This looks unappealing and might not convey your full message.
- Title tags that are too short: Titles with 10 characters or fewer don't provide enough information about your page. This limits your ability to rank for different keywords.
- Missing meta descriptions: Without a meta description, search engines might use random text from your page as the snippet in search results. This could be unappealing to users and reduce click-through rates.
- Duplicate meta descriptions: When multiple pages have the same meta description, you're missing chances to use relevant keywords and differentiate your pages. This can confuse both search engines and users.
- Pages with a meta refresh tag: This outdated technique can cause SEO and usability issues. Use proper redirects instead.
Canonical Tag Issues
Canonical tags are used to point out the “canonical” (or “main”) copy of a page. They tell search engines which page needs to be indexed in case there are multiple pages with duplicate or similar content.
A canonical URL tag is placed in the <head> section of a page's code and points to the “canonical” version.
It looks like this:
<link rel="canonical" href="https://www.domain.com/the-canonical-version-of-a-page/" />
A common canonicalization issue is a page with either no canonical tag or multiple canonical tags. Or, of course, a broken canonical tag.
The Site Audit tool can detect all of these issues. To see only the canonicalization issues, go to “Issues” and select the “Canonicalization” category in the top filter.

Common canonical tag issues include:
- AMPs with no canonical tag: If you have both AMP and non-AMP versions of a page, missing canonical tags can lead to duplicate content issues. This confuses search engines about which version to show in the results.
- No redirect or canonical to HTTPS homepage from HTTP version: When you have both HTTP and HTTPS versions of your homepage without proper direction, search engines struggle to know which one to prioritize. This can split your SEO efforts and hurt your rankings.
- Pages with a broken canonical link: If your canonical tag points to a non-existent page, you're wasting the crawl budget and confusing search engines.
- Pages with multiple canonical URLs: Having more than one canonical tag on a page gives conflicting directions. Search engines might ignore all of them or pick the wrong one, potentially hurting your SEO results.
Hreflang Attribute Issues
The hreflang attribute denotes the target region and language of a page. It helps search engines serve the correct variation of a page based on the user’s location and language preferences.
If your site needs to reach audiences in more than one country, use hreflang attributes in <link> tags.
Like this:
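Here's a sketch for a page with English and French versions plus a default (the URLs are placeholders):

<link rel="alternate" hreflang="en" href="https://domain.com/en/" />
<link rel="alternate" hreflang="fr" href="https://domain.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/" />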

To audit your hreflang annotations, go to the “International SEO” thematic report in Site Audit.

You’ll see a comprehensive overview of the hreflang issues on your site:

And a detailed list of pages with missing hreflang attributes, out of the total number of language versions your site has.

Common hreflang issues include:
- Pages with no hreflang and lang attributes: Without these, search engines can't determine the language of your content or which version to show users.
- Hreflang conflicts within page source code: Contradictory hreflang information confuses search engines. This can lead to the wrong language version appearing in search results.
- Issues with hreflang values: Incorrect country or language codes in your hreflang attributes prevent search engines from properly identifying the target audience for your content. This can lead to your pages being shown to the wrong users.
- Incorrect hreflang links: Broken or redirecting hreflang links make it hard for search engines to understand your site's language structure. This can result in inefficient crawling and improper indexing of your multilingual content.
- Pages with hreflang language mismatch: When your hreflang tag doesn't match the actual language of the page, it's like false advertising. Users might land on pages they can't understand.
Fixing these issues helps ensure that your international audience sees the correct content in search results. This improves user experience and potentially boosts your international SEO ROI.
JavaScript Issues
JavaScript is a programming language used to create interactive elements on a page.
Search engines like Google use JavaScript files to render the page. If Google can’t get the files to render, it won’t index the page properly.
The Site Audit tool detects broken JavaScript files and flags the affected pages.

It can also show other JavaScript-related issues on your website. Including:
- Unminified JavaScript and CSS files: These files contain unnecessary code like comments and extra spaces. Minification removes this excess, reducing file size without changing functionality. Smaller files load faster.
- Uncompressed JavaScript and CSS files: Even after minification, these files can be compressed further. Compression reduces file size, making them quicker to download.
- Large total size of JavaScript and CSS: If your combined JS and CSS files exceed 2 MB after minification and compression, they can still slow down your page. This large size leads to poor UX and potentially lower search rankings.
- Uncached JavaScript and CSS files: Without caching, browsers must download these files every time a user visits your site. This increases load time and data usage for your visitors.
- Too many JavaScript and CSS files: Using more than 100 files increases the number of server requests, slowing down your page load time
- Broken external JavaScript and CSS files: When files hosted on other sites don't work, it can cause errors on your pages. This affects both user experience and search engine indexing.
Addressing these issues can improve your site's performance, user experience, and search engine visibility.
To check how Google renders a page that uses JavaScript, go to Google Search Console and use the “URL Inspection Tool.”
Enter your URL into the top search bar and hit enter.

Then, test the live version of the page by clicking “Test Live URL” in the top-right corner. The test may take a minute or two.
Now, you can see a screenshot of the page exactly as Google renders it. Use it to check whether the search engine is reading the code correctly.
Just click the “View Tested Page” link and then the “Screenshot” tab.

Check for discrepancies and missing content to find out if anything is blocked, has an error, or times out.
Our JavaScript SEO guide can help you diagnose and fix JavaScript-specific problems.
Structured Data Issues
Structured data is data organized in a specific code format (markup) that provides search engines with additional information about your content.
One of the most popular shared collections of markup language among web developers is Schema.org.
Schema helps search engines index and categorize pages correctly. And helps you capture SERP features (also known as rich results).
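For example, a minimal JSON-LD sketch of Schema.org Article markup might look like this (all values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Perform a Technical SEO Audit",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15"
}
</script>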
SERP features are special types of search results that stand out from the rest of the results due to their different formats. Examples include the following:
- Featured snippets
- Reviews
- FAQs

Use Google’s Rich Results Test tool to check whether your page is eligible for rich results.

Enter your URL to see all structured data items detected on your page.
For example, this blog post uses “Articles” and “Breadcrumbs” structured data.

The tool will list any issues next to specific structured data items, along with links on how to address them.
Or use the “Markup” thematic report in the Site Audit tool to identify structured data issues.
Just click “View details” in the “Markup” box in your audit overview.

The report will provide an overview of all the structured data types your site uses. And a list of any invalid items.

Invalid structured data occurs when your markup doesn't follow Google's guidelines. This can prevent your content from appearing in rich results.
Click on any item to see the pages affected.

Once you identify the pages with invalid structured data, use a validation tool like Google's Rich Results Test to fix any errors.
Further reading: Learn more about the “Markup” report and how to generate schema markup for your pages.
8. Check for and Fix HTTPS Issues
Your website should use the HTTPS protocol (as opposed to HTTP, which is not encrypted).
This means your site runs on a secure server using an SSL certificate from a third-party vendor.
It confirms the site is legitimate and builds trust with users by showing a padlock next to the URL in the web browser:

HTTPS is a confirmed Google ranking signal.
Implementing HTTPS is not difficult. But it can bring about some issues. Here's how to address HTTPS issues during your technical SEO audit:
Open the “HTTPS” report in the Site Audit overview:

Here, you'll find a list of all issues connected to HTTPS. And advice on how to fix them.

Common issues include:
- Expired certificate: Your security certificate needs to be renewed
- Old security protocol version: Your website is running an old SSL or TLS (Transport Layer Security) protocol
- No server name indication: Lets you know if your server supports SNI (Server Name Indication). Which allows you to host multiple certificates at the same IP address to improve security
- Mixed content: Determines if your site contains any unsecure content, which can trigger a “not secure” warning in browsers (see the example below)
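As an example of that last issue, mixed content appears when a secure page loads any resource over plain HTTP (the image URL is a placeholder):

<!-- On an HTTPS page, this triggers a mixed content warning: -->
<img src="http://domain.com/images/logo.png">
<!-- Fix: serve the resource over HTTPS -->
<img src="https://domain.com/images/logo.png">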
9. Find and Fix Problematic Status Codes
HTTP status codes indicate a website server’s response to the browser's request to load a page.
1XX statuses are informational. And 2XX statuses report a successful request. Don’t worry about these.
Let’s review the other three categories (3XX, 4XX, and 5XX statuses) and how to deal with them.
Open the “Issues” tab in Site Audit and select the “HTTP Status” category in the top filter.

You'll see all the HTTP status issues and warnings.
Click a specific issue to see the affected pages.
3XX Status Codes
3XX status codes indicate redirects: instances when users and search engine crawlers land on a page but are redirected to a new page.
Pages with 3XX status codes are not always problematic. However, you should always make sure they are used correctly to avoid any potential problems.
The Site Audit tool will detect all your redirects and flag any related issues.
The two most common redirect issues are as follows:
- Redirect chains: When multiple redirects exist between the original and final URL
- Redirect loops: When the original URL redirects to a second URL that redirects back to the original
Audit your redirects and follow the instructions provided within Site Audit to fix any errors.
4XX Status Codes
4XX errors indicate that a requested page can’t be accessed. The most common 4XX error is the 404 error: Page not found.
If Site Audit finds pages with a 4XX status, remove all the internal links pointing to those pages.
First, open the specific issue by clicking on the corresponding number of pages with errors.

You'll see a list of all affected URLs.

Click “View broken links” in each line to see the internal links that point to the 4XX pages listed in the report.
Remove the internal links pointing to the 4XX pages. Or replace them with relevant alternatives.
5XX Status Codes
5XX errors happen on the server side. They indicate that the server could not perform the request. These errors can happen for many reasons.
Such as:
- The server being temporarily down or unavailable
- Incorrect server configuration
- Server overload
Investigate why these errors occurred and fix them if possible. Check your server logs, review recent changes to your server configuration, and monitor your server's performance metrics.
10. Perform Log File Analysis
Your website’s log file records information about every user and bot that visits your site.
Log file analysis helps you look at your website from a web crawler's point of view. To understand what happens when a search engine crawls your site.
It’s impractical to analyze the log file manually. Instead, use Semrush’s Log File Analyzer.
You’ll need a copy of your access log file to begin your analysis. Access it via your server’s file manager in the control panel or via an FTP (File Transfer Protocol) client.
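For context, a single Googlebot request in an Apache-style access log looks something like this (the IP, path, and timestamp are made up):

66.249.66.1 - - [01/Oct/2024:12:34:56 +0000] "GET /blog/technical-seo/ HTTP/1.1" 200 15230 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"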
Then, upload the file to the tool and start the analysis. The tool will analyze Googlebot activity on your site and provide a report. That looks like this:

It can help you answer several questions about your website, including:
- Are errors preventing my website from being crawled fully?
- Which pages are crawled the most?
- Which pages are not being crawled?
- Do structural issues affect the accessibility of any pages?
- How efficiently is my crawl budget being spent?
These answers fuel your SEO strategy and help you resolve issues with the indexing or crawling of your webpages.
For example, if Log File Analyzer identifies errors that prevent Googlebot from fully crawling your website, you or a developer can work to resolve them.
To learn more about the tool, read our Log File Analyzer guide.
Boost Your Website’s Rankings with a Technical SEO Audit
A thorough technical SEO audit can positively affect your website's organic search rankings.
Now that you know how to conduct a technical SEO audit, all you have to do is get started.
Use our Site Audit tool to identify and fix issues. And watch your performance improve over time.
This post was updated in 2024. Excerpts from the original article by A.J. Ghergich may remain.