What elements to analyze in a technical SEO audit

Perhaps you have heard about technical SEO audits but don't know which elements to take into account when carrying one out. Or maybe you don't know what a technical SEO audit is, but are curious to find out. Either way, in this article we explain what a technical SEO audit is and which elements you should analyze when performing one.

What is a technical SEO audit?

A technical SEO audit is an analysis of the SEO elements involved in the ranking of a website. The objective is to check the status of these elements in order to detect errors, areas for improvement and opportunities that allow the project to evolve positively.

The importance of carrying out a technical audit lies in the fact that, without it, it is very hard to improve a website's results. How are you going to improve something if you don't know what's wrong?

Elements to analyze in a technical SEO audit

1. Sitemap.xml – What is it?

The sitemap of a website is an inventory of the URLs we want Googlebot to crawl and index when it visits our domain. In other words, this file tells Google which pages we want crawled and where to find them.
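
As an illustration, a minimal sitemap.xml for a hypothetical domain could look like this (the URLs and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2023-01-15</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/services/</loc>
    </url>
  </urlset>

Each <loc> entry is a URL you want crawled; the optional <lastmod> field tells the crawler when the page last changed.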

Pages that have not been submitted in the sitemap may still get indexed, but there is no guarantee. Conversely, pages we have included but don't actually want Google to crawl are likely to be indexed and shown to users in search results. And this, my friend, can be a big problem.

Therefore, when carrying out a web audit, make sure the sitemap contains exactly those URLs that you want shown to users when they make a query in a search engine.

2. Robots.txt

The robots.txt file tells Googlebot which URLs it should not crawl. This is where you need to make sure that the URLs you don't want Google to access are properly blocked, checking that the Allow and Disallow directives are correctly configured. Bear in mind that blocking crawling does not by itself guarantee that a URL stays out of the index; for that, see the meta robots tag later in this list.
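
As a minimal sketch, a robots.txt that blocks a hypothetical private area while leaving the rest of the site crawlable could look like this:

  User-agent: *
  Disallow: /private/
  Allow: /

  Sitemap: https://www.example.com/sitemap.xml

The Sitemap line is optional, but it helps crawlers find the file discussed in the previous point.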

Also, be careful to avoid contradictions between sitemap.xml and robots.txt: don't list a URL in the sitemap, asking for it to be crawled and indexed, while blocking it in robots.txt, and vice versa.

3. Site architecture

When performing a technical SEO audit, you should check the architecture of the website. Keep in mind that crawlers do not like scattered information or excessive depth; they reward a logical organization of content across the site's folders and content that sits close to the home page in number of clicks. If your website contains many URLs more than 3 clicks away from the home page, you may have a problem.
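
As a rough sketch, a shallow architecture keeps every important page within a few clicks of the home page (the sections are hypothetical):

  Home
  ├── Blog                 (1 click from home)
  │   └── Article          (2 clicks from home)
  └── Services             (1 click from home)
      └── Service detail   (2 clicks from home)

Note that depth is measured in clicks through navigation links, not in the number of folders in the URL.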

4. Indexability 

To check which pages of your website are being shown to users searching on Google, use a tool such as Google Search Console. There you can see whether the indexed pages are the ones that should be indexed.

If a page you expect to be indexed is missing, check whether it contains the noindex tag, since this tells Google that the page should not be indexed. If, on the other hand, your website has more indexed URLs than it should, you may want to review the robots.txt file and the meta robots tags.
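
Besides Search Console, a quick (if approximate) way to eyeball what Google has indexed for a domain is the site: search operator, typed directly into Google:

  site:example.com

The results are only an estimate; Search Console's coverage report remains the reliable source.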

5. Canonical tagging 

One of the things Google penalizes the most is duplicate content across two or more pages. Therefore, when the content of several pages is identical or very similar, you need to use the rel="canonical" tag. Thanks to it, Google knows which URL is the main one and does not penalize the website for duplicate content. When you do a technical SEO audit, you should check that this is in order.
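
The tag goes in the <head> of the duplicate or similar pages and points to the preferred URL. A hypothetical example:

  <link rel="canonical" href="https://www.example.com/main-page/" />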

6. Response codes 

Response codes are the HTTP status codes that indicate the technical state of a page. Among the most common are the following:

  • Error 404: the page cannot be found
  • Code 301: the page has been permanently redirected

It is important that response codes are in good shape. For example, it is common to find 404 errors on sites due to URL changes over time, and these errors need to be corrected for the website to work properly.
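
For example, a correctly redirected URL responds with a 301 status and a Location header pointing to the new address (the URL is a placeholder):

  HTTP/1.1 301 Moved Permanently
  Location: https://www.example.com/new-page/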

In addition, checking these codes is essential after a web migration or after major changes, since a bad redirect can lead to 404 errors and loss of domain authority, which hurts rankings.

7. Meta titles and meta descriptions 

The meta title and meta description make up the text Google shows for a page in its search results. They indicate the topic of a specific page to the user and briefly persuade them to click through to the content. To get the most out of these elements, it is important to keep them within the lengths Google displays.
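
Both elements live in the <head> of the page. A hypothetical example (the limits are approximate, since Google truncates by pixel width, commonly cited as roughly 50-60 characters for titles and 150-160 for descriptions):

  <title>Technical SEO Audit: Elements to Analyze</title>
  <meta name="description" content="Learn which elements to check in a technical SEO audit, from the sitemap and robots.txt to response codes and load speed.">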

Also, even though it is not strictly a technical matter, you should optimize your titles and meta descriptions as much as you can, including persuasive elements that invite readers to visit your page and thus increase CTR.

8. Headings

The H1 heading is the title of a web page within the page itself. It signals the topic of the content to Google, so it is important that it is optimized: it must clearly reflect the theme of the article and stay within a reasonable length.

The rest of the content must follow a hierarchical structure, using H2, H3 and deeper subheadings according to the relevance of each section.
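
In HTML, that hierarchy looks like this (the headings are illustrative, and the indentation is only for readability):

  <h1>What elements to analyze in a technical SEO audit</h1>
    <h2>What is a technical SEO audit?</h2>
    <h2>Elements to analyze</h2>
      <h3>Sitemap.xml</h3>
      <h3>Robots.txt</h3>

As a rule of thumb, use a single H1 per page and avoid skipping levels.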

9. Images

Images carry an alternative text (alt text) attribute that should be filled in with a brief description of the image's content, since Google also ranks images. It is therefore important that all relevant images have such a description.
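
In HTML, the description goes in the alt attribute of the img tag (the file name and text are placeholders):

  <img src="/images/seo-audit-checklist.png" alt="Checklist of elements to review in a technical SEO audit">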

In addition, images need to be optimized so that pages load fast.

10. Meta robots tag

The meta robots tag is used to give Google guidelines on whether or not to index a particular page and whether or not to follow the links it contains.

The most common guidelines are as follows:

  • Index/noindex: tells the crawler whether the page should be indexed or not.
  • Follow/nofollow: tells the robot whether the links on the page should be followed or not.
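
These directives are combined in a single tag placed in the <head> of the page. For example, a page that should not be indexed but whose links should still be followed would carry:

  <meta name="robots" content="noindex, follow">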

When you do a technical SEO audit, don’t forget to check that these tags are correctly implemented.

11. Thin content

Pages with little or poor-quality content can be penalized by Google, since they are usually not interesting to users. Check whether there are indexed pages with thin content and, if so, choose either to deindex them or to expand their content.

12. Load speed

Loading speed is the time it takes for the different elements of your website to be displayed to users. Besides being important for minimizing the bounce rate, it also matters to Google: if the robot fails to get a response from the server within a certain time, the content of the pages may not be indexed.

13. Internal and external linking

Internal linking refers to the links between the pages of the same website. For good rankings, check that there are no orphan pages and that every page receives at least one internal link.

External linking, for its part, is used to indicate to Google what your reference sources are. Having a reasonable number of external links is positive for your website, as long as they point to pages with a certain domain authority.
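
In the HTML itself, the difference is simply where the anchor points (the URLs are placeholders):

  <!-- Internal link: another page on the same domain -->
  <a href="/blog/seo-audit/">our guide to SEO audits</a>

  <!-- External link: a reference source on another domain -->
  <a href="https://developers.google.com/search/docs">Google Search documentation</a>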

14. Responsiveness

For some time now, Google has rewarded websites that offer a version adapted to mobile phones, since otherwise users will have difficulty consulting the content from these devices. Therefore, when you carry out a technical SEO audit, you must ensure that the content is displayed correctly on mobile devices and tablets.
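
A basic prerequisite for a mobile-friendly page is the viewport meta tag in the <head>, which tells the browser to scale the layout to the device width:

  <meta name="viewport" content="width=device-width, initial-scale=1">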