


01 Nov

SEO Audit: Most Common Technical Issues and How to Avoid Them

Every time I do an SEO audit for a corporate website, I am surprised by how many of the problems it reveals could have been avoided at the earliest possible stage. Still, that is what makes the analysis interesting. In the article below I share the most common technical mistakes I encounter during the challenging task of SEO auditing.


Problem #1. The website is accessible both with www and without it

This is the first thing I check before getting down to a thorough website analysis. In my practice, almost every website is accessible under both versions: type the domain name into the browser bar with or without www, and the site either keeps the URL you entered or redirects to the other version. When there is no redirect, the two identical versions create duplicate content, which badly affects the website's search results and is treated as a violation of Google's Webmaster Guidelines.

How to fix it

Apply a 301 redirect from the less preferable version (the one with fewer indexed pages and backlinks) to the more preferable one, using an .htaccess file.

For example:

From the www version to the non-www one, or vice versa.
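As a minimal sketch of such a redirect, assuming an Apache server with mod_rewrite enabled and using example.com as a placeholder for your own domain:

```apache
# Illustrative .htaccess fragment — example.com stands in for your domain.
# Permanently (301) redirect the non-www version to the www version:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

To redirect in the other direction, swap the host names in the condition and the rewrite target.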

Problem #2. Robots.txt is missing or incorrect

The second most common and crucial issue I come across is a robots.txt file that is either missing or contains the wrong directives.

One day I even came across a website with a robots.txt file like this:

User-agent: *
Disallow: /

This tells all search engine crawlers NOT to visit any page of the site; worse, an SEO campaign was running for the site at the time. Can you imagine promoting a website that search engines do not even know about? Me neither!

How to fix it

Always choose robots.txt directives carefully, and make sure you allow or disallow only the required folders, files, or file extensions.
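For example, a safer robots.txt disallows only specific folders instead of the whole site (the folder names here are hypothetical, chosen for illustration):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

Everything not explicitly disallowed remains crawlable, so the public pages of the site stay open to search engines.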

Always test your robots.txt file with the testing feature in Google Webmaster Tools to confirm it is written correctly and is not hurting your site's indexation.

Problem #3. Incorrect URL structure and syntax

Unfortunately, web programmers often neither know nor care about the URL structure of the sites they create. Sometimes incorrect URLs are generated automatically by a Content Management System (CMS); either way, they hurt the website's online presence.

Here is the most common example of an illegible URL that I so often come across:

How to fix it

  • Use lower-case characters only;
  • Use hyphens;
  • Do not use underscores or empty spaces;
  • Avoid dynamic or session ID generation;
  • Follow the hierarchy of folders/files.
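The rules above can be sketched as a small slug helper. This is an illustrative snippet, assuming Python; the function name make_slug is hypothetical, not part of any CMS API:

```python
import re

def make_slug(title):
    """Build a readable URL slug: lower-case, hyphen-separated,
    with no underscores, spaces, or other punctuation."""
    slug = title.lower()
    slug = re.sub(r"[_\s]+", "-", slug)     # underscores/spaces -> hyphens
    slug = re.sub(r"[^a-z0-9-]", "", slug)  # drop any other character
    return re.sub(r"-{2,}", "-", slug).strip("-")

print(make_slug("SEO Audit: Common Technical Issues"))
# prints: seo-audit-common-technical-issues
```

A CMS that builds page URLs this way, nested under the right folder hierarchy, avoids the dynamic-parameter and session-ID URLs described above.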

An example of a correct, memorable and readable URL:

Hopefully, by solving all of the problems above you will be able to improve your website's online presence, SERP positions, traffic, and conversion rates.
