
Having worked in the SEO industry for several years, it's safe to say I've conducted my fair share of SEO audits for clients' websites. From small websites with fewer than a hundred pages to blue-chip organisations' websites with several thousand pages, I've seen it all, and remained a "white hat" SEO all the way!

Over the years I've compiled a comprehensive checklist that covers the crucial aspects of a technical SEO audit. Different SEOs will plan a technical audit differently, and often a full audit of a particular site isn't needed because it already gets some areas right.

One of the main things people get wrong is the accessibility of their website's content. If Google can't get to your content, chances are users can't either, so you'll take a double hit of lost visits and lost conversions.

Whether it's your own site or a client's, make sure it's connected to a Google Webmaster Tools account. If you or your client doesn't have Google Webmaster Tools set up, well, that's the first issue to resolve.

Below are the key areas of a website I look at when I conduct a website accessibility audit.

 

#1 Error Handling

By logging in to your Google Webmaster Tools account you can find out whether Google's crawlers have come across any mistyped URLs or pages that no longer exist and return a 'not found' (404) or soft 404 error, whether your server is rejecting crawlers' access to some pages and returning 500 server errors, or whether there is any other type of accessibility issue.

If Google has reported any issues while accessing your site, do not ignore them! Address them as soon as possible by putting effective redirects in place for all the broken links and investigating the cause of any server errors.

 

#2 Robots.txt

Robots.txt is a text file that is normally positioned at the root of a website. Review the robots.txt file of your site to check whether it is blocking search engine crawlers’ access to important pages.

It is also crucial not to expose secure pages and directories, such as the shopping basket or login page, to search engine crawlers. These pages carry no extra value for search users, so letting them be crawled simply populates the search engines' indexes with junk pages.
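As an illustration, a robots.txt along these lines (the paths are placeholders, so adjust them to your own site) keeps crawlers out of the basket and login areas while pointing them at the XML sitemap:

User-agent: *
Disallow: /basket/
Disallow: /login/

Sitemap: http://www.example.com/sitemap.xml

Bear in mind that robots.txt controls crawling, not indexing; a disallowed URL can still appear in the results if other sites link to it.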

 

#3 Robots Meta Tag

Search engine crawlers can be further guided on a page level using Robots Meta tags.

Customise the robots meta tag for the different areas of the website where pages should not be indexed, or where their links should not be followed.

I recommend using a robots meta tag if you have the odd page in otherwise indexed folders that you want to block; but generally, if most of your non-indexed content sits in one or more folders, use robots.txt to block the entire folder. An example of a robots meta tag is:

<meta name="robots" content="noindex, nofollow" />

 

#4 Nofollow, Noindex Tag

Use nofollow on links to pages that don't hold much value, like review form pages. A "nofollow" attribute prevents PageRank from being passed through that particular link to the linked-to page.

You can also use noindex on these same pages. Use of the noindex tag prevents a search engine robot from indexing a particular page on a site.
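For example, a link to a hypothetical review form could be nofollowed, and the form page itself could carry its own noindex tag (the URL is a placeholder):

<a href="/product-A/write-a-review" rel="nofollow">Write a review</a>
<meta name="robots" content="noindex, follow" />

The "noindex, follow" combination keeps the page out of the index while still letting crawlers follow any links on it.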

 

#5 XML Sitemap

An XML sitemap is a file that carries a complete map of all the URLs that exist on a website. It helps search engine crawlers index the website's pages more efficiently, as it can contain useful information such as when each page was last modified, how often it is expected to change and what priority it has relative to the other pages.

Ensure that an XML sitemap is present at the root of the website and submitted to Google and Bing Webmaster Tools to help them crawl your website. XML sitemaps are even more useful, and a 'must have', when the website is large.
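A minimal sitemap entry looks something like the snippet below (the URL, date, frequency and priority are placeholder values):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category/product-A</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Very large sites can split their URLs across several sitemap files and reference them all from a single sitemap index file.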

 

#6 Latency (page load speed)

There can be latency issues if the HTML code is not efficient enough, or if the pages are not compressed. Efficient code and compression will allow the website to load faster.

Test your website's loading speed using tools like gtmetrix.com and Google's PageSpeed Insights. These are free to use and not only give you a list of the page speed issues on your website but also possible solutions for resolving them.
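If your pages aren't being compressed, enabling gzip on the server is usually a quick win. As a sketch, on an Apache server with mod_deflate enabled you could add something like the following to the configuration or .htaccess file (the exact setup depends on your server and host, so re-test with the tools above afterwards):

AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript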

 

#7 Content Duplication

One of the fundamental causes of poor organic search performance is the presence of duplicate content on a website. Duplication occurs when pages present the same content to search engines via multiple URLs. Any difference between one URL and another (even a seemingly minor difference such as a capitalised letter) will cause search engines to treat them as separate pages.

There are many types of duplication issues that can exist in a site and I’ve listed them as separate points below to look out for and address:

 

#8 Duplication: Click and session tracking

Websites use sessions to track customer behaviour on the site, particularly ecommerce websites that store visitors' shopping cart activity so that when visitors return they can see the items they previously added to the cart. Some websites do this by putting session IDs in the URL. Every internal link on the website then gets that session ID appended to it, and because the session ID is unique to each session, this creates new URLs, and thus duplicate content.

Avoid creating duplication by appending parameters to URLs to track clicks or maintain sessions. Use cookies to store sessions where possible; if that is not possible, make sure the parameterised URLs are redirected or canonicalised to their non-session-ID versions.
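For instance, if the same product page is also reachable with a session parameter appended, a canonical tag in the head of that page can point search engines back to the clean URL (both URLs are placeholders):

Duplicate URL: http://www.example.com/product-A?sessionid=12345
Tag in its head section: <link rel="canonical" href="http://www.example.com/product-A" />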

 

#9 Duplication: Architecture

Duplication can occur when a product is placed into more than one category, and the architecture of the website is not correctly structured for multiple category products.

 

#10 Duplication: HTTP vs. HTTPS

A website can be duplicated through secure HTTPS sessions; HTTPS pages are often used for payment transactions, logins and shopping baskets to provide an encrypted, secure connection. Make sure HTTPS is used only on the pages that require this security protocol, and use the canonical tag on any HTTPS duplicates to refer to the non-HTTPS version of the URL.

 

#11 Duplication: Capitalisation

Capitalised URLs can create duplication issues if the web server returns the same content for both versions. They should be redirected to the correct (usually lowercase) version of the URL.

 

#12 Redirects: 301

301 redirects are to be used when a page becomes irrelevant, for example a discontinued product being permanently redirected to its category page or to a new, similar product. A 301 redirect passes link credibility (link juice) on to the destination URL.
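On an Apache server, for example, a discontinued product could be permanently redirected to its category page with a single line in the .htaccess file (the paths are placeholders):

Redirect 301 /category/discontinued-product http://www.example.com/category/

Other platforms (IIS, nginx or your CMS) have their own equivalents, so use whichever method your site actually supports.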

 

#13 Redirects: 302

302 redirects are only to be used for temporary moves. A 302 redirect doesn't pass any link credibility to the destination URL.

 

#14 Redirects: Chained Redirects

I often come across websites with chained redirects, where page A is redirected to page B and page B is then redirected to page C. Avoid these redirect chains to ensure that maximum link credibility passes from the original URL to the final destination.
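The fix is to point every old URL straight at its final destination. Sketched in .htaccess terms (placeholder paths), instead of chaining A to B and B to C, both old URLs should redirect directly to C:

Redirect 301 /page-A http://www.example.com/page-C
Redirect 301 /page-B http://www.example.com/page-C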

 

#15 Canonical Tag

If there is duplicate content on the website that cannot be redirected, use the rel=canonical tag on pages that require it. 
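For example, if the product from point #9 is reachable under two category paths, the head section of the secondary version can declare the preferred URL (the URL is a placeholder):

<link rel="canonical" href="http://www.example.com/category/product-A" />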

 

#16 Pagination

Pagination occurs when a website splits its content across multiple pages, for example an ecommerce site spreading a product or category listing over several pages.

Ensure pagination has been implemented correctly so that it is not causing any duplication. Google recommends implementing rel="next" and rel="prev" for pagination.
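As an example, page 2 of a paginated category (placeholder URLs) would reference its neighbours in the head section like this:

<link rel="prev" href="http://www.example.com/category?page=1" />
<link rel="next" href="http://www.example.com/category?page=3" />

The first page only needs rel="next" and the last page only rel="prev".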

 

#17 URL Structure

To get the best search and usability performance from your website, keep its structure sound: avoid superfluous folders and keep the architecture as flat as possible. When it comes to subfolders, search engines assume that content living in folders far away from the root domain is less important.

My suggestion would be to limit folder depth to three levels, and to keep all the important content within three clicks of the home page.

 

For example:

www.example.com/category/product-A

www.example.com/category/sub-category/sub-sub-category/product-B

In the example above, product A will get much more credibility than product B.

 

#18 HTML Validation

Websites should be cleaned up and have correct (X)HTML formatting applied according to the recommended standards. To check whether your website is up to standard, you can use the free tool at validator.w3.org.

 

#19 Links: Nofollow

Use the rel="nofollow" attribute when a link is not to be considered valuable, e.g. links in comments, so that the page's link strength is not diluted.

 

#20 Links: JavaScript

Although Google can now read most JavaScript, I would still suggest avoiding JavaScript-driven links, as they are not as easily followed as plain HTML links.
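To illustrate the difference, the first link below relies on JavaScript to navigate (the function name is a made-up placeholder), while the second is a plain HTML link that any crawler can follow:

<a href="#" onclick="openCategory('shoes'); return false;">Shoes</a>
<a href="/category/shoes">Shoes</a>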

 

#21 Links: Image links

Search engines can't see images, so it is important to use the alt attribute to provide descriptive text about each image. When an image acts as a link, write alt text that describes the linked-to page with relevant keywords.
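For example, an image link to a hypothetical category page could describe its destination through the alt attribute like this:

<a href="/category/running-shoes"><img src="/images/running-shoes.jpg" alt="Men's running shoes" /></a>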

 

#22 Flash, JavaScript and Frames

Ensure that the website doesn't make too much use of Flash and JavaScript. Crawlers find them difficult to read and understand, which prevents them from crawling the website effectively. Similarly, iFrames should be avoided.

 

#23 Customised 404 Page

The 404 (page not found) page often gets ignored by webmasters. They don't realise that a good 404 page is almost as important as having great content. It may not be your fault that visitors land on error pages, but communicating the error professionally gives you a second chance to re-engage a visitor. That is why it is recommended to have a branded 404 page that not only links back to the homepage but also offers other engaging features, like a search option.
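Also check that the custom page still returns a 404 status code rather than a 200, otherwise you'll create the soft 404 errors mentioned in point #1. On an Apache server, for instance, a branded error page (placeholder path) can be wired up with a single directive:

ErrorDocument 404 /error-404.html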

 

#24 Structured Data (Rich Snippet)

Structured data allows search engines to recognise specific types of content on your pages, such as events, products and reviews, and to present that information as rich snippets in their results. Make the most of structured data for improved click-through rates in the SERPs.

Structured Data or rich snippets can be used on the following elements of a website:

Event, Person, Place, LocalBusiness, Product, Offer and Review etc.
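As a brief illustration, a product and its offer could be marked up with schema.org microdata along these lines (the name, price and currency are placeholder values):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Product A</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="GBP" />
  </div>
</div>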

My colleague recently wrote an article about how you can use rich snippets to get visits. They make your listings in the search results look much more enticing and can actually poach clicks from the number one spot!

Google Webmaster Tools has introduced a new feature called 'Data Highlighter' (under the Optimisation tab), which allows you to set up structured data for events on your site without any prior coding knowledge.

At Receptional, our core concern is to enhance the performance of your site in search. We’ve helped many clients improve the structure of their sites and conversion rates, so get in touch with us and we’ll start helping you today!