
It’s just over a week until Christmas, snow has settled outside the Receptional offices, and we’ve reached the final instalment of our countdown of the Top Five Most Common SEO Issues as Seen by Receptional in 2011.

For any latecomers, the title will have given it away, but we’ve been counting down the most common SEO problems we’ve encountered over the past year, both on clients’ websites and elsewhere. The previous blog posts are available to peruse at your leisure:

Number Two – Website Structure & Hierarchy

Number Three – Page Titles and Meta Elements

Number Four – Content

Number Five – XML Sitemaps

And here we are – the number one SEO problem we’ve seen in 2011…

Technical Problems

Okay, so it’s not so much a broad stroke as a whole bucket of paint here, but allow me to explain.

I’d planned for today’s blog to concentrate on one particular type of technical issue, page duplication, but a timely video from Matt Cutts, head of Google’s webspam team, prompted me to write about technical issues more broadly.

The video in question was posted on Google’s Webmaster Central channel, and features Mr Cutts answering a question about content vs. SEO best practice:

Google would appear to favour sites that are built by those who know all the SEO ‘best practice’ techniques (and follow your tips!). There must be many technically poor sites that actually have fantastic content – surely these shouldn’t be penalised?

To summarise the video – content trumps website construction. Mess up your website, commit all the SEO cardinal sins you like and Google will still try and get you indexed and ranking if your content is good enough.

It sounds fair enough: Cutts confirms that poorly optimised sites will not receive any manual penalties, and that Google will try to compensate if a website has good-quality content but still contains technical mistakes.

I’ve seen one or two blogs on the back of this video suggesting that businesses may no longer have a need for an SEO expert. A couple of counter-points immediately spring to mind. The first is that a good SEO company (yes, like Receptional) will include content in any SEO campaign. The second, as Cutts himself states later in the video, is that SEO techniques such as making content accessible “absolutely can help”.

Mr Cutts seems to be bigging up Googlebot’s capabilities way too much here.

“If you embed something in some binary that we have to extract, or we have to process some JavaScript to find the links, or work around it if you have ‘untitled’ on all your pages and we have to synthesize or guess a title, we’ll still try to do that.”

Yes, we all know content is important and vital to achieving good rankings (see our blog on content earlier this week), but the operative word Cutts uses is ‘try’. The simple fact of the matter is that Googlebot is not yet so advanced that it will rank a site with perfect content but no links and poor SEO above a site with average content but loads of links and perfect SEO. Good technical SEO is still essential for good website performance.

[Image: “Don’t worry, Google will sort it”]

Cutts stresses that a site will not be penalised for not doing “every single thing right on the SEO checklist”, which is true, but who cares about penalties when a site doesn’t rank for anything anyway?

Categorically, a poorly constructed website with technical problems will not rank, regardless of content.

To jump slightly back on track, technical issues are the most common problem we encounter as an SEO agency. I’ll briefly cover some of the more regular offenders.

Duplication

I wrote about duplicate content earlier this week, but the problem rears its ugly head once again. This time I’m referring to the duplication of pages across a website. This is a very common problem and one that can easily spin out of control when creating content. We’ve seen many websites with multiple variations of their homepage, accessible with and without the www prefix, with and without /index.html appended, and more. E-commerce websites can have similar issues, presenting an almost infinite number of URL combinations through un-canonicalised product filtering (à la Amazon).
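If you want a quick way to spot this on your own site, here’s a minimal Python sketch (not a Receptional tool, and example.com is a placeholder for your own domain) that fetches the common homepage variants and shows where each one ends up. If they don’t all resolve to a single canonical address, you have duplication to fix.

```python
# A quick duplication check: fetch each common homepage variant and see
# where it ends up. "example.com" is a placeholder domain.
from urllib.request import urlopen
from urllib.error import URLError

VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "http://www.example.com/index.html",
]

for url in VARIANTS:
    try:
        response = urlopen(url, timeout=10)
        # geturl() gives the URL after any redirects were followed; if two
        # variants end up at different addresses, the page exists in duplicate.
        print(f"{url} -> {response.geturl()} (HTTP {response.status})")
    except URLError as exc:
        print(f"{url} -> failed: {exc.reason}")
```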

Redirection

URLs often need to be redirected, and they are often redirected badly. Some methods will not pass the full link value from the old page to the new, and redirection can be a complex, highly technical procedure in which mistakes are easily made.
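As an illustration, here’s a minimal Python sketch (assuming the third-party requests library and a placeholder URL) that walks a redirect chain hop by hop, so you can see whether an old URL reaches its destination through a single 301 or takes a detour through 302s and chained hops:

```python
# Trace a redirect chain hop by hop. A single 301 (permanent) hop is the
# ideal; 302s (temporary) or long chains are where link value leaks away.
from urllib.parse import urljoin
import requests  # third-party: pip install requests

def trace_redirects(url, max_hops=10):
    for _ in range(max_hops):
        response = requests.get(url, allow_redirects=False, timeout=10)
        print(f"HTTP {response.status_code}  {url}")
        if response.status_code not in (301, 302, 303, 307, 308):
            return  # reached the final page
        # Location headers may be relative, so resolve against the current URL
        url = urljoin(url, response.headers["Location"])
    print("Too many hops: possible redirect loop")

trace_redirects("http://www.example.com/old-page")  # placeholder URL
```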

Robots exclusion

A website’s robots.txt file is a means for the webmaster to prevent specified web crawlers from accessing parts of the site. Problems arise if the file is created with errors, which can result in areas of a website being blocked unwittingly and therefore having no chance of being indexed.
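Python’s standard library happens to include a robots.txt parser, so a minimal sketch like the one below (with placeholder URLs) can confirm that the pages you care about are still crawlable after any edit to the file:

```python
# Check a live robots.txt against the pages you actually want crawled.
# The domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("http://www.example.com/robots.txt")
parser.read()  # fetch and parse the live file

for path in ["/", "/products/", "/blog/seo-tips"]:
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```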

Hosting

Unreliable website hosting can be a huge problem, causing long periods of downtime, 500 errors and DNS issues. Errors such as these can be taken as a negative signal, which could affect the overall quality assessment of a website.
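Proper monitoring services exist for this, but even a rough do-it-yourself check (a minimal sketch, with a placeholder URL and polling interval) will tell you how often your host is really serving errors:

```python
# A bare-bones uptime check: poll the homepage and log anything unhealthy.
import time
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def check(url):
    try:
        return f"up (HTTP {urlopen(url, timeout=10).status})"
    except HTTPError as exc:
        return f"server error (HTTP {exc.code})"  # e.g. the 500s mentioned above
    except URLError as exc:
        return f"unreachable: {exc.reason}"       # e.g. DNS failures

while True:
    print(time.strftime("%H:%M:%S"), check("http://www.example.com/"))  # placeholder URL
    time.sleep(300)  # every five minutes
```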

Site Speed

Google have taken a very public stance on page speed as a valid ranking factor, which sits strangely alongside Matt Cutts’ assertion that technical problems are not important.

From our point of view, website speed does not appear to have had much in the way of measurable impact at this stage. That said, a quick site is always beneficial for a better user experience and will usually result in greater numbers of page views.

There are many best practice solutions for increasing website speed, including minimising the number of items that must be loaded on a page, ensuring images are optimised, combining scripts and CSS into external files, and using a CDN (Content Delivery Network).
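To see whether any of those changes are actually helping, one simple before-and-after measure is time to first byte. Here’s a minimal Python sketch (placeholder URL again) that times how long the server takes to start responding:

```python
# Measure time to first byte (TTFB), a rough proxy for server response speed.
# Note this timing includes DNS lookup and connection setup as well.
import time
from urllib.request import urlopen

def time_to_first_byte(url):
    start = time.perf_counter()
    response = urlopen(url, timeout=30)
    response.read(1)  # block until the first byte of the body arrives
    return time.perf_counter() - start

print(f"TTFB: {time_to_first_byte('http://www.example.com/'):.3f}s")
```

Run it a few times before and after any optimisation work; the trend matters more than any single reading.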

These types of problems can be a major barrier to achieving good SERP rankings and often need to be corrected before other on-page and off-page SEO can take place. Despite what Matt Cutts says, the need for technical SEO is still great, and it offers a perfect complement to other best practice techniques.

And that’s it, our top five countdown list is complete. To recap, the issues we covered were:

5. XML Sitemaps

4. Content

3. Page Titles & Meta Elements

2. Website Structure

1. Duplication/Technical Issues

Do you agree with the list? Perhaps we missed an issue you’re particularly interested in? Please leave any comments below. Happy holidays from Receptional!