Google And Bing In 2017: 5 Things You Didn’t Know You Needed To Optimise For
We’ve covered the basics of SEO many times on the Receptional blog. But today, we would like to let you in on five advanced features that will tip the scales in your favour in 2017.
Optimise your parameters
Parameters are appended to a URL after a question mark (?) and are typically used to change page content without creating a new URL. Web developers love parameters, and most ecommerce platforms rely heavily on them to refine product specifications. For instance, a single product might come in several colour variations, each encoded as a parameter and a value: ?colour=blue, where colour is the parameter and blue is the value.
What you probably didn’t know is that each parameter can be treated in a number of different ways by search engines.
Parameters, by default, can be:
- Crawled, treated as unique URLs and indexed accordingly
- Crawled and not indexed
- Crawled and treated as duplicates of other URLs (often still indexed)
- Not crawled and ignored completely
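Beyond the webmaster tools described below, one common way to handle the third case – parameter URLs that are really duplicates – is a rel="canonical" link pointing at the parameter-free version of the page. A minimal sketch (the domain and product path are hypothetical):

```html
<!-- Served on https://www.example.com/products/t-shirt?colour=blue -->
<!-- Tells search engines to consolidate signals on the parameter-free URL -->
<link rel="canonical" href="https://www.example.com/products/t-shirt" />
```

This goes in the `<head>` of every parameterised variation, so search engines treat them all as one page.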
Unfortunately, search engines aren’t always good at working out how to treat your parameters and can make serious errors of judgement, potentially working against your ranking strategy. What’s more, they can waste the crawl resource they allow your website on URLs that never affect the index anyway, pulling attention away from your most important pages.
Luckily, both Google and Bing have given us tools to tell them how our parameters should be handled.
How To Optimise Parameters in Google
- Log in to Search Console (formerly Webmaster Tools)
- Expand the ’Crawl’ tab in the left-hand navigation
- Click on URL Parameters
- Accept responsibility for what you’re about to do by clicking ’I understand’ in the dialogue box
- You’ll now have a list of parameters that Google has identified
- Clicking on ’Edit’ will give you options for defining how Google will treat each one.
How To Optimise Parameters in Bing
- Log in to Webmaster Tools
- Click Parameters in the left-hand navigation
- Add any parameters here that you want Bing to ignore (i.e. not crawl).
Bear in mind that permutations of several parameters can generate hundreds of thousands of additional URLs for search engines to deal with, which will sap your crawl allowance.
Optimise Crawl Rate
As we’ve touched upon, search engines allocate an amount of crawl ’resource’ to your site: a number of URLs, relative to your site’s size and importance, that will receive priority for crawling and indexation.
Whilst Google offers very little flexibility in this area, Bing gives you the tools to choose the optimum time for crawling via Crawl Control in Webmaster Tools.
If your website operates a publishing model, set your crawl to happen just after your fresh content is published, so it is discovered quickly. If you’re running an ecommerce site, then an off-peak time is probably easier on your servers.
If you’re feeling really ambitious, then combine this with sitemap optimisation.
Optimise your sitemaps
XML sitemaps are there to aid search engines in content discovery. Most SEO marketers will tell you that the site structure alone should allow for easy content discovery and that a sitemap is not a necessity. Whilst that’s mostly true, a sitemap can actually help to define crawl rates for search engines by prioritising the areas where fresh content is most likely to be discovered, and de-prioritising the pages that rarely change.
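A minimal sketch of what that prioritisation looks like in practice, using the standard sitemap protocol (the domain and paths are hypothetical, and note that search engines treat `changefreq` and `priority` as hints rather than instructions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Fresh content: flag it as changing often -->
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2017-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- Rarely-updated page: de-prioritise it -->
  <url>
    <loc>https://www.example.com/about-us</loc>
    <lastmod>2016-03-01</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```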
In addition to this, your sitemaps are also there to help you monitor indexation. If you break your content up into separate sitemaps by content type, it’s easier to work out whether there are accessibility and/or indexation issues by monitoring each one independently in Google Search Console, which charts pages submitted vs pages indexed for every sitemap you submit.
Optimise your security
Keeping your website free from hacked content and malware is essential for maintaining your traffic.
Leaving vulnerabilities open to exploitation can see your domain flagged with a ’This site may be hacked’ message in search engine results pages, which will put potential customers off clicking. They might even be shown a full-page red warning if they do click through.
Sometimes, Google will let you know in Search Console that you’ve been compromised, and will remove the message once you’ve fixed the holes in your security. But some types of hacked content won’t be picked up immediately, so you’ll need to monitor Analytics, Search Console and the search engine indexes yourself.
Optimise your page load speed
A faster loading website means a better user experience. There are a number of quick-win methods to improve page load on most websites although the practicalities of each are best discussed on a case-by-case basis.
Minify your code
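Minification strips the whitespace and comments that humans need but browsers don’t, so the same rules download in fewer bytes. A small before/after sketch in CSS (the selector is hypothetical):

```css
/* Before minification: readable source */
.header {
    color: #336699;
    margin: 0 auto;  /* centre the block */
}

/* After minification: identical rules, fewer bytes */
.header{color:#369;margin:0 auto}
```

In practice this is done by a build tool rather than by hand, and the same principle applies to JavaScript and HTML.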
Get better at Browser Caching
You can set a longer ’life’ on your cacheable resources so that, for returning visitors, content loads that much faster.
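As a sketch of what a longer cache life looks like, here is an Apache configuration fragment using mod_expires (this assumes an Apache server with the module enabled; other servers, such as nginx, have their own equivalents):

```apacheconf
# Assumes Apache with mod_expires enabled
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change: let browsers cache them for a year
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  # CSS and JavaScript change more often: cache for a month
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```

If you version your filenames when assets change, you can safely set even longer lifetimes.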
Specify image dimensions
Specifying a width and height for your images allows browsers to lay out the page faster. If you’re not doing this, images aren’t placed at the correct size to start with; they’re re-scaled once the browser works out where they’re supposed to fit, which is highly inefficient.
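In HTML this is just two attributes on the img tag (the file path and dimensions below are hypothetical):

```html
<!-- The browser can reserve a 600x400 box before the file has downloaded -->
<img src="/images/product-blue.jpg" alt="Blue t-shirt" width="600" height="400">
```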
Optimise your mobile experience
Page load on mobile becomes even more important in the context of multi-device buying cycles. My closing tip for your business in 2017 is to review your customers’ journey and make sure you’re providing the best mobile user experience possible.
If you need help optimising any of the above, then get in touch with Receptional today.