
Microsoft signalled at SES San Jose (August 2007) that its search engine, Live Search, was due to launch a webmaster portal, following Yahoo and Google in offering this improved way for webmasters and search engines to communicate.

Google launched its Webmaster Tools, and before that Google Sitemaps, some time back, but it is really in the last six months that the functionality of these systems has improved dramatically. At their core is the ability for a webmaster to verify that they control their website. With this authentication comes the ability for the engine to give much more information to the webmaster than it could to an unverified user. On the other side of the coin, the webmaster is given extra tools, in particular a sitemap protocol to help the engines spider the site efficiently.

Now, using sitemaps as a protocol to tell engines like Live Search about your pages can be a double-edged sword, as it might hide underlying problems in your code. However, the verification procedure, the resulting reports about your site and the increased trust between the site and the engine have no obvious downside for webmasters.
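For illustration, here is a minimal sketch of how a small sitemap following the sitemaps.org 0.9 protocol (the joint format the major engines accept) might be generated. The URLs, dates and priority values are placeholder assumptions, not real pages.

```python
# Minimal sketch: generate a sitemaps.org 0.9 sitemap for a handful of pages.
# The URLs, lastmod dates and priority values below are placeholder assumptions.
from xml.sax.saxutils import escape

pages = [
    # (URL, last-modified date, relative importance 0.0 - 1.0)
    ("http://www.example.com/", "2007-08-01", "1.0"),
    ("http://www.example.com/services/", "2007-07-15", "0.8"),
    ("http://www.example.com/contact/", "2007-06-30", "0.5"),
]

entries = []
for loc, lastmod, priority in pages:
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"    <priority>{priority}</priority>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

# Write the file to the web root so the engines can fetch it,
# e.g. at http://www.example.com/sitemap.xml
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

Once the file is uploaded to the site's web root, it can be submitted through the engine's webmaster console; the priority values are how a webmaster signals the relative importance of different pages, as mentioned in the list below.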

We haven’t seen the Live Webmaster Portal yet, but the Google version is well advanced. Its tools allow the following:

  • The ability to see what pages were crawled and when
  • Whether Google has found crawl problems
  • Whether Google found dead links
  • The ability to report websites showing signs of spam
  • What keywords your site most often appears for
  • What keywords your site most often gets clicked for
  • How many pages link to a specific page or to your site
  • Whether your site has a penalty, and often the nature of that penalty
  • The chance to tell an engine the relative importance of different pages on the site

The move towards this level of communication is only set to increase, and in the process it is weeding out spam sites, as these sites cannot easily verify themselves. Although it is not difficult for a spammer to verify anonymously, spam sites are generally transient, and the more a spammer communicates with the engines, the larger their footprint.

Receptional has set up and used reports from Webmaster Tools for the vast majority of its clients and will continue to do so. This information also helps Receptional provide useful insights when we run site search audits for existing and new clients.

Website owners can register their interest in the Microsoft version by emailing lswmp@microsoft.com or looking at blogs.msdn.com/livesearch. Receptional consultants will be proactive in this regard for their retained clients.

If you would like an audit of your site, or for us to set up these systems for you, please contact Receptional.

Dixon Jones