
A common issue for owners of new websites, and a worry even for some owners of older sites, is that their internal pages, and sometimes even their homepage, are not being indexed by search engines. Why might your pages not be getting indexed?

Search engine spiders travel around the web and index as many of the billions of websites and web pages as they can. But they have priorities. Websites with high link popularity will be crawled more frequently, while new sites may take days, weeks or even months before they are picked up by spiders and indexed.

What can I do to speed up this process?

Links should be your main priority for getting your site picked up by search engines. The more links you have from pages on other sites that are already indexed, the sooner spiders will be alerted to your website, and the less time it will take for your site to be included in a search engine's index.

My homepage is indexed but nothing else…

In this situation, search engines have found your homepage but are not picking up all of your pages, either because you do not have enough link popularity or because your internal linking has made it too difficult for spiders to find the rest of the content on your site. If search engines have reached your content but not indexed it, then you lack the link popularity for them to consider indexing you. This generally happens for competitive search phrases; you can still get indexed for less-competitive terms even if your links are weak.

My site used to be indexed but not anymore…

In this situation, you may have upset a search engine, received a penalty and been kicked out of its index. Artificially building links? Linking to a site in a bad neighbourhood? Duplicate content? If you know you’ve done anything you shouldn’t have, then you’re paying the price by not being indexed.

One More Common Mistake

People commonly add pages to their robots.txt files by mistake. This is a sure-fire way to prevent your pages from being indexed, and those who do so often don’t check their file, or even know what they’re doing. Only disallow pages in your robots.txt file that you don’t want indexed! It is easy to remove pages from this file, and you will suffer no penalties for any pages that have been removed from there.
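For illustration, a typical robots.txt looks something like the sketch below (the paths here are hypothetical). Each Disallow line tells spiders to stay away from that path, so check that nothing you want indexed appears there:

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/

Note that a single line reading "Disallow: /" blocks your entire site, which is one of the most common accidental causes of pages dropping out of the index.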

Google Sitemaps

Some people may tell you to use Google Sitemaps to fix the problem. I wouldn’t recommend doing so, as you’re not fixing the underlying problem that is preventing your site from being indexed. You are better off following the above advice to solve your indexing issues.
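For reference, a Google Sitemap is simply an XML file listing the URLs you want crawled, following the sitemaps.org protocol. A minimal example (with a placeholder URL) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
    </urlset>

Submitting one can get your URLs in front of Google faster, but it won’t compensate for the link popularity or crawlability problems described above.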