Google Is Not Indexing Your Site! Want To Know Why??

If your site is not getting crawled and indexed, you are missing out on a lot. Indexing is the foundation of SEO: Google has to add your pages to its index before they can earn any organic traffic. A site that never gets indexed is effectively lost among millions of other websites, and your content has no audience because it never appears in Google’s search results.

Before you can fix an indexing problem, you need to diagnose it. Below are some of the most common reasons why a site does not get indexed.

  1. Your Site Is Indexed Under a www or Non-www Domain

It is important to note that www is technically a subdomain, which is why http://website.com and http://www.website.com are treated as two different sites. To make sure both versions are covered, add both of them to your Google Webmaster Tools account and verify each one. You can then set a preferred domain so Google knows which version to index.
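
If you want to see how the two variants currently behave, a quick check like the minimal Python sketch below will show whether each version responds and whether one redirects to the other. The domain example.com is a placeholder; substitute your own.

```python
# Check how the www and non-www variants respond, and whether one
# redirects to the other. example.com is a placeholder domain.
import urllib.request

for url in ("http://example.com/", "http://www.example.com/"):
    try:
        with urllib.request.urlopen(url) as resp:
            # urlopen follows redirects, so the final URL tells us
            # where this variant actually ends up.
            print(f"{url} -> {resp.geturl()} (HTTP {resp.status})")
    except Exception as exc:
        print(f"{url} -> error: {exc}")
```

If both variants return content without one redirecting to the other, that is your cue to pick a preferred domain in Webmaster Tools.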

  2. Google Can’t Find Your Site

New websites usually experience this. You can wait a few days, but if Google still does not index your site, check that your sitemap has been uploaded correctly and is working as expected (see the sketch after the steps below). You can also ask Google to fetch and crawl your site directly. Here is what Google suggests:

  • Go to the Webmaster Tools home page and click the site you want.
  • Under Crawl, select Fetch as Google.
  • Provide the path to the page that needs to be checked.
  • Select Desktop from the drop-down list.
  • Click Fetch; Google will fetch the requested URL.
  • Once the fetch status shows “Successful”, click Submit to Index, then choose the indexing type that suits your requirement.
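
Before (or alongside) a manual fetch, it is worth confirming that the sitemap itself is reachable and well-formed. The sketch below is a minimal check in Python; the /sitemap.xml location and the example.com domain are assumptions, so substitute your own values.

```python
# Sanity-check a sitemap: is it reachable, does it parse as XML,
# and how many URLs does it actually list?
# The /sitemap.xml location and domain are assumptions.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.fromstring(resp.read())

urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]
print(f"Sitemap is reachable and lists {len(urls)} URLs, e.g.:")
for u in urls[:5]:
    print("  ", u)
```

If the request fails or the URL count is zero, fix the sitemap first; submitting pages one by one will not get you far.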

  3. Robots.txt Is Blocking Your Site or Webpage

One of the most common issues is a developer or editor leaving the website blocked in robots.txt. No worries, the fix in this case is easy: remove the offending entry (often a blanket Disallow: / rule) from robots.txt, and your website will reappear in the index once Google recrawls it.
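
To confirm whether robots.txt really is the culprit, you can read it the same way a well-behaved crawler would. The sketch below uses Python’s built-in robotparser; the domain and page path are hypothetical placeholders.

```python
# Check whether robots.txt allows Googlebot to crawl a given page.
# Domain and path are placeholders; swap in your own.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.example.com/robots.txt")
rp.read()

page = "http://www.example.com/blog/my-post/"
if rp.can_fetch("Googlebot", page):
    print("Googlebot is allowed to crawl", page)
else:
    print("Googlebot is BLOCKED from", page, "- check your Disallow rules")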

  4. Oops! Your Site Got De-indexed

If this is the case, you are in real trouble.

If your site has been manually penalized and removed from the index, Google will have notified you in Webmaster Tools. If your site has a shady history, it may be on the verge of being penalized, which could be the reason for the de-indexation.

If a penalty is the reason your website got de-indexed, gear up for some extra hard work: clean up whatever caused it and then ask Google to reconsider the site.

  5. Duplicate Content

The golden rule for any website is to stay away from duplicate content. When your site is loaded with duplicate pages and multiple URLs all serve the same content, you confuse the search engines and can push them toward de-indexing those pages. You can fix this by picking one canonical page and 301-redirecting the rest to it, as in the sketch below.
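
Once the redirects are in place, you can spot-check that every duplicate URL answers with a 301 pointing at the page you kept. The Python sketch below uses hypothetical URLs; swap in your own canonical page and its duplicates.

```python
# Verify that each duplicate URL answers with a 301 pointing at the
# canonical page. URLs are placeholders; substitute your own.
import urllib.error
import urllib.request

CANONICAL = "http://www.example.com/blue-widgets/"
DUPLICATES = [
    "http://www.example.com/blue-widgets/?ref=home",
    "http://www.example.com/products/blue-widgets.html",
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from following the redirect,
    # so the 301 surfaces as an HTTPError we can inspect.
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect())

for url in DUPLICATES:
    try:
        resp = opener.open(url)
        print(f"{url}: HTTP {resp.status} - no redirect in place")
    except urllib.error.HTTPError as err:
        target = err.headers.get("Location", "?")
        verdict = "OK" if err.code == 301 and target == CANONICAL else "check this one"
        print(f"{url}: HTTP {err.code} -> {target} ({verdict})")
```

Anything that does not answer with a 301 to the canonical URL is still competing with it in the index.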

Kinex Media is an interactive agency that helps businesses with scalable, beautifully designed websites. If you have any questions about design, SEO, PPC, Magento, or ecommerce solutions, reach out to us here.