Google Indexing and Related Issues



When a new site is launched, the most effective way to attract visitors to it is through the Google search engine. For this to work, Google needs to index the site. Indexing is normally a time-consuming process, but it can be sped up by keeping your content relevant and in line with the standards set by Google.

Listed below are some of the most common issues related to Google indexing. This list will help you manage your website more effectively:

(1) Google has yet to discover the site

This issue is common with new websites. Normally, it takes a few days for Google to index a site. If it is taking longer than that, make sure that the sitemap is uploaded and working properly. If you have not created or submitted a sitemap, you can request that Google fetch and crawl your site.

The following are some instructions provided by Google:

=> On the Webmaster Tools Home page – click the site you want.

=> On the Dashboard, under Crawl – click Fetch as Google.

=> In the text box, type the path to the page you want to check.

=> In the dropdown list – select Desktop.

=> Click Fetch. Google will fetch the URL you requested. It may take a few minutes for the Fetch status to be updated.

=> Once you see the Fetch status as “Successful” – click Submit to Index, and then do one of the following:

=> To submit the individual URL to Google’s index, select URL and click Submit. You can submit up to 500 URLs a week in this way.

=> To submit the URL and all pages linked from it, select URL and all linked pages and click Submit.
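
Apart from the Fetch as Google tool, Google has also supported a simple sitemap “ping”: requesting the URL below tells Google to re-read your sitemap. This is a sketch, assuming your sitemap lives at http://www.example.com/sitemap.xml; substitute your own (URL-encoded) sitemap address:

http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml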

(2) Absence of sitemap.xml

Sitemap.xml contains the list of URLs that Google should follow when indexing the site. If you encounter an indexation issue on any portion of your site, the best solution is to revise and resubmit your sitemap.xml.

The following links will help you understand Google’s sitemap policy and how to create a sitemap.xml:

Google’s Sitemap Policy :  https://support.google.com/webmasters/answer/156184

Sitemap.xml Generator :  https://www.xml-sitemaps.com/
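
For reference, a minimal sitemap.xml looks like the sketch below, assuming the site lives at http://www.example.com; list your own URLs and update the lastmod dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
</urlset>

You can also point crawlers to the sitemap by adding the line Sitemap: http://www.example.com/sitemap.xml to your robots.txt file.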

(3) Crawl errors

In some cases, Google fails to index certain pages because it cannot crawl them: it can see the pages, but it cannot read them. Crawl errors can be identified via Google Webmaster Tools by following these steps:

Google Webmaster Tools → Select your site → Click ‘Crawl’ → Click ‘Crawl Errors’. If any pages could not be indexed because of errors, you will see them in the list of the top 1,000 pages with errors.
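
One of the most common crawl errors is a 404 on a page that has moved. A typical fix is a 301 redirect from the old address to the new one; the sketch below assumes an Apache server with mod_alias enabled, and the paths /old-page.html and /new-page.html are hypothetical:

# .htaccess: permanently redirect a moved page so crawlers stop hitting the 404
Redirect 301 /old-page.html http://www.example.com/new-page.html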

(4) The site or pages are blocked with robots.txt

Some pages of your site, or the whole site, may be blocked by the robots.txt file. This can be fixed by checking your website’s robots.txt file and removing the entries that are blocking your site. Once that is done, the site will reappear in the index.
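
For example, a robots.txt like the sketch below (assuming it sits at the root of your domain) blocks every crawler from the entire site:

User-agent: *
Disallow: /

To unblock the site, remove the Disallow rule or leave its value empty; to block only a specific directory, name it explicitly, e.g. Disallow: /private/.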

(5) Site indexed under the www or non-www version only

Make sure that both versions of your site’s address (e.g. http://example.com and http://www.example.com) are added to your Google Webmaster Tools account so that both can be indexed. Be sure to set your preferred domain, but verify ownership of both.
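
On the server side, it also helps to redirect one version to the other so that only the preferred domain accumulates rankings. The sketch below assumes an Apache server with mod_rewrite enabled and www.example.com as the preferred domain:

# .htaccess: send non-www requests to the www version with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]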

(6) Duplicate Content

The search engine may get confused if your site has too much duplicate content, and this affects the indexing of the site. The duplication issue appears when your site returns the same content for multiple URLs. It can be fixed by choosing one URL as the genuine page and setting 301 redirects from the other URLs to it.
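
Where a redirect is not practical (for example, when URL parameters produce the duplicates), a rel=canonical tag pointing at the genuine page tells Google which URL to index. A sketch, assuming http://www.example.com/page/ is the preferred URL, placed in the <head> of each duplicate:

<link rel="canonical" href="http://www.example.com/page/" />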

(7) .htaccess may be blocking your site

The .htaccess file is a highly useful configuration file for Apache servers, but its rules can also block crawlers and prevent indexation. If pages are missing from the index, check your .htaccess for rules that deny access to search engine bots.
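
For instance, rules like the sketch below (assuming mod_rewrite is enabled) return a 403 Forbidden to Googlebot and keep the site out of the index; if you find something similar, remove it:

# .htaccess: these rules BLOCK Googlebot -- remove them to allow indexing
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule .* - [F,L]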

(8) Website Has NOINDEX in the Meta Tag

If you have pages that haven’t been indexed, check whether the following meta tag is present in their source code. If it is, remove this line of code and the pages should be back in the index in no time.

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
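
The same directive can also be sent as an HTTP header, so it is worth checking the server configuration too. The sketch below is the Apache equivalent, assuming mod_headers is enabled; a line like this in .htaccess keeps pages out of the index just as effectively:

# .htaccess: this header BLOCKS indexing -- remove it if pages should be indexed
Header set X-Robots-Tag "noindex, nofollow"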

(9) AJAX/JavaScript Issues

Unlike plain HTML sites, sites developed using AJAX/JavaScript are not easily indexable. If the AJAX pages are configured incorrectly or the JavaScript fails to execute, Google will find it difficult to index those pages properly, so make sure important content and links are also available as plain HTML.
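
One common safeguard is to give every JavaScript-loaded page a real URL in the link’s href, so crawlers that do not execute your script can still reach the content. A sketch, where /products/widget.html and the loadPage() function are hypothetical names:

<a href="/products/widget.html" onclick="loadPage('/products/widget.html'); return false;">Widget</a>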

(10) Long loading time

If the site takes too much time to load, this can affect its ranking. It happens when crawlers encounter interminable load times and give up before they can index the site.
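
One common speed-up is enabling compression on the server so that pages download (and crawl) faster. The sketch below assumes an Apache server with mod_deflate available:

# .htaccess: compress text resources to cut transfer time
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>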

(11) De-indexed or penalized by Google

Being penalized by Google affects the indexation of the website, and there are many reasons why a site may attract a Google penalty. If you don’t deal with the issue in the right manner, the site may be removed from Google entirely.

CONCLUSION:

Indexation is key when it comes to gauging the success of SEO. If your site, or certain pages of it, aren’t indexed, you need to figure out why. We hope this blog will help you fix most of the issues related to Google indexing. All the troubleshooting steps above are easy to follow; most of the mistakes are minute, but they result in the larger problem of less traffic to the website.
