Googlebot cannot access your site: your site’s overall robots.txt error rate is 100.0% (Resolved)

robots.txt problem solved

Over the last 24 hours, Googlebot encountered 11 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.

Recommended action

If the site error rate is 100%:

  • Using a web browser, attempt to access your site’s robots.txt file. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
  • If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
  • If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.

If the site error rate is less than 100%:

  • Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
  • The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
  • If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose serving of its robots.txt file is exhibiting one or more of these issues.
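Google’s first suggestion, fetching your robots.txt the way a crawler would, can be sketched in Python. This is a minimal, hypothetical example: it spins up a throwaway local server to stand in for your site (swap in your own domain to test it live), then requests /robots.txt using Googlebot’s real user-agent string, which is what a user-agent-based firewall rule would key on.

```python
import http.server
import threading
import urllib.request

# A stand-in robots.txt, served by a throwaway local server so the
# sketch is self-contained; replace BASE with your own site to test live.
ROBOTS = b"User-agent: *\nDisallow: /private/\n"

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
BASE = f"http://127.0.0.1:{server.server_port}"

# Fetch with Googlebot's published user-agent string; a firewall that
# blocks by user-agent would reject this request but not your browser's.
req = urllib.request.Request(
    f"{BASE}/robots.txt",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    status, body = resp.status, resp.read()

server.shutdown()
print(status)  # 200 here means the Googlebot user-agent is not blocked
print(body.decode())
```

If the same request succeeds in a browser but fails with Googlebot’s user-agent against your real domain, that points at a firewall or server rule rather than the file itself.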

This message actually worried me a bit, and I searched for it all over the net. What I found in the end was that this is something very common and usually resolves by itself. I took a deep breath of relief after reading that.

At first, I thought it could be because my website was showing a lot of 404 errors, as I had recently deleted some pages and moved one part of my website to another domain. But that could never be the problem: changes and missing pages are very common on the internet, which is always changing and upgrading. I have seen more 404s on Google than on my own website.

Several days later, I received the same message from Google again: “Googlebot can’t access your site”. I again took it lightly, thinking it would be resolved automatically after some time.

After a day, I saw that my website’s traffic had started to decrease. The reason this catastrophic result disappointed me so much was that it had taken me a long time to learn Search Engine Optimisation and then apply everything to my website. The next day, I noticed that Google had completely removed my website from its search results.

Now I realised it was time to start working. I read several articles about what could be causing it, and after many days of research I was able to narrow it down to a few possible reasons.

1) The Google Analytics code might be wrong – a broken tracking snippet is sometimes blamed for stopping Googlebot from crawling a website. So, without wasting much time, I checked it. Fortunately, nothing was wrong there. Still, I replaced the code and pasted it in again over the old one. It doesn’t really make sense, but it satisfied me, as I didn’t want to end up back here knowing I might have missed something when I checked it earlier. Everything was fine there.

2) Check your robots.txt page in your browser, as recommended by Google. So I did: I opened my robots.txt URL in the browser.

And it didn’t show any problem at all; it was easily accessible. That only deepened the trouble, because it was working for all the other search engines but not for Google. I also checked in Bing’s Webmaster Tools and Yandex’s Webmaster Tools, and I was shocked to see that my website was working perfectly fine there. They didn’t have any problem with my robots.txt. To reconfirm, I checked my robots.txt with several online robots.txt checkers, and it was all working fine. This had certainly become a big puzzle for me at this point.
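What those online checkers do is essentially parse the file and answer “may this bot fetch this path?”. Python ships the same logic in the standard library, so you can sanity-check a robots.txt yourself. A minimal sketch, using a made-up example file rather than my actual one:

```python
import urllib.robotparser

# A hypothetical robots.txt: everyone is barred from /admin/,
# but Googlebot gets its own group that allows everything.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# The most specific matching group wins, so Googlebot is allowed
# into /admin/ while an unnamed bot falls back to the * group.
print(parser.can_fetch("Googlebot", "/admin/"))      # True
print(parser.can_fetch("SomeOtherBot", "/admin/"))   # False
```

If a file parses cleanly here and in the online checkers, as mine did, the fault almost certainly lies in delivery (server, firewall, DNS) rather than in the file’s contents.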

3) A problem with the server: the website’s firewall blocking Googlebot from crawling the website. I didn’t have any control over this; these things are usually in the hands of the hosting company.

I checked my DNS with a web-based DNS monitoring tool (a DNS checker). And there it was: my website was showing an error of two missing name servers. I opened the DNS tool and corrected it. Some SEO professionals told me that these changes may take up to 15 days to take effect. In the meantime, I kept checking with my hosting company, who said they were trying their best to resolve it from their end and were in touch with Google about the issue.

I started having this problem on 24 July 2015 and it was resolved on 15 September 2015. It was a problem with the hosting DNS that was blocking Googlebot.

Working fine now

If you are having this problem, check all these points and I am sure it will work. And be patient: it took me 54 days to resolve. If any of the points mentioned here helped you solve the issue, please share it in the comments box.
