In this article, I show you how to fix the 'Failed: Robots.txt unreachable' error in Google Search Console. Recently, Google stopped indexing my website in Google Search, and when I checked my sitemap in Search Console, I found that its status read 'Couldn't fetch'. I therefore had to troubleshoot the cause of the issue, starting with a URL inspection to see whether there were any errors with my sitemap.
I realized that my robots.txt was unreachable, and that was why my sitemap couldn't be fetched and why Google had stopped indexing my website. I also wanted to confirm that the sitemap itself was error-free, so I ran it through a sitemap validator. The check confirmed that my sitemap had no errors that could stop it from being fetched.
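If you prefer checking from your own computer rather than a web-based validator, here is a minimal sketch (I'm using a placeholder https://example.com/sitemap.xml, so replace it with your own sitemap URL) that fetches the sitemap and confirms the XML is well-formed:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical sitemap URL; replace it with your own.
SITEMAP_URL = "https://example.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    print("HTTP status:", resp.status)  # 200 means the sitemap itself is reachable
    body = resp.read()

# ElementTree raises ParseError here if the XML is malformed.
root = ET.fromstring(body)

# Count the <url> entries (or <sitemap> entries in a sitemap index), ignoring the namespace.
entries = [el for el in root.iter() if el.tag.endswith(("}url", "}sitemap"))]
print("Entries found:", len(entries))
```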
The next step after running the sitemap through the validator is to check the robots.txt file. I advise you to head to Google Search Console and inspect the file. If you don't know how to find or create a robots.txt file, you can click HERE, especially if you are using the Yoast SEO plugin in WordPress.
Once I ran the URL inspection, I found that my robots.txt file was indeed unreachable. Just as we did with the sitemap validator, also test the robots.txt file in a robots.txt validator and testing tool for any possible errors.
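You can also confirm for yourself whether the file is reachable at all, because that is essentially what the error means: Google is not getting a successful response for /robots.txt. Here is a minimal sketch, again with example.com standing in for your own domain:

```python
import urllib.error
import urllib.request

# Hypothetical domain; replace it with your own site.
ROBOTS_URL = "https://example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        print("HTTP status:", resp.status)  # 200 means the file is reachable
        print(resp.read().decode("utf-8", errors="replace")[:500])
except urllib.error.HTTPError as e:
    # 4xx/5xx answers land here; a 5xx matches one of the causes Google lists later in this post.
    print("The server answered with an error status:", e.code)
except urllib.error.URLError as e:
    # DNS failures, timeouts and refused connections end up here.
    print("Could not reach the server at all:", e.reason)
```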
As you can see in the screenshot above, my file was free of errors, yet Search Console kept showing the same thing: my sitemap status remained 'Couldn't fetch' because my robots.txt was unreachable.
This is how you fix Failed: Robots.txt unreachable in Google Search Console
By this point I had run out of possible solutions for the error. I even searched YouTube, but nothing there solved the issue. However, I followed a short guide from Google that came very close to solving the problem. According to Google, here are the three issues that could have caused it.
If you’re getting an “unreachable: robots.txt” error, it could be because:
- There’s a problem with your firewall configuration
- Your server returned a 5xx error
- Your hosting provider is blocking Googlebot
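The first cause is something your host has to look into, but you can get a rough picture of the last two yourself. The sketch below (still using the placeholder example.com) requests robots.txt twice, once as a plain client and once while identifying as Googlebot, and prints both results: a 5xx on either points to a server error, while a success for the plain request but a block for the Googlebot one suggests the host or firewall is filtering on the user agent. Keep in mind this is only a hint; a real firewall rule may key on Google's IP ranges, which this simple check can't imitate.

```python
import urllib.error
import urllib.request

ROBOTS_URL = "https://example.com/robots.txt"  # hypothetical; use your own domain

# Googlebot's published user-agent string, for comparison with a plain request.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def status_for(user_agent=None):
    """Return the HTTP status (or an 'unreachable' note) for robots.txt with the given User-Agent."""
    headers = {"User-Agent": user_agent} if user_agent else {}
    req = urllib.request.Request(ROBOTS_URL, headers=headers)
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
    except urllib.error.URLError as e:
        return f"unreachable ({e.reason})"

print("Plain request: ", status_for())
print("Googlebot UA:  ", status_for(GOOGLEBOT_UA))
```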
To cut the whole story short, I considered the last bullet and contacted my hosting provider. I don't know exactly what he did, but he told me he had reset my server's firewall and asked me to refresh and see whether the problem was solved.
I can boldly tell you that on refreshing, the problem was still there, and I was now stranded. Finally, it crossed my mind to log in to my WordPress website and play around with the permalinks, re-saving them under Settings > Permalinks. This most likely worked because re-saving permalinks flushes WordPress's rewrite rules, and on many sites robots.txt is generated virtually through those rules rather than existing as a physical file. Once I had done this, I resubmitted my sitemap and, boom, everything was perfect: the errors cleared and Google started indexing my website again after 24 hours. That is how I was able to fix the 'Failed: Robots.txt unreachable' error in Google Search Console. My sitemap status also changed from 'Couldn't fetch' to 'Success', as you can see in the screenshot below.
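If you run into the same problem, you can confirm the fix yourself before resubmitting the sitemap. A quick check that both robots.txt and the sitemap now return HTTP 200, again with placeholder URLs, is enough:

```python
import urllib.error
import urllib.request

# Hypothetical URLs; substitute your own domain and sitemap path.
URLS = [
    "https://example.com/robots.txt",
    "https://example.com/sitemap.xml",
]

for url in URLS:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{url} -> {resp.status}")  # 200 means Google should be able to fetch it
    except urllib.error.HTTPError as e:
        print(f"{url} -> {e.code}")
    except urllib.error.URLError as e:
        print(f"{url} -> unreachable ({e.reason})")
```

Once both URLs return 200, you can resubmit the sitemap in Search Console with confidence.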