I was recently checking Google’s Webmaster Tools to see if I was having any issues with this site. I discovered that every single page had a “robots.txt unreachable” error attached to it. Gotta admit I was rather surprised when I saw that. I hadn’t put in a robots.txt file when I moved over to Movable Type and really hadn’t thought about it. Not having one had never caused an issue in the past, so why should now be any different?
Imagine my surprise when I read the help file on the error:
Before we crawled the pages of your site, we tried to check your robots.txt file to ensure we didn’t crawl any pages that you had roboted out. However, your robots.txt file was unreachable. To make sure we didn’t crawl any pages listed in that file, we postponed our crawl. When this happens, we return to your site later and crawl it once we can reach your robots.txt file. Note that this is different from a 404 response when looking for a robots.txt file. If we receive a 404, we assume that a robots.txt file does not exist and we continue the crawl.
Now, I state again that I didn’t have a robots.txt file in place. Per that FAQ, a missing robots.txt file shouldn’t be an issue, but for some reason it was in this case. I quickly created an empty robots.txt file in the backend and waited to see if the error cleared. It did.
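If you’d rather not leave the file completely empty, a minimal robots.txt that explicitly allows all crawlers looks like this (the empty Disallow line means “disallow nothing”):

```
User-agent: *
Disallow:
```

Drop that in the root of the site as /robots.txt and crawlers should fetch it with a 200 response instead of whatever error Googlebot was hitting before.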
Just wanted to pass that along. Hope it helps someone else with that issue.