Hello @alrsl
Thanks for reaching out about your robots.txt file. Like you, I don’t see any issue with it: the file opens correctly in the browser. Could you share a screenshot of the error or notification you received about it in Google Search Console?
You can use any image-sharing service like https://pasteboard.co/, https://snag.gy/, https://imgur.com/, https://snipboard.io/, or even upload the screenshot to your own website. Once it’s uploaded, please share the link to the image here.
Thread Starter
alrsl
(@alrsl)
At this moment, your robots.txt file loads fine in the browser. There’s no directive blocking indexing, and it meets our recommended guidelines. However, before Google crawled the pages of your site, they tried to check the file to make sure they wouldn’t crawl any pages you had blocked via robots.txt. It’s likely your server returned a 5xx (server unreachable) error when they tried to retrieve your robots.txt file. To avoid crawling any pages disallowed in that file, they postponed the crawl. You can read more in Google’s documentation.
Your web hosting provider may be blocking Googlebot, or there may be a problem with the configuration of their firewall. Please note that the contents of the robots.txt file may be different in your browser from what Google and other bots/spiders see. Work with your web hosting provider again to remove any server rules that might serve different robots.txt content to different user agents.
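To illustrate how a firewall rule like this plays out, here is a minimal, self-contained Python sketch. It spins up a local test server with a hypothetical rule that returns 500 to any request whose User-Agent contains "Googlebot" while serving robots.txt normally to a browser-like client. Real firewalls match on many signals besides the User-Agent header, so this is only a simulation of the symptom, not a reproduction of any specific host's configuration:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RobotsHandler(BaseHTTPRequestHandler):
    """Simulates a misconfigured firewall: 500 for Googlebot, 200 otherwise."""

    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if "Googlebot" in ua:
            # Hypothetical firewall rule blocking Google's crawler.
            self.send_response(500)
            self.end_headers()
        else:
            body = b"User-agent: *\nDisallow:\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep output quiet

def fetch_status(url, user_agent):
    """Fetch a URL with a given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

server = HTTPServer(("127.0.0.1", 0), RobotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/robots.txt"

browser_status = fetch_status(url, "Mozilla/5.0")
googlebot_status = fetch_status(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
print(browser_status, googlebot_status)  # 200 500
server.shutdown()
```

This is exactly why the file can look fine in your browser while Google reports a fetch error: the two clients receive different responses. A quick real-world check along the same lines is to request your robots.txt with a Googlebot-style User-Agent string and compare the status code with what your browser gets.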
Thread Starter
alrsl
(@alrsl)
Yes, I saw a 500 error while retrieving robots.txt, but only on 28.6.24. I think the server was down for a short time.
I sent the same questions you raised to the support hotline of my hosting provider, 1blu. I hope they have an idea.
Thank you for now! The issue is not solved yet.
al
Thread Starter
alrsl
(@alrsl)
The cause was a firewall setting at my hosting provider, 1blu. The problem is now solved.
Thank you!