• Resolved alrsl

    (@alrsl)


    Hello,

    For the past 2-3 weeks, the robots.txt file has been unreachable for Google Search Console, which blocks crawling and so on. With browsers and some online checkers I can't see any error; robots.txt and sitemap_index.xml both seem to be OK. I think there was a Yoast update around that time. Any suggestions? What should I do?

    Yours

    al

    The page I need help with: [log in to see the link]

Viewing 5 replies - 1 through 5 (of 5 total)
  • Plugin Support Maybellyne

    (@maybellyne)

    Hello @alrsl

    Thanks for reaching out about your robots.txt file. Like you, I don’t see any issues with the file as it opens correctly in the browser. Could you share a screenshot of the error/notification you got about it in the Google Search Console?

    You can use any image-sharing service like https://pasteboard.co/, https://snag.gy/, https://imgur.com/, https://snipboard.io/, or even upload the screenshot to your own website. Once you upload it to an image-sharing service, please share the link to the image here.

    Thread Starter alrsl

    (@alrsl)

    https://www.lang-rs.de/2024/07/16/robots-txt-error/

    I will delete this post soon!

    Thank you!

    Plugin Support Maybellyne

    (@maybellyne)

    At this moment, your robots.txt file loads fine in the browser. There's no directive blocking indexing, and it meets our recommended guidelines. However, before Google crawls the pages of your site, it checks that file to make sure it doesn't crawl any pages you have disallowed. It's likely your server returned a 5xx (unreachable) error when Googlebot tried to retrieve your robots.txt file, so Google postponed the crawl to avoid fetching pages listed in that file. You can read more in Google's documentation.

    Your web hosting provider may be blocking Googlebot, or there may be a problem with their firewall configuration. Please note that the contents of the robots.txt file your browser sees may differ from what Google and other bots see. Work with your web hosting provider to remove any server rules that might serve different robots.txt content to different user agents.
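    Once the file is reachable again, you can sanity-check its rules offline with Python's standard library. This is a minimal sketch using an illustrative domain (example.com) and illustrative rules, not the poster's actual file; note it only validates the content you feed it and cannot detect a firewall serving Googlebot a different file, which has to be checked on the server side.

    ```python
    from urllib.robotparser import RobotFileParser

    # Illustrative robots.txt body (typical WordPress defaults), parsed
    # offline so no network request is needed.
    robots_txt = """\
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://www.example.com/sitemap_index.xml
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # A regular page is crawlable for Googlebot (it falls under "User-agent: *")...
    print(parser.can_fetch("Googlebot", "https://www.example.com/2024/07/16/robots-txt-error/"))  # True
    # ...while the admin area stays blocked...
    print(parser.can_fetch("Googlebot", "https://www.example.com/wp-admin/"))  # False
    # ...except the explicitly allowed AJAX endpoint.
    print(parser.can_fetch("Googlebot", "https://www.example.com/wp-admin/admin-ajax.php"))  # True
    ```

    The same check with a browser-like user agent string should give identical answers, since the rules above apply to all agents.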

    Thread Starter alrsl

    (@alrsl)

    Yes, I saw a 500 error while retrieving robots.txt, but only on 28 June 2024. I think the server was down for a short time.

    I sent the same questions you pointed out to the support hotline of my hoster, 1blu. I hope they have an idea.

    Thank you for now! The issue is not yet solved.

    al

    Thread Starter alrsl

    (@alrsl)

    The cause was a firewall setting at my hoster, 1blu. The problem is solved now.

    Thank you!
