• I tried submitting my sitemap to Google Search Console and received the following error:

    “Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.”

    I have searched my site and page source for robots.txt and can find no problems. I use Yoast SEO, and the settings in that plugin also appear correct: everything is being indexed and nothing is set to noindex. Can anyone help me resolve this? What could be causing the problem?

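    One quick way to verify whether the file is actually reachable from outside, rather than relying only on the Search Console report, is to request it directly. Below is a minimal sketch in Python; the URL is a placeholder for the real site, and the user-agent string is only a rough stand-in for Googlebot:

        # Fetch robots.txt directly and report what the server returns.
        import urllib.request
        import urllib.error

        url = "https://example.com/robots.txt"  # replace with the real site URL
        req = urllib.request.Request(
            url,
            headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
        )

        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                print("HTTP status:", resp.status)             # a healthy file returns 200
                print(resp.read().decode("utf-8", "replace"))  # the rules actually served
        except urllib.error.HTTPError as e:
            print("Server answered with an error:", e.code)    # e.g. 403 from a firewall rule
        except urllib.error.URLError as e:
            print("Could not reach the server at all:", e.reason)

    If this prints a 200 status and the expected rules, the file itself is fine and the failure is more likely intermittent or specific to Google's crawler; an error or timeout here points instead at the server, CDN, or a security plugin blocking the request.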

  • Moderator t-p

    (@t-p)

    I recommend asking at https://wordpress.org/support/plugin/wordpress-seo so the plugin’s developers and support community can help you with this.

    Thread Starter sommerchaka

    (@sommerchaka)

    I work with many sites, all of which use the Yoast plugin, and none of them has this problem. That is why I am wondering if the issue is within the WordPress theme itself. As far as I am aware, I don't have any other plugins that affect robots.txt. I am just unsure how to find the root of the problem so that I can fix it. If anyone is able to identify what is causing this, I would greatly appreciate it!

    Moderator t-p

    (@t-p)

    wondering if the issue is within the WordPress theme itself

    What’s the full name of your theme?
    Downloaded from?

  • The topic ‘Robots.txt Unreachable Error’ is closed to new replies.