Robots.txt Unreachable Error
I tried submitting my sitemap to Google Search Console and received the following error:
“Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.”
I have searched my site and page source for robots.txt and can find no problems. I use Yoast SEO, and the settings in that plugin appear to be correct as well: everything is being indexed and nothing is set to noindex. Can anyone help me resolve this? What could be causing this problem?
The page I need help with: [log in to see the link]
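Since the error says the file was found but could not be downloaded, one way to narrow this down is to fetch robots.txt yourself and check the HTTP status the server actually returns. Below is a minimal, hypothetical Python sketch of such a check; `check_robots` and the base URL are placeholders, not anything Google or Yoast provides. A firewall or security plugin blocking Googlebot specifically would still pass this check, so it only rules out general unreachability.

```python
# Minimal sketch: fetch <site>/robots.txt and return the HTTP status code.
# If this raises URLError, the file is unreachable at the network level,
# which matches the "robots.txt unreachable" message from Search Console.
from urllib import request


def check_robots(base_url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for base_url/robots.txt."""
    req = request.Request(
        base_url.rstrip("/") + "/robots.txt",
        # Plain user agent; a real test of Googlebot blocking would need
        # Googlebot's user-agent string and IP ranges.
        headers={"User-Agent": "robots-txt-check/1.0"},
    )
    with request.urlopen(req, timeout=timeout) as resp:
        return resp.status
```

Running `check_robots("https://your-site.example")` should print `200` if the file is reachable; a timeout, 403, or 5xx response would point at the server or a security plugin rather than your Yoast settings.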
Viewing 3 replies - 1 through 3 (of 3 total)
- The topic ‘Robots.txt Unreachable Error’ is closed to new replies.