Support » Plugin: SEOPress » Getting “Couldn’t fetch” status and “Sitemap could not be read” errors

  • Resolved craiggroshek

    (@craiggroshek)


    Hello, amazing plugin! I am having an issue with Google Search Console recognizing and crawling the sitemap and any pages/posts created since enabling your wonderful plugin.

    Basically, I am getting “Couldn’t fetch” status and “Sitemap could not be read” errors on the Console even after flushing permalinks.

    My sitemap is at: https://www.creepypasta.com/sitemaps.xml

    My robots file is: https://www.creepypasta.com/robots.txt

    Google says they indexed the sitemap, but that it was blocked by robots.txt. I can’t see anything in the robots.txt file, however, that would do this.

    Under “Crawl allowed?” it returns the status: “No: blocked by robots.txt”

    And under “Page Fetch”, it returns: “Failed: Blocked by robots.txt”

    And status says “Couldn’t fetch” for every URL in the sitemap.

    What could the issue be?
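    One way to narrow this down is to reproduce Google’s “Crawl allowed?” check locally with Python’s standard-library robots.txt parser. The rules below are an assumption for illustration (a typical WordPress robots.txt), not the live contents of the site’s file; swap in `set_url()` and `read()` to test the real one.

    ```python
    # Hedged sketch: check whether Googlebot may fetch given URLs under a
    # robots.txt ruleset, using Python's stdlib parser. The rules here are
    # assumed, not the actual file at creepypasta.com/robots.txt.
    from urllib.robotparser import RobotFileParser

    robots_txt = """\
    User-agent: *
    Disallow: /wp-admin/
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # The sitemap and ordinary posts should come back True; if they come
    # back False, some rule (or a stray blanket Disallow) is blocking them.
    for url in ("https://www.creepypasta.com/sitemaps.xml",
                "https://www.creepypasta.com/some-post/",
                "https://www.creepypasta.com/wp-admin/"):
        print(url, "->", parser.can_fetch("Googlebot", url))
    ```

    To test the live file instead, call `parser.set_url("https://www.creepypasta.com/robots.txt")` followed by `parser.read()`. Note that a server returning a 5xx error for robots.txt (or blocking Google’s IPs at the firewall/CDN level) also produces “blocked by robots.txt” in Search Console even when the file itself looks clean.
    
    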

Viewing 3 replies - 1 through 3 (of 3 total)