Support » Plugin: Yoast SEO » Robots.txt file blocking site from being indexed?

  • [Resolved] haltek18 (@haltek18)


    So, I used your plugin to generate sitemaps and submitted them to Google Search Console. A few weeks ago I suddenly started getting errors on the XML files I had submitted: “We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.” I’m fairly sure my sitemaps meet all the guidelines, and when I click through to the reason, Google says my robots.txt file is restricting my content.

    When I checked a few weeks ago, I had no robots.txt file at all, so I created an empty one, but I have no idea what to put in it to get my sitemaps indexing again. Why is this happening? I know Google recently switched to a new Search Console interface, so I wondered whether this is a bug in that interface or something in the plugin itself, but I see the same errors in the old interface, so I don’t think that’s it.

    I posted in the Google Help forums and they said it was a bug in the new console, which I’m not 100% convinced of. I also contacted my WordPress theme’s support and confirmed that my Reading settings do not block my site from being indexed by Google; they said the theme has nothing to do with it. So I’m left thinking it’s either this plugin or Google. Has anyone else experienced this and knows what to do in this situation? Thank you.
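
    For reference, from what I’ve read, a minimal robots.txt that doesn’t block anything and points crawlers at the Yoast sitemap index (which I believe sits at /sitemap_index.xml by default, with example.com standing in for my actual domain) would look something like this, though I’m not sure if this is what I should be using:

        # Apply to all crawlers; an empty Disallow line means nothing is blocked
        User-agent: *
        Disallow:

        # Point crawlers at the Yoast sitemap index (assumed default location)
        Sitemap: https://example.com/sitemap_index.xml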

    The page I need help with: [log in to see the link]

Viewing 1 replies (of 1 total)
  • The topic ‘Robots.txt file blocking site from being indexed?’ is closed to new replies.