Robots.txt file blocking my website?
-
How do I unblock my website from Google within the theme? In my reading settings in the backend, I don’t have the checkmark box checked to block it from indexing on Google, so I’m not sure what’s happening.
It started happening November 30th. I’m using the Yoast SEO plugin to generate the sitemaps, but Google Search Console won’t let me resubmit them; I get errors.
The error is:
“Sitemap can be read, but has errors
General HTTP error
1 instance
We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.”

I also have a robots.txt file in my public_html folder, so files should be allowed to be read.
Not sure what’s happening! Please help.
https://support.google.com/webmasters/answer/7489871?hl=en
Do I need to regenerate the sitemap in Yoast SEO and resubmit it?
I’m not sure what else to do. My robots.txt file says this:
User-agent: *
Disallow:

The page I need help with: [log in to see the link]
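As a sanity check, Python’s standard-library robots.txt parser can confirm that the file quoted above does not block Googlebot (the example.com URL below is just a placeholder, not the actual site):

```python
from urllib.robotparser import RobotFileParser

# The exact robots.txt content from the question: an empty
# "Disallow:" line means nothing is blocked for any crawler.
robots_txt = """User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# With an empty Disallow, every URL should be fetchable by Googlebot.
print(rp.can_fetch("Googlebot", "https://example.com/any-page/"))  # True
```

If this prints True, the robots.txt itself isn’t the blocker, which points toward the sitemap URL returning an HTTP error (the “General HTTP error” in the Search Console message) rather than a crawl rule.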