Hi @jvsouthwood
Upon checking your website https://londonsavate.co.uk/, we can see it is accessible and running the latest version of Yoast SEO, v16.8.
Checking your robots.txt file, it doesn’t look like it is using the standard directives or rules, which might be causing an issue:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json
You may want to check Google’s robots.txt file tester to get more information – https://support.google.com/webmasters/answer/6062598
You might need to make changes to your robots.txt file to ensure that the crawlers and bots aren’t disallowed and will be able to crawl your website – https://yoast.com/help/how-to-edit-robots-txt-through-yoast-seo/ & https://yoast.com/ultimate-guide-robots-txt/
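Besides Google’s tester, you can also sanity-check a robots.txt offline. The sketch below uses Python’s standard-library `urllib.robotparser` with the default WordPress rules quoted in this thread (the site URL is taken from the thread; note that Python’s parser applies rules in order, so only the unambiguous paths are checked here):

```python
# Sketch: parse the default WordPress robots.txt rules and check
# which paths a crawler may fetch (site URL taken from this thread).
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Allow: /wp-admin/admin-ajax.php",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The homepage should be crawlable; /wp-admin/ should not be.
print(rp.can_fetch("Googlebot", "https://londonsavate.co.uk/"))
print(rp.can_fetch("Googlebot", "https://londonsavate.co.uk/wp-admin/"))
```

If the file failed to parse, or a path you expect to be crawlable came back blocked, that would point to a genuine rules problem rather than a fetching problem.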
It was using the Google robots.txt tester and not getting any joy (as mentioned in my original opening paragraph) that prompted me to ask why this might be happening.
When you say ‘doesn’t look like it is using standard rules’, what in particular are you seeing? I am using a recommended version. Thanks.
Hi,
This is the default WordPress robots.txt file:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
We’re not familiar with this other directive you’ve added:
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json
When you test your robots.txt file in Google’s robots.txt tester, could you share more information on exactly what it is reporting? Perhaps there is information that could point to why Google is unable to fetch your robots.txt file.
Yep, the result in Google’s robots.txt tester is the title of this thread:
Robots.txt cannot be fetched
You have a robots.txt file that we are currently unable to fetch. In such cases we stop crawling your site until we get hold of a robots.txt, or fall back to the last known good robots.txt file.
As I said, this all started when an update to Yoast took my site down on 27 July, so I am trying anything, any ideas at all, to get my indexing back.
(Disallow: /wp-content/uploads/wpo-plugins-tables-list.json is from WP-Optimize, to guard against access to its plugins list; it’s fine and recommended, but I have removed the plugin now to eliminate possible causes.)
Any ideas why Google will not even look at my robots.txt in live testing? Has anyone come across this before?
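For what it’s worth, the WP-Optimize directive quoted above is an ordinary Disallow rule. A quick sketch with Python’s standard-library `urllib.robotparser` (the json path is from this thread; the photo URL is a hypothetical example) shows it only blocks that single file, not the rest of the uploads directory:

```python
# Sketch: the WP-Optimize directive blocks only the one .json file,
# leaving the rest of /wp-content/uploads/ crawlable.
# The .json path is from this thread; photo.jpg is a hypothetical example.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /wp-content/uploads/wpo-plugins-tables-list.json",
])

print(rp.can_fetch("*", "https://londonsavate.co.uk/wp-content/uploads/wpo-plugins-tables-list.json"))
print(rp.can_fetch("*", "https://londonsavate.co.uk/wp-content/uploads/photo.jpg"))
```

So the directive itself is unlikely to be the reason Google cannot fetch the file; a fetch failure usually indicates a server-level (e.g. 5xx) response rather than a rules problem.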
Thanks for your confirmation. While it might be a coincidence that you’re seeing the issue in Google Search Console after the Yoast SEO plugin update on 27th July, we don’t see any issues at all with the robots.txt file on your website. We can confirm that it’s perfectly fine and crawlable.
So, it might be an issue specific to Google Search Console. You may want to try resubmitting the robots.txt or reaching out to Google Webmaster Support for further assistance with this.
Note: we just retested your website using the Google Mobile-Friendly Test tool and can confirm that your website is crawlable once again. You can test it yourself here: https://search.google.com/test/mobile-friendly?id=nek75lINniMjFOMmNOR-OQ
Hi, it seems to be fixed, yes. Thanks for alerting me 🙂
After trying everything, I come away thinking that the coincidence was that the bad gateway error caused by the plugin update (fixed within 10–20 minutes) just *happened* to coincide with a Google crawl, and that subsequently Google decided the site didn’t exist for a few days. All’s well now though.