@msimonetta You’re blocking Google, and all other search engines, from your site using your robots.txt – https://www.mommunitymarket.com/robots.txt
The rule Disallow: / blocks access to the entire site.
Thanks @wpsmort – I’ve since changed it to:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.mommunitymarket.com/sitemap.xml
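As a quick offline sanity check of rules like these, you can paste them into Python’s standard urllib.robotparser (a sketch – nothing is fetched over the network, and the /wp-admin/settings.php URL below is just a made-up example of an admin path):

```python
# Offline check that the corrected rules no longer block crawlers
# from the site or its sitemap, using Python's stdlib robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.mommunitymarket.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches the wildcard "*" group here.
print(rp.can_fetch("Googlebot", "https://www.mommunitymarket.com/"))                      # expect True
print(rp.can_fetch("Googlebot", "https://www.mommunitymarket.com/product-sitemap.xml"))   # expect True
print(rp.can_fetch("Googlebot", "https://www.mommunitymarket.com/wp-admin/settings.php")) # expect False
```

One caveat: Python’s parser applies rules in first-match order, so it won’t reflect Google’s longest-match handling of the Allow line for admin-ajax.php – for that, Google’s own robots.txt tester is authoritative.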
I have resubmitted my sitemaps, but Google Search Console still says “couldn’t fetch”. Is there anything I’m missing or have done incorrectly here?
@msimonetta That’s not what I see for your robots.txt. I see this:
User-agent: *
Disallow: /wp-admin/
Sitemap: https://www.mommunitymarket.com/sitemap.xml
Sitemap: https://www.mommunitymarket.com/page-sitemap.xml
Sitemap: https://www.mommunitymarket.com/product-sitemap.xml
Sitemap: https://www.mommunitymarket.com/product_cat-sitemap.xml
Sitemap: https://www.mommunitymarket.com/product_tag-sitemap.xml
It’s possible you have some aggressive caching going on. You’ll want to make sure your robots.txt and sitemap aren’t cached in any way.
You’ll then need to be patient, as it could take a while before that error clears from GSC.
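To illustrate why caching matters here: if Google is still being served the old cached file, its Disallow: / rule blocks everything, including the sitemaps, no matter what the corrected file says. A minimal comparison (a sketch, again using Python’s stdlib parser, nothing fetched):

```python
# Compare the old (cached) robots.txt against the corrected one:
# the old "Disallow: /" blocks the sitemap, the new rules do not.
from urllib.robotparser import RobotFileParser

def allowed(rules, url, agent="Googlebot"):
    """Return whether `agent` may fetch `url` under the given rules."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

old = "User-agent: *\nDisallow: /\n"           # the original, blocking file
new = "User-agent: *\nDisallow: /wp-admin/\n"  # the corrected file

sitemap = "https://www.mommunitymarket.com/product-sitemap.xml"
print(allowed(old, sitemap))  # expect False: a stale cached copy still blocks GSC
print(allowed(new, sitemap))  # expect True: the corrected rules allow it
```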
@wpsmort thanks again for the info. I’m confident my current robots.txt should not prevent Google from crawling/fetching/indexing. However, I just ran a URL Inspection test on https://www.mommunitymarket.com/product-sitemap.xml and GSC still states it is blocked by robots.txt. Is this simply a waiting game now, or is my robots.txt still not set up correctly?
@msimonetta I suspect it’s the caching that I saw earlier. Watch over the next few days and see whether Google can reach your sitemap. There shouldn’t be any reason why it can’t now.