Search engine bots have somehow become restricted from crawling my site. I made sure that I did not check the “Block search engines, but allow normal visitors” setting inside ClickBump’s SEO module.
I also made sure that my site is not set to private in the WordPress Reading settings: the box next to “Discourage search engines from indexing this site” is unchecked.
Additionally, since I am using Yoast WordPress SEO, I made sure that none of its indexing rules are enabled; I left that section at its defaults with none of the options checked.
However, when I now check my site’s indexing in Google with the search operator site:lovephrasesx.com, I get this message back:
lovephrasesx.com
A description for this result is not available because of this site’s robots.txt
This tells me that crawling and indexing of my site are being restricted by a robots.txt file. Where did this file come from, and how was it created?
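For reference, here is a quick way to see exactly what the crawlers are being told, by fetching the robots.txt directly (a minimal sketch using Python’s standard library; I am assuming the file is served from the site root, which is where WordPress generates a virtual robots.txt when no physical file exists on disk):

    from urllib.request import urlopen

    # Fetch the robots.txt that crawlers actually see. When no physical
    # file exists, WordPress serves a "virtual" one at this URL, so its
    # contents can come from core settings or an SEO plugin's filter.
    with urlopen("http://lovephrasesx.com/robots.txt") as response:
        print(response.read().decode("utf-8"))

    # A fully blocking robots.txt typically looks like:
    #   User-agent: *
    #   Disallow: /

Running this prints the live directives, so it should show whether something on the site is emitting a blanket Disallow.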
This is very disconcerting, and I cannot make sense of it given the settings described above.
Can anyone shed some light on this for me? How do I remedy this, and how can I avoid it in the future?
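Once the cause is found and fixed, a check like the following could confirm that Googlebot is allowed again (again only a sketch, using Python’s built-in robots.txt parser and assuming the same root URL):

    from urllib.robotparser import RobotFileParser

    # Parse the live robots.txt and ask whether Googlebot may fetch the homepage.
    parser = RobotFileParser()
    parser.set_url("http://lovephrasesx.com/robots.txt")
    parser.read()

    allowed = parser.can_fetch("Googlebot", "http://lovephrasesx.com/")
    print("Googlebot allowed:", allowed)  # should print True once unblocked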