Over the past five days, I created a new website using the Weaver II version 2.0 theme. While building the site, I went into WP Admin > Settings > Reading and checked the box that says "Discourage search engines from indexing this site".
Two days ago, when I finished building this website, I went back to this box and unchecked it and saved my settings.
Subsequently, I went to my Twitter account and tweeted the new URL... I did this several times.
Today, I went to Google Search and I searched for site:www.lendingpathmortgage.net
Google responds by telling me:

"A description for this result is not available because of this site's robots.txt – learn more."
So, I went back into WP Admin to double-check that the box I mentioned above was still unchecked... and it was.
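From what I've read (this is my understanding, not something WordPress documents on that settings page), when that box is checked, WordPress of this vintage serves a virtual robots.txt that blocks all crawlers, something like:

```
User-agent: *
Disallow: /
```

And apparently Google caches a site's robots.txt for a while, so even after the box is unchecked, Google may still be acting on the old blocked version it last fetched.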
Next, I went to my hosting account (HostGator) and asked for their support. I told them about my problem and asked them to check my public_html directory to be sure we weren't blocking the URL from Googlebot.
The support team member later reported that he couldn't find any robots.txt file... which he said can sometimes be a problem... so he created one. However, he left it empty.
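For reference, my understanding is that an empty robots.txt and one like the following both mean "allow everything" (this is just a generic example, not what the support tech created):

```
User-agent: *
Disallow:
```

So the physical file itself shouldn't be blocking anything, unless Google is still working from an older cached copy.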
Several hours later, I went back to Google and searched site:www.lendingpathmortgage.net again... I got the same response listed above.
Does anybody know what may be wrong and how to fix this?
Could this be a theme issue or is it a hosting platform issue?
HELP... I'm lost!