Support » Fixing WordPress » robots.txt causing problems in a search in Google

  • Hi,

    I know it's an old topic and I have read many threads on this robots.txt issue… but I have tried everything and it is still not working.

    I started a blog about 3 months ago. At the beginning, I blocked indexing by search engines through the Privacy settings in WordPress.

    About a month ago, after the site had some posts and data, I changed the privacy settings to allow indexing and give search engines access. However, after waiting for weeks, searching in Google still shows the following error message:
    My Spanish Experience – A description for this result is not available because of this site’s robots.txt – learn more.

    If you click on the robots.txt link, the browser shows the following text:
    User-agent: *
    Disallow: /%&%&%/wp-admin/
    Disallow: /%&%&%/wp-includes/
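
    From what I have read, when there is no physical robots.txt file, WordPress generates this response on the fly, so the lines above seem to come from WordPress (or a plugin) rather than from an actual file on the server. If I understand correctly, a permissive robots.txt I could upload to the site root to override it might look something like this (just a sketch of what I believe a normal WordPress robots.txt contains):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/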

    I have tried the following, with no luck:
    1.- disabling all the plugins
    2.- changing the name of the plugins folder through FTP
    3.- looking through FTP to see whether I actually have a robots.txt file anywhere on my site; there is no physical file

    I read on a blog once that you have to edit some file inside the WordPress installation, but I am afraid of touching the wrong thing…
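
    If I understand that blog correctly, it probably meant hooking WordPress's robots_txt filter, for example from the active theme's functions.php. A sketch of what I think that would look like (I have not tried this, so the exact code is my guess):

    // Hypothetical sketch: override the virtual robots.txt that WordPress
    // serves when no physical file exists, via the robots_txt filter.
    add_filter( 'robots_txt', function ( $output, $public ) {
        if ( $public ) { // $public is truthy when the site allows indexing
            $output = "User-agent: *\nDisallow: /wp-admin/\n";
        }
        return $output;
    }, 10, 2 );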

    Any thoughts or help will be welcome…


Viewing 1 replies (of 1 total)
  • The topic ‘robots.txt causing problems in a search in Google’ is closed to new replies.