SEO Ultimate
robots.txt (3 posts)

  1. goZzee
    Member
    Posted 1 year ago #

    I just finished a new site with SEO Ultimate installed & enabled Search Engine Visibility last night so I could submit a sitemap. Webmaster Tools tells me that Googlebot is blocked & showed the following robots.txt:

    # Added by SEO Ultimate's Link Mask Generator module
    User-agent: *
    Disallow: /go/
    # End Link Mask Generator output
    
    User-agent: *
    Disallow: /
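
    From what I can tell, the second User-agent: * group with Disallow: / is the part that blocks the whole site. A quick check with Python's urllib.robotparser (just my own rough sanity check, not exactly how Googlebot reads the file) agrees:

    from urllib.robotparser import RobotFileParser

    # Only the trailing group from the generated file above
    rp = RobotFileParser()
    rp.parse(["User-agent: *", "Disallow: /"])
    print(rp.can_fetch("Googlebot", "/"))  # prints False - every path is disallowed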

    I have tried disabling Link Mask Generator & adding a new robots.txt, but that didn't work. Then I disabled the plugin entirely, which still didn't work. I have implemented a physical robots.txt in my root as per this thread:

    http://wordpress.org/support/topic/add-actual-robotstxt-or-use-virtual-robotstxt?replies=27

    http://www.nottingham-wedding-photographer.com/sitemap.xml in the browser shows the new robots.txt:

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-content
    
    Sitemap: http://www.nottingham-wedding-photographer.com/sitemap.xml

    but Webmaster Tools is still telling me that Googlebot is blocked.
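
    For what it's worth, I also ran the new rules through Python's urllib.robotparser on my machine (only a rough stand-in for Googlebot's own parser, and entirely my own check), and it agrees the homepage should be allowed now:

    from urllib.robotparser import RobotFileParser

    # The new rules from the physical file (Sitemap line left out, it doesn't affect crawling)
    rules = [
        "User-agent: *",
        "Disallow: /cgi-bin",
        "Disallow: /wp-admin",
        "Disallow: /wp-includes",
        "Disallow: /wp-content",
    ]
    rp = RobotFileParser()
    rp.parse(rules)
    print(rp.can_fetch("Googlebot", "http://www.nottingham-wedding-photographer.com/"))  # True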

    Help appreciated.

    http://wordpress.org/extend/plugins/seo-ultimate/

  2. AndrewHodges
    Member
    Posted 1 year ago #

    Have you tried this:

    User-agent: Google
    Disallow:
    
    User-agent: *
    Disallow: /

    It's not quite what you are looking for, but my thought process would be: disable the plugin and remove the txt file. If that's OK, add back this txt file, which should specifically allow Google. If that's still OK, add back your own file, and so on.

    I use this resource when I'm stuck: http://www.robotstxt.org
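
    One thing worth double-checking against robotstxt.org: as far as I remember, Googlebot matches the token "Googlebot" rather than "Google" in robots.txt, so the explicit allow group would probably need to read:

    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /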

  3. goZzee
    Member
    Posted 1 year ago #

    Thank you Andrew,

    I did try that, but to no avail. So I did a Fetch as Google & got this result:

    Fetch as Google
    This is how Googlebot fetched the page.
    URL: http://www.nottingham-wedding-photographer.com/
    Date: Wednesday, March 20, 2013 at 9:57:37 AM PDT
    Googlebot Type: Web
    Download Time (in milliseconds):
    The page could not be crawled at this time because it is blocked by the most recent robots.txt file Googlebot downloaded. Note that if you recently updated the robots.txt file, it may take up to two days before it's refreshed. You can find more information in the Help Center article about robots.txt.

    So it can take a few days to register the new robots.txt.
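
    In the meantime I can at least confirm what the server is handing out right now (another throwaway check of my own), so I know Google will pick up the physical file once its cached copy expires:

    from urllib.request import urlopen

    # Print the robots.txt currently being served, to confirm the physical file
    # (the one ending with the Sitemap line) has replaced the old virtual one
    print(urlopen("http://www.nottingham-wedding-photographer.com/robots.txt").read().decode())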

    I'm so impatient :)

Topic Closed

This topic has been closed to new replies.
