• Folks, I have two sitemaps: one for the root site and one for /blog.

    From my robots.txt file below, am I correct in saying the following:
    1) All bots are allowed to visit the site?
    2) All agents are disallowed from visiting certain root-level directories and WP directories that I do not want them visiting?
    3) My WP blog will be visited and indexed OK?
    I would greatly appreciate any feedback. I have read through the forum and grabbed some of the code below from other posts on similar issues (a quick local check of the rules is sketched after the file).
    Thanks
    T
    ****begin Robots.txt *****

    Sitemap: http://www.mysite.com/sitemap.xml
    Sitemap: http://www.mysite.com/blog/sitemap.xml

    # Google Image
    User-agent: Googlebot-Image
    Disallow:
    Allow: /*

    # global
    User-agent: *
    Allow: /
    Disallow: /cgi-bin/
    Disallow: /oldsite/
    Disallow: /js/
    Disallow: /ha-artwork/
    Disallow: /webalizer/
    Disallow: /scripts/
    Disallow: /cp/
    Disallow: /blog/wp-admin/
    Disallow: /blog/wp-includes/
    Disallow: /blog/wp-content/plugins/
    Disallow: /blog/wp-content/cache/
    Disallow: /blog/wp-content/themes/
    Disallow: /blog/trackback/
    Disallow: /blog/feed/
    Disallow: /blog/comments/
    Disallow: /blog/category/*/*
    Disallow: */comments/
    Disallow: /blog/*?
    Allow: /blog/wp-content/uploads/
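
    In case it helps, here is a minimal sketch of how the plain prefix rules above could be checked locally with Python's standard urllib.robotparser. The URLs are made-up examples, and that parser does not understand Google's wildcard extensions, so lines like "Disallow: /blog/*?" are left out of the test.

    ****begin check.py *****

    import urllib.robotparser

    # Only the plain prefix rules from the file above; wildcard lines such as
    # "Disallow: /blog/*?" are omitted because this parser treats paths literally.
    RULES = [
        "User-agent: *",
        "Disallow: /cgi-bin/",
        "Disallow: /blog/wp-admin/",
        "Disallow: /blog/wp-includes/",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(RULES)

    # Made-up example URLs -- True means a generic crawler may fetch the page.
    for url in (
        "http://www.mysite.com/",                    # expect True
        "http://www.mysite.com/blog/hello-world/",   # expect True
        "http://www.mysite.com/blog/wp-admin/",      # expect False
    ):
        print(url, "->", parser.can_fetch("*", url))

    Since parse() accepts the rules as a list of lines, nothing has to be fetched over the network for this check.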
