• If it’s absolutely wrong for me to ask here, please tell me to go away.

    I just thought to myself: where else could I ask about this, if not in an SEO experts' group!

    The problem is that the more I google for info and the more I read, the more confused I get about what robots.txt should look like for WordPress these days… All I took from googling around is that it's quite important to have the right robots.txt for SEO, and that if it's formatted properly it can have a huge impact on how highly the website ranks.

    If anyone would be kind enough to point me in the right direction, or post a sample bulletproof, SEO-friendly robots.txt template here, I would really appreciate it.

    Kind Regards

    http://wordpress.org/plugins/all-in-one-seo-pack/

  • Thread Starter Kramarz

    (@kramarz)

    I mean…

    In the WordPress Codex they suggest something like this:

    Sitemap: http://www.example.com/sitemap.xml

    # Google Image
    User-agent: Googlebot-Image
    Disallow:
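    # (an empty Disallow means nothing is blocked for this agent)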
    Allow: /*

    # Google AdSense
    User-agent: Mediapartners-Google
    Disallow:

    # digg mirror
    User-agent: duggmirror
    Disallow: /

    # global
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/plugins/
    Disallow: /wp-content/cache/
    Disallow: /wp-content/themes/
    Disallow: /trackback/
    Disallow: /feed/
    Disallow: /comments/
    Disallow: /category/*/*
    Disallow: */trackback/
    Disallow: */feed/
    Disallow: */comments/
    Disallow: /*?
    Allow: /wp-content/uploads/
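    # Note: 'Allow:' and '*' wildcards are extensions honored by major
    # crawlers like Google and Bing, not part of the original standard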

    Does it look right, or is that too much?

    If anyone could throw in a tip, please 🙁

    Hi Kramarz,

    I think in general it's fine to disallow internal directories like wp-admin, wp-includes, and the plugins, themes, and cache directories under wp-content, as the Codex example does. I can also see why you might block feed and trackback, especially against badly behaved bots, and some people disallow comments and category as well because they worry about duplicate content. The most important thing, though, is validating your robots.txt to make sure it actually works and specifies what you expect. All in One SEO Pack Pro currently has an additional module for cleaning up and validating robots.txt files, although it still depends on you to set up the rules the way you like them.
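    If you want a quick sanity check outside of any plugin or webmaster tool, here is a rough Python sketch of the matching behavior Google documents for robots.txt: the longest matching pattern wins, Allow wins ties, and '*' acts as a wildcard. The rule list below is just a hand-picked subset of the Codex example above, and the function names are mine; this is an illustration, not what any plugin or crawler actually runs:

    import re

    def compile_rule(path):
        """Turn a robots.txt path pattern into an anchored regex,
        supporting the '*' wildcard and a trailing '$' anchor."""
        anchored = path.endswith("$")
        if anchored:
            path = path[:-1]
        # Escape everything except '*', which becomes '.*'
        regex = ".*".join(re.escape(part) for part in path.split("*"))
        return re.compile("^" + regex + ("$" if anchored else ""))

    # (allow?, pattern) pairs -- a subset of the "global" group above
    RULES = [
        (False, "/cgi-bin/"),
        (False, "/wp-admin/"),
        (False, "/wp-content/plugins/"),
        (False, "/*?"),
        (True,  "/wp-content/uploads/"),
    ]

    def allowed(url_path):
        """Longest matching pattern wins; Allow wins a tie."""
        matches = [(len(pattern), allow)
                   for allow, pattern in RULES
                   if compile_rule(pattern).match(url_path)]
        if not matches:
            return True  # no rule matches => crawling is allowed
        return max(matches)[1]  # True sorts above False on equal length

    for path in ["/wp-admin/options.php", "/wp-content/uploads/pic.jpg",
                 "/?s=search", "/about/"]:
        print(path, "->", "allowed" if allowed(path) else "blocked")

    One caveat: as far as I know, Python's built-in urllib.robotparser follows the original 1994 spec and doesn't understand the '*' wildcard extension, so it may disagree with Google's own tester on rules like Disallow: /*?.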

    Thread Starter Kramarz

    (@kramarz)

    Thanks, Peter, for bringing a little light into the darkness. I still can't see the end of the road, but that's a good starting point.

  • The topic ‘Any robots.txt tips? :-(’ is closed to new replies.