Transposh WordPress Translation
Best Robots.txt To avoid duplicates (2 posts)

  1. sureshpeters
    Member
    Posted 1 year ago #

    Hello,

    I am using the latest version of the Transposh plugin. It's all working fine, but some duplicate-content issues have shown up in Webmaster Tools, and I got this message from the Google Webmaster team:
    "Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors."

    I changed my robots.txt; here is my site's file: http://softwarebuzzer.com/robots.txt

    Is there a good robots.txt I can use to avoid this problem?

    Because of all this, my posts are not getting indexed quickly; it's taking hours for a post to be indexed.
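
    Something along these lines is roughly what I'm hoping for as a starting point (just an illustrative sketch, not my actual file; the sitemap line is a placeholder URL):

        User-agent: *
        # keep crawlers out of the WordPress core directories
        Disallow: /wp-admin/
        Disallow: /wp-includes/
        # point crawlers at the sitemap (placeholder URL)
        Sitemap: http://www.example.com/sitemap.xml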

    Thanks for the support
    Suresh
    [ Signature moderated. ]

    http://wordpress.org/extend/plugins/transposh-translation-filter-for-wordpress/

  2. Ofer Wald
    Member
    Plugin Author

    Posted 1 year ago #

    Hello @Suresh,

    Please dig deeper into the log and find the URLs that have authorization permission errors (403s in your logs).

    Assuming you have a page requiring authorization and a link to it somewhere, Transposh will multiply the number of links to that page (by the number of languages).

    This can be solved by specifying that URL in your robots.txt, or probably just by removing said link.
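
    For example, if the page requiring authorization were /members/ (a placeholder path, not taken from your site) and Transposh is set to language-prefixed permalinks, the robots.txt entries might look roughly like this:

        User-agent: *
        # placeholder path for the page that returns 403 to crawlers
        Disallow: /members/
        # Transposh prefixes the translated copies with the language code,
        # so each enabled language needs its own line...
        Disallow: /es/members/
        Disallow: /fr/members/
        # ...or one wildcard rule, for crawlers such as Googlebot that
        # support * in Disallow paths:
        Disallow: /*/members/

    Keep in mind that robots.txt only stops crawling; it does not by itself remove URLs that are already in the index.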

    Not sure this is a real Transposh issue...

    Best.

Topic Closed

This topic has been closed to new replies.
