Transposh WordPress Translation
Best Robots.txt To avoid duplicates (2 posts)

  1. sureshpeters
    Posted 3 years ago #

    Hello ,

    I am using the latest version of the Transposh plugin. It's all working fine, but some duplicate-content issues have shown up in Webmaster Tools, and I got this message from the Google Webmaster team:
    "Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors."

    I changed my robots.txt and here is my site http://softwarebuzzer.com/robots.txt

    Is there a good robots.txt I can use to avoid this problem?

    On top of all this, my posts are not being indexed quickly; it's taking hours for them to get indexed.

    Thanks for the support
    [ Signature moderated. ]


  2. Ofer Wald
    Plugin Author

    Posted 3 years ago #

    Hello @Suresh,

    Please dig deeper into the log and find the URLs that have authorization permission errors (403 in your logs).
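    If your server writes a standard combined-format access log, one way to surface those URLs is a quick one-liner like the following (the log path is just an example; adjust it for your host):

    ```shell
    # List distinct request paths that returned 403, most frequent first.
    # In the Apache/Nginx "combined" log format, field 9 is the status code
    # and field 7 is the request path. The log path below is an assumption.
    awk '$9 == 403 { print $7 }' /var/log/apache2/access.log | sort | uniq -c | sort -rn
    ```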

    Assuming you have a page that requires authorization and you had a link to it, Transposh will multiply the number of links to that page by the number of languages.

    This can be solved by specifying that URL in your robots.txt, or probably just by removing said link.
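    As a sketch, assuming the protected page lives at a hypothetical path /members-only/ and Transposh is set to prefix translated URLs with the language code, the robots.txt entries might look like this (the path and language codes are placeholders, not taken from this thread):

    ```
    User-agent: *
    # Block the protected page itself (hypothetical path)
    Disallow: /members-only/
    # Block the translated variants Transposh generates
    # (one line per enabled language)
    Disallow: /fr/members-only/
    Disallow: /es/members-only/
    ```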

    Not sure this is a real Transposh issue...


Topic Closed

This topic has been closed to new replies.
