I am using the latest version of the Transposh plugin. It's all working fine, but some duplicate-content issues have shown up in Webmaster Tools, and I got this message from the Google Webmaster team:
“Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors.”
I changed my robots.txt; here is my site's file: http://softwarebuzzer.com/robots.txt
Is there a good robots.txt configuration I can use to avoid this problem?
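For context, what I have in mind is something along these lines. This is only a sketch: the language-prefix paths (/es/, /fr/) and the sitemap URL are placeholders, and Transposh's actual URL structure on my site may differ.

```
# Hypothetical sketch -- adjust paths to match the URLs
# Transposh actually generates on your site.
User-agent: *
# Block crawling of translated duplicates (placeholder language prefixes):
Disallow: /es/
Disallow: /fr/
# Keep the sitemap discoverable (placeholder URL):
Sitemap: http://softwarebuzzer.com/sitemap.xml
```

I'm not sure whether blocking the translated paths outright is the right approach, or whether that could itself trigger the crawl errors Google mentioned.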
On top of all this, my posts are not getting indexed promptly; it takes hours for them to be indexed.
Thanks for the support.
[ Signature moderated. ]
- The topic ‘[Plugin: Transposh WordPress Translation] Best Robots.txt To avoid duplicates’ is closed to new replies.