Disallowing Permalink'd page in robots.txt
I have the custom permalinks plugin and I have set a page like so…
I was wondering: is it just as simple as adding a Disallow rule for it in robots.txt if I don't want it crawled?
That is correct, but keep in mind that a Disallow rule matches by path prefix, so it will also block robots from any URL under the same structure as ‘/thisisapage1/’ — e.g. ‘/thisisapage1/foo/’ will be disallowed as well.
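To see the prefix-matching behaviour described above, here is a small sketch using Python's standard-library robots.txt parser. The `example.com` domain and the third URL are placeholders for illustration; the ‘/thisisapage1/’ path comes from the thread.

```python
# Demonstrate that "Disallow: /thisisapage1/" matches by path prefix,
# so URLs nested under that path are blocked too.
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /thisisapage1/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

for url in (
    "https://example.com/thisisapage1/",      # blocked
    "https://example.com/thisisapage1/foo/",  # also blocked (same prefix)
    "https://example.com/another-page/",      # still crawlable
):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "disallowed")
```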
Thank you very much!
Should I also disallow /tag/ and /category/? I'm guessing that helps avoid duplicate content, since the blog posts already have their own URLs, right?
I’m no SEO expert, but in my opinion search engines recognise tag and category pages and won’t penalise you for using them, so I wouldn’t worry.
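For reference, if you did decide to block the archives anyway, they would just be additional Disallow lines in robots.txt. This assumes the default WordPress /tag/ and /category/ permalink bases; adjust the paths if your site has changed them.

```
User-agent: *
Disallow: /tag/
Disallow: /category/
```

As noted above, though, most sites are fine leaving these crawlable.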