SEO Tip for Fast Secure Contact Form:
Problem: There are two links used only for the CAPTCHA that could potentially get indexed by search engine spiders like Googlebot, MSNbot, etc.: the “CAPTCHA Audio” and “CAPTCHA Refresh” links.
There is no need to have these two links indexed by a search engine. In version 2.8.2, I added rel=”nofollow” to the links so search engines will avoid indexing them, but some sites may already have the links indexed.
It is recommended to add this to your website’s robots.txt:
Note: It may take some time for the indexed links to go away; this is normal.
More information on robots.txt:
Oops, here is the correct robots.txt for this plugin:
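The snippet that followed this post was not preserved in this copy of the thread. Based on the path discussed later in the thread (/captcha-secureimage/), the recommended rule was presumably something like the following; the parent directory shown here is an assumption, since the thread only names the /captcha-secureimage/ segment:

```text
User-agent: *
Disallow: /wp-content/plugins/si-contact-form/captcha-secureimage/
```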
Hi Mike. I came to the forum looking for info on this very topic, so I’m glad to see that you’ve already addressed it. Two quick questions:
1. Two pages/links from one of my sites have already been indexed:
Will the “disallow” that you posted take care of both, or do I need to add a second disallow for the “background” subdirectory as well?
2. I use the XML Sitemap Generator plugin (v3.2.4), and just read this in my settings: “The virtual robots.txt generated by WordPress is used. A real robots.txt file must NOT exist in the blog directory!” I’m guessing that means that I won’t be able to access the robots.txt file that you suggested modifying, however, there is an option in the XML Sitemap plugin for “Excluding Posts/Pages” by ID. Do you think adding the two indexed links to this plugin option will achieve the same result as what you’re proposing?
Thanks for your help!
The answer to 1 is yes; the “Disallow” will take care of both, since it disallows anything in /captcha-secureimage/, including subfolders.
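You can verify the subfolder behavior yourself with Python’s standard-library robots.txt parser: a Disallow rule is a path prefix, so it covers every deeper path. The full plugin path and file names below are illustrative assumptions; the thread only names the /captcha-secureimage/ segment.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule modeled on the one discussed in this thread.
rules = [
    "User-agent: *",
    "Disallow: /wp-content/plugins/si-contact-form/captcha-secureimage/",
]

rp = RobotFileParser()
rp.parse(rules)

base = "https://example.com/wp-content/plugins/si-contact-form/captcha-secureimage/"

# The folder itself and its subfolders are both blocked,
# because Disallow matches any URL path starting with the prefix.
print(rp.can_fetch("*", base + "securimage_play.php"))
print(rp.can_fetch("*", base + "backgrounds/bg.jpg"))

# An unrelated page on the same site is still crawlable.
print(rp.can_fetch("*", "https://example.com/about/"))
```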
2. I looked at their FAQ and found this (so the answer is yes):
Google Sitemaps and robots.txt
You can use the robots.txt file to inform search engines about your sitemap. If you activate this option at the administration panel, the plugin will try to create the file in your blog root. The “File permissions” status below the checkbox will give you a hint if this is possible or not. If the robots.txt file cannot be generated due to insufficient file permissions, please create the robots.txt file by yourself and make it writable via CHMOD. A good tutorial for changing file permissions can be found on the WordPress Codex. The plugin will NOT delete your existing robots.txt file but append the new values at the end.
Thanks Mike. My problem is that in order to exclude pages/posts using the XML Sitemap Generator plugin, it requires me to enter the page IDs. But since these two SI Contact Form captcha links aren’t actual pages that I created in WordPress, I’m unable to locate the IDs for them. Any ideas? Here’s some more info in case it helps:
Per the plugin FAQ page:
Exclude posts or pages
Here you can enter the IDs of posts or pages which will not be included in your sitemap. You can see the IDs of the post or pages in the corresponding management pages. Separate multiple IDs by comma.
Screenshot of plugin settings:
The FAQ says you can make a robots.txt and have it in the site root folder; the sitemap generator will append to it.
So put this in the robots.txt in your site root folder:
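The snippet itself was not preserved in this copy of the thread. Based on the disallow discussed earlier in the thread, it was presumably along these lines; the parent directory and the sitemap URL are placeholder assumptions:

```text
User-agent: *
Disallow: /wp-content/plugins/si-contact-form/captcha-secureimage/

# Per the sitemap plugin's FAQ, it will append its own line to this
# file rather than overwrite it, e.g.:
# Sitemap: https://example.com/sitemap.xml
```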
I thought about doing that, but then I saw this option in the settings (which was “on” by default):
( X ) Add sitemap URL to the virtual robots.txt file.
The virtual robots.txt generated by WordPress is used. A real robots.txt file must NOT exist in the blog directory!
I don’t know what a “virtual” robots.txt is, but I’m worried about creating a file on my own based on this warning about having a “real” file in the directory. Thoughts?
- The topic ‘SEO Tip for Fast Secure Contact Form:’ is closed to new replies.