Plugin Author
AITpro
(@aitpro)
It is better to tell Google and other search engines not to crawl or index the /search/ URI via your WordPress virtual robots.txt code:
http://forum.ait-pro.com/forums/topic/wordpress-robots-txt-wordpress-virtual-robots-txt/#post-6523
// WordPress virtual robots.txt additions
// $output is the virtual robots.txt content built by WordPress;
// $public reflects the "Search engine visibility" (blog_public) setting.
add_filter( 'robots_txt', 'v_robots', 10, 2 );
function v_robots( $output, $public ) {
	$output .= "Disallow: /wp-login.php" . "\n";
	$output .= "Disallow: /search/" . "\n";
	return $output;
}
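For reference, a public site would then serve something along these lines at /robots.txt (the exact default lines vary by WordPress version; the last two lines are the ones appended by the filter above):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
Disallow: /search/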
Thread Starter
mrppp
(@mrppp)
I mean
User-agent: *
Disallow: /
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /search/
Disallow: /wp-login.php
Plugin Author
AITpro
(@aitpro)
Both I and the WordPress folks recommend using the WordPress virtual robots.txt filter in your theme's functions.php file instead of a plain text robots.txt file, but yes, you can do that instead if you want to. I read the reasoning for preferring the virtual robots.txt function a long time ago, but I can't remember what it was.
Thread Starter
mrppp
(@mrppp)
Ok thanks, I'll ask the theme designer where I can place it. Thanks once again.
Thread Starter
mrppp
(@mrppp)
Just to correct my last post, in case someone ever uses it:
User-agent: *
Disallow:
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /search/
Disallow: /wp-login.php