robots.txt
-
There is something odd happening here with WordPress and robots.txt. First, the site is set to public viewing (http://www.dunnfit.com), and I have a robots.txt file in my public_html root.
What WordPress returns depends on my browser!
If I use Firefox, it returns the WP default:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

HOWEVER, if I use Internet Explorer, I get a 403 Forbidden! My root robots.txt is set to 644.
In neither case is WordPress honoring my existing robots.txt.
While I am a novice at WordPress, I know my way around computers fairly well and have tried a few things based on what I read in the forums so far:
1. Disabled all plugins, except Revolution Slider, which simply refused to deactivate. (I disabled them all as a group.)
2. Switched to one of the stock themes
3. Renamed the plugins directory

No matter what I do, it won't honor my robots.txt, and I get either a 403 Forbidden or the default one, depending on my browser.
I tried copying functions.php into a child theme to customize it there, but MySQL is complaining about a date declaration, so even that is a less-than-simple fix.
-
This is the code from functions.php:
function do_robots() {
	header( 'Content-Type: text/plain; charset=utf-8' );

	/**
	 * Fires when displaying the robots.txt file.
	 *
	 * @since 2.1.0
	 */
	do_action( 'do_robotstxt' );

	$output = "User-agent: *\n";
	$public = get_option( 'blog_public' );
	if ( '0' == $public ) {
		$output .= "Disallow: /\n";
	} else {
		$site_url = parse_url( site_url() );
		$path     = ( ! empty( $site_url['path'] ) ) ? $site_url['path'] : '';
		$output  .= "Disallow: $path/wp-admin/\n";
		$output  .= "Disallow: $path/wp-includes/\n";
	}

	/**
	 * Filter the robots.txt output.
	 *
	 * @since 3.0.0
	 *
	 * @param string $output Robots.txt output.
	 * @param bool   $public Whether the site is considered "public".
	 */
	echo apply_filters( 'robots_txt', $output, $public );
}
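Note the `apply_filters( 'robots_txt', ... )` call at the end: rather than copying this core function into a child theme, the usual approach is to hook that filter from the child theme's functions.php. Here is a minimal sketch; the Disallow and Sitemap lines below are placeholder examples, so substitute the rules from your own robots.txt file:

```php
// In the child theme's functions.php — customize the virtual robots.txt
// via the 'robots_txt' filter instead of editing core's do_robots().
add_filter( 'robots_txt', function ( $output, $public ) {
	if ( '0' === $public ) {
		// Site is set to discourage search engines; keep the "Disallow: /" output.
		return $output;
	}
	// Example rules — replace with your own directives.
	$output .= "Disallow: /wp-login.php\n";
	$output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";
	return $output;
}, 10, 2 );
```

One caveat: this filter only affects the robots.txt that WordPress generates itself. With the default rewrite rules, a physical robots.txt in the document root should be served directly by the web server before WordPress ever runs, so if you are seeing WordPress's output (or a 403) despite the file being there, the server's rewrite or security rules are likely intercepting the request.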