I have installed the Yoast SEO plugin on a client's website, and it has proved very useful for targeting page titles, descriptions and so forth. We haven't used the plugin for anything else, because it's a little beyond our comprehension - we are not developers.
However, my client's website has fallen foul of Googlebot as of 14th June this year, so I configured Yoast to create a robots.txt file.
Not being a developer, I'm only guessing at how to edit it, and I assume one has to remove the leading # for a line to take effect. Is that right? Here is the file as generated:
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Disallow: /wp-trackback
    # Disallow: /wp-feed
    # Disallow: /wp-comments
    Disallow: /wp-content/plugins
    Disallow: /wp-content/themes
    Disallow: /wp-login.php
    # Disallow: /wp-register.php
    # Disallow: /feed
    # Disallow: /trackback
    # Disallow: /cgi-bin
    # Disallow: /comments
    # Disallow: *?s=
    Sitemap: http://www.helenburrell.co.uk/sitemap.xml
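Just to check my understanding of the commenting: as I read it, everything after a # on a line is ignored by crawlers, so enabling, say, the /feed rule would mean changing

    # Disallow: /feed

to

    Disallow: /feed

Please correct me if I've got that backwards.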
I've also enabled Yoast's creation of sitemap.xml and disabled another plugin that was doing the same job, so the two wouldn't conflict.
My client's website is still not being indexed, despite these files being created.
Are these dynamically created files? Google says, 'If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run.'
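For what it's worth, my understanding is that WordPress normally serves robots.txt dynamically (via its rewrite rules) rather than as a physical file, unless a real robots.txt already exists in the web root. Below is a minimal sketch of the check I'm planning to run to confirm the two URLs actually respond - the base URL is my client's site as above, and the script itself is just illustrative Python, not anything from Yoast:

    import urllib.request
    import urllib.error

    # Fetch robots.txt and sitemap.xml and report the HTTP status,
    # so we can tell whether WordPress/Yoast is actually serving them.
    BASE = "http://www.helenburrell.co.uk"

    for path in ("/robots.txt", "/sitemap.xml"):
        url = BASE + path
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
                print(f"{url} -> HTTP {resp.status}")
                # Show the first few lines so an obvious problem (e.g. an
                # HTML error page instead of directives) stands out.
                print("\n".join(body.splitlines()[:5]))
        except urllib.error.HTTPError as e:
            print(f"{url} -> HTTP {e.code} ({e.reason})")
        except urllib.error.URLError as e:
            print(f"{url} -> request failed: {e.reason}")

If both come back 200 with sensible contents, I'd take that to mean Yoast is doing its job and the problem lies elsewhere; a 404 or 500 would point at the server instead.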
The logs on the web server and in cPanel make no sense to me at all, so I'm going to have to contact the web host's tech support. They are, of course, in the States, not the UK, and the online login credentials don't work, so I just hope they phone back.
I really need to know whether WordPress SEO has been configured correctly before I start lambasting the web host for server errors.
We are using WordPress SEO 1.2.5.