Hello @oraev
Thanks for reaching out regarding the robots.txt file of your multisite installation.
I'm not entirely clear on the error you reported. Do you mean that the entry above appears multiple times in the robots.txt file?
You can edit the file in the File Editor – WordPress > Yoast SEO > Tools as explained in this help article: https://yoast.com/help/how-to-edit-robots-txt-through-yoast-seo/
Do let us know how it goes.
Thread Starter
oraev
(@oraev)
Hello @maybellyne
The entry appears only once at the end of the robots.txt file, but it duplicates existing entries. Here is the error message for the file: “Found multiple rules like ‘User-agent: *’. This may cause the site to be crawled and indexed incorrectly.”
I can’t edit robots.txt as described in your link because there is no “File Editor” link in the plugin menu. I edited the wp-config.php file and removed the line

define( 'DISALLOW_FILE_EDIT', true );

but the file editor still did not appear in the plugin.
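For reference, this is roughly what the relevant wp-config.php line looks like. A sketch, assuming a standard WordPress setup: removing the line entirely or setting the constant to false should re-enable the dashboard file editors, unless a hosting-level restriction or another must-use plugin overrides it.

```php
<?php
// In wp-config.php, above the "That's all, stop editing!" line.
// true disables the built-in plugin/theme/file editors in wp-admin;
// setting it to false (or deleting the line) re-enables them,
// unless the host enforces its own restriction elsewhere.
define( 'DISALLOW_FILE_EDIT', false );
```

Note that on multisite, some hosts also set DISALLOW_FILE_MODS or filesystem permissions that have the same visible effect, so the editor can stay hidden even with this constant removed.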
Why does the plugin make changes to the robots.txt file on its own, without user intervention? Now I have to struggle to delete these entries. I’ve been using the plugin for several years, and this kind of tampering with the robots.txt file has never happened before.
Hey @oraev,
Thank you so much for your reply.
If you are still unable to edit it through the dashboard after removing that line, that’s most likely due to a specific WordPress configuration; there may be permissions at the hosting level that prevent WordPress (and, in turn, Yoast SEO) from reading or writing the robots.txt file.
As a workaround, the quickest way to edit the file is directly through your hosting control panel, if your web host provides a file manager, or via SFTP.
Thread Starter
oraev
(@oraev)
I can’t edit the robots.txt file directly because it doesn’t physically exist on the hosting. As I wrote in the first post, I have a multisite WordPress installation, so the file for each subdomain is generated virtually by the Robots.txt Editor plugin. When I edit robots.txt from the admin panel, I don’t see any Yoast SEO entries there; they appear only when I open the file via its URL (domainname/robots.txt).

Yoast SEO must be modifying these virtual files, otherwise where did the entries it added to robots.txt for each subdomain come from? These entries did not exist before. I believe they appeared with the latest Yoast SEO updates, after which I started receiving error reports about robots.txt due to duplicate entries that the plugin added there AUTOMATICALLY. And I can’t fix it. This is a bug in your plugin.
Hi @oraev,
Thanks for following up. Since Yoast SEO v19.0, our plugin changes Disallow: /wp-admin/ to an empty Disallow: in the default robots.txt file. This follows our best-practice guide, which means relying on your robots.txt file as little as possible, as you can read in this article.
If you’re using a 3rd-party plugin that also modifies your virtual robots.txt, Yoast SEO will not be able to detect those modifications and will still add the rule you mentioned at the end of the robots.txt file.
That said, if you have defined stricter rules in the 3rd-party plugin, the rules output by Yoast SEO won’t have any effect, as they don’t block anything. Note that groups for the same user agent are combined, and the most restrictive matching rule wins.
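As an illustration with hypothetical rules (the /private/ path is an assumption, not taken from this thread), suppose a third-party robots.txt plugin and Yoast SEO each emit a group for the same user agent:

```
# Group written by a third-party robots.txt plugin (hypothetical path):
User-agent: *
Disallow: /private/

# Group appended later in the same file; an empty Disallow allows everything:
User-agent: *
Disallow:
```

Crawlers that follow the Robots Exclusion Protocol (RFC 9309) merge all groups for the same user agent before matching, so /private/ stays disallowed; the empty Disallow rule adds nothing and blocks nothing. Duplicate groups are therefore harmless to crawling, even if a validator flags them as a warning.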
By the way, can you please let us know what tool/service is giving that “error” for the robots.txt file?
This thread was marked resolved due to a lack of activity, but you’re always welcome to re-open the topic. Please read this post before opening a new request.
Thread Starter
oraev
(@oraev)
@monbauza, thanks for the clarification.
The error message about the robots.txt file comes from the Yandex Webmaster service, whose analytics I use.
Thanks for letting us know, @oraev! We confirm that you can safely ignore that message regarding your robots.txt.