Plugin Directory

Multisite Robots.txt Manager

A Simple Multisite Robots.txt Manager - Quickly and easily manage all robots.txt files on a WordPress Multisite Website Network.

View the Install Guide | Screenshots | Feedback

Understanding the Default Settings

When you first enter the plugin admin via the Network Admin, the robots.txt file shown is the default "network only" (network-wide) working copy. Modify this default robots.txt file, then click the "update network" button to replicate it to all Network Websites.

The Network Append Marker

The marker {APPEND_WEBSITE_ROBOTSTXT} within the Network robots.txt file is replaced by each Website's unique robots.txt rules. Use the marker in your customized Network robots.txt files to automatically append each Website's own rules whenever the Network is updated.
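For example, a customized Network robots.txt file carrying the marker might look like this (the disallowed path is illustrative):

```
User-agent: *
Disallow: /wp-admin/

{APPEND_WEBSITE_ROBOTSTXT}
```

On a Network update, each Website receives this file with the marker replaced by that Website's own robots.txt rules.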

Robots.txt Files within Directories

  • This plugin WILL render robots.txt files within directories; however,

  • Search Engine Spiders only read robots.txt files found within the root directory of a Website. Spiders do not read robots.txt files within directories: for example, domain.com/PATH-or-FOLDER/robots.txt is NOT a valid location.

  • From Google: "The robots.txt file must be in the top-level directory of the host. ... Crawlers will not check for robots.txt files in sub-directories."
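In other words, for a Website at example.com (a placeholder domain), crawlers consult only the root location:

```
https://example.com/robots.txt          <- read by crawlers
https://example.com/blog/robots.txt     <- ignored by crawlers
```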

Testing Robots.txt Files

  • Use Google's Webmaster Tools to validate your robots.txt files (with Google, at least):
  • Log into your Google Account and open the Webmaster Tools feature. Select a Website, or add a Website.

  • On the Webmaster Tools Home page, click the site you want.

  • Under Health, click Blocked URLs.
  • If it is not already selected, click the Test robots.txt tab.
  • Copy the content of your robots.txt file, and paste it into the first box.
  • In the URLs box, list the URLs you want to test against.
  • In the User-agents list, select the user-agents you want.
  • https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
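Google's tool checks rules against Google's crawlers specifically, but you can also sanity-check a robots.txt file locally before publishing it to the Network. A minimal sketch using Python's standard-library urllib.robotparser (the file contents and URLs here are illustrative, not part of the plugin):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; paste in your Network's file to test it.
rules = """
User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a generic crawler ("*") may fetch each URL.
print(parser.can_fetch("*", "https://example.com/wp-admin/"))   # blocked
print(parser.can_fetch("*", "https://example.com/blog/post/"))  # allowed
```

This only verifies the rule syntax and path matching; it does not tell you how a specific search engine interprets extensions such as wildcards, so Google's own tester remains the authoritative check for Google.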

New Website Added to Network

  • If all Websites use the saved Network default robots.txt file, click the "update network" button to copy the default robots.txt file over to any new Websites.
  • Per Site: select the new Website from the dropdown, then click the "update website rules" button to copy the default robots.txt file to that Website.


  • Disable a Website: To disable the Multisite Robots.txt Manager on a single Website, select the Website from the dropdown menu, then click the "change sites" button. With the Website's robots.txt file open, click the "disable this website" button. This clears the robots.txt file and sitemap structure settings for that Website only, so the default WordPress robots.txt file is displayed instead.
  • Disable across the Network: Select all of the default robots.txt file text within the Text Area, press Delete on your keyboard, then click the "publish to network" button. You cannot save a blank default robots.txt file, but you can publish one, which disables the robots.txt file option for every Website within the Network.


  • Reset Default: Something wrong? No worries! When viewing the Network's robots.txt file, click the "reset to default" button to replace the displayed robots.txt file with the core "coded in" default robots.txt file.
  • Reset Website: To reset a Website's robots.txt file, switch to the Website within the dropdown, then click the "reset this website" button to pull in the Network's default robots.txt file (not the coded-in default file).

Presets / Examples Tab

  • This feature lets you quickly copy a premade robots.txt file and a sitemap structure URL to either the default network-wide robots.txt file or a selected Website's robots.txt file.
  • To use: Select the Network or a Website from the dropdown. Check the box to add a sitemap structure, then modify or enter a Sitemap Structure (not required). Finally, click the "set as default" button above the robots.txt file example you want to use.

  • Presets can also use the Sitemap URL Structure setting. Read above on how to use this feature.
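Combined with the sitemap option, a preset typically produces a file like the following (the URL and path are illustrative):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap.xml
```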

Recommended Sitemap Plugins

For "real" Multisite HOST Networks, use the WordPress plugin: BWP Google XML Sitemaps - This plugin will list each Websites Sitemap URL's in the Root Network Website's robots.txt file.

Requires: 3.8 or higher
Compatible up to: 4.6.0
Last Updated: 5 months ago
Active Installs: 300+

