
Multisite Robots.txt Manager
[resolved] Domain mapped multisite - no way to get distinct robots.txt? (18 posts)

  1. Petruha
    Member
    Posted 2 years ago #

    Hi!
    I have a domain mapped multisite. Main site is: myelectrons.com

    Second site in the network is domain mapped to myelectrons.ru

    Unfortunately I cannot provision a different robots.txt for each of my two sites. Please see the dashboard screen capture: http://myelectrons.com/?attachment_id=196

    Please tell me that I am missing something ;)
    Any help would be greatly appreciated! I am fine to assist as much as needed.

    Cheers,
    - Serge.

    http://wordpress.org/extend/plugins/multisite-robotstxt-manager/

  2. Petruha
    Member
    Posted 2 years ago #

    That's all folks. I give up on multisite: there are too many little hiccups with different plugins and themes. All that hassle does not outweigh the convenience of the language switcher.
    Thanks and best regards,
    - Serge.

  3. networkstudio
    Member
    Posted 2 years ago #

    We have the same issue.

    We're using the same "domain" (FQDN) for several blogs; what differs is the "path" (http://domain/path).
    In the current dropdown we only see the domain, not the full URL (domain+path), so it's not possible to choose the blog we want to work on.

    Could you change the dropdown to display the full URL of each site in the multisite?

  4. tribalNerd
    Member
    Plugin Author

    Posted 2 years ago #

    Hello Petruha & networkstudio,
    The plugin has been updated to version 0.2.0, which solves the issue with the site names in the dropdown list.

    http://wordpress.org/extend/plugins/multisite-robotstxt-manager/

    http://downloads.wordpress.org/plugin/multisite-robotstxt-manager.0.2.0.zip

    ~Chris

  5. Petruha
    Member
    Posted 2 years ago #

    Hi Chris,
    thanks a bunch for the quick fix!

    I am sorry for not being able to test it right now, since I migrated my current projects from multisite to separate WP installs (due to several other issues I had with it). I am sure the fix will simplify life for many other good people ;)

    Since I opened this thread and the initial issue should be fixed by now, I'll mark it as "resolved".
    Cheers,
    - Serge.

  6. networkstudio
    Member
    Posted 2 years ago #

    Sorry but it's not solved.

    In the drop-down list I now have "Network Wide" as the first item, then the blog name of the default website (shown as "(1) Blogname") as the second item.

    I cannot see the other blogs of the multisite.

  7. tribalNerd
    Member
    Plugin Author

    Posted 2 years ago #

    Hello networkstudio,
    Are you able to access the other blogs within the Network?

    The plugin uses a built-in WordPress feature to gather the site list, based on the sites your user is authorized to manage. Likewise, it will only update the websites your user is authorized to manage.

    If you're not able to access the other blogs, then the plugin wouldn't be able to access them either.

    To change this, go to Network Admin > Sites > All Sites, click the Edit link for a website, then open the Users tab. Add your user to the website with the Administrator role. This will add the website(s) to the drop-down list.
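    For reference, the lookup is roughly this (a sketch of the built-in WordPress call, not the plugin's exact code):

        <?php
        // Sites the current user is a member of; anything the user cannot
        // access is simply not returned, so it can't show up in a dropdown.
        $blogs = get_blogs_of_user( get_current_user_id() );

        foreach ( $blogs as $blog ) {
            // Each entry carries the blog name plus its domain and path.
            echo esc_html( $blog->blogname . ' (' . $blog->domain . $blog->path . ')' ) . "\n";
        }

    (get_blogs_of_user() returns every site the user belongs to; any role check on top of that, such as requiring the Admin role, is up to the code doing the listing.)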

    ~Chris

  8. networkstudio
    Member
    Posted 2 years ago #

    You're right, this is it!

    I (wrongly) assumed that the default admin user was a "global" admin for all sites, when in fact it is not.

    Now testing more deeply...

  9. networkstudio
    Member
    Posted 2 years ago #

    I can change the robots.txt per blog (as per the admin settings page).

    However, I cannot see it when requesting its URL, either as http://www.network-url.tld/blogname-url or as http://www.blogname-url.tld (when using the "WordPress MU Domain Mapping" plugin).

    Is there anything to put in the .htaccess file (or an equivalent regexp for nginx)?

  10. tribalNerd
    Member
    Plugin Author

    Posted 2 years ago #

    Hello again networkstudio,
    The plugin was created on a network using the WP Mu Domain Map plugin. Non-mapped sites that are still in a directory won't get a robots.txt file (spiders don't read them there either way); all mapped domains should.

    The plugin uses built-in WP features to display the robots.txt file; the same calls WP uses directly, and that other plugins use as well.
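    Roughly, WordPress builds the virtual /robots.txt itself in do_robots() and passes the result through the 'robots_txt' filter, so a typical robots.txt plugin hooks that filter, something like this sketch (not this plugin's exact code; 'my_robots_txt_rules' is just an illustrative option name):

        <?php
        // Return custom rules for the virtual robots.txt; fall back to
        // whatever WordPress (or an earlier filter) already produced.
        add_filter( 'robots_txt', function ( $output, $public ) {
            $custom = get_option( 'my_robots_txt_rules' ); // illustrative option name
            return $custom ? $custom : $output;
        }, 10, 2 );

    Note this only applies when there is no physical robots.txt file in the web root; a real file is served by the web server before WordPress ever runs.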

    Speaking of other plugins, make sure you don't have another robots.txt plugin running, even a non-multisite version. If you don't, have you tried a different plugin to see if the issue goes away? If so.... which plugin?

    ~Chris

  11. networkstudio
    Member
    Posted 2 years ago #

    Hi

    We're using "WordPress MU Domain Mapping", I guess it's the same than yours (could not find "Mu domain map plugin" as a plugin).
    No other plugin to handle the robots.txt file(s).

    Non-mapped sites don't get robots.txt, ok for this.
    However, the mapped site we tried it on is not getting its robots.txt, we're getting a 404 error.

    We're not using apache (and .htaccess) but nginx, so this might be related (is_robots not correctly handled?).
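    One quick check we can run is this little probe (a sketch; it only logs whether the request ever reaches WordPress):

        <?php
        /**
         * Plugin Name: Robots.txt request probe (debug sketch)
         * If this line never shows up in the PHP error log when we request
         * /robots.txt, nginx is returning the 404 before the request ever
         * reaches WordPress, so no robots.txt plugin can respond until the
         * rewrite rules pass /robots.txt through to index.php.
         */
        add_filter( 'robots_txt', function ( $output, $public ) {
            error_log( 'robots_txt filter fired for ' . home_url( '/robots.txt' ) );
            return $output;
        }, 10, 2 );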

  12. phuynh
    Member
    Posted 1 year ago #

    Chris, could you direct me to a thread that might help me resolve my problem? I'm also using multisite with domain mapping. I can't make any changes to the hard-coded robots.txt. It looks like I can, but that is misleading: when I click "view robots.txt" I see nothing but the default version, not my changed version.

  13. tribalNerd
    Member
    Plugin Author

    Posted 1 year ago #

    Hello phuynh,
    The website(s) in question have to be mapped to a domain; if a website is still in a directory, the robots.txt file will not update.

    ~Chris

  14. phuynh
    Member
    Posted 1 year ago #

    Chris, thanks for the reply. The website was mapped to a domain. Are there any known conflicts with other plugins?
    Peter

  15. tribalNerd
    Member
    Plugin Author

    Posted 1 year ago #

    Hello again phuynh,
    I haven't heard of any conflicts, outside of other robots.txt plugins. It is possible that a sitemap plugin could be conflicting, though I have tested most of the major sitemap plugins. You could of course disable plugins one by one to see if a conflict is the cause. Again, I haven't heard of any such conflicts, but anything is possible.

    When you publish to network, are any sites updating? How about when you change to a Website from the drop down and update the site directly? When you save your default robots.txt file, does that work correctly?

    Also, try this... Network Admin > Sites > Edit a mapped site > click the Settings tab. Then in your browser do a find for the word "robots" and see if any other robots.txt settings have been set and not removed by another plugin. You should only see: MS Robots.txt. However, if another plugin named its setting incorrectly (like is_robots) or used a name WordPress itself uses, that may create a conflict.
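    If digging through the settings screen is tedious, something like this (a quick sketch; $blog_id is the ID of the mapped site you are checking) lists every option on that site whose name or value mentions "robots", which makes leftovers from old robots.txt plugins easy to spot:

        <?php
        // List options on one site that mention "robots" anywhere in the
        // option name or value; run from a network-activated snippet.
        global $wpdb;

        $blog_id = 2; // ID of the mapped site you are checking (example value)
        switch_to_blog( $blog_id );

        $rows = $wpdb->get_results(
            "SELECT option_name FROM {$wpdb->options}
             WHERE option_name LIKE '%robots%' OR option_value LIKE '%robots%'"
        );
        foreach ( $rows as $row ) {
            error_log( 'robots-related option: ' . $row->option_name );
        }

        restore_current_blog();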

    Outside of this, what I can debug via a support ticket is a bit limited. If you would like me to look into the issue on-site, I would need access to your WordPress install, which I'm more than willing to do, but I can understand why you might not want that.

    ~Chris

  16. phuynh
    Member
    Posted 1 year ago #

    Chris,

    The problem was due to a conflicting robots.txt. Before installing your plugin I had deactivated and removed Kb Robots.txt; however, following your direction above, the Kb Robots.txt setting still existed in the site settings. I checked everywhere, files on the server and tables in the database, but still couldn't find anything clearly linked to Kb Robots.txt. In the end, I had to manually delete its content, leaving the Kb Robots.txt entry empty.

    Second, I had a sitemap feed plugin stored in the mu-plugins folder that affected the function of your plugin. When I removed it, along with a few other conflicting sitemap plugins, your plugin was able to function as described.

    I did have one minor problem: I'm using this plugin with the BWP GXS plugin. If "Add sitemapindex to individual blog's virtual robots.txt?" is checked, it wipes out any contents in the default robots.txt file.
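    Presumably both plugins hook the same 'robots_txt' filter, and if a later callback returns only its own rules instead of appending, whatever ran before it is discarded. Roughly (an illustration of the mechanism, not either plugin's actual code, and the sitemap URL is made up):

        <?php
        // Two callbacks on the same 'robots_txt' filter. The first appends,
        // so earlier output survives; the second replaces, so everything
        // produced before it is wiped out.
        add_filter( 'robots_txt', function ( $output, $public ) {
            return $output . "\nSitemap: " . home_url( '/sitemapindex.xml' ); // appends (illustrative URL)
        }, 10, 2 );

        add_filter( 'robots_txt', function ( $output, $public ) {
            return "User-agent: *\nDisallow:\n"; // replaces: wipes earlier rules
        }, 20, 2 );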

    Many thanks for your help, it works wonderfully.

  17. kagnar
    Member
    Posted 1 year ago #

    Serge,

    Multisite has a lot of positive and negative aspects; I really like using the domain mapping for development.

    But there is a better solution that has all the positives: the ManageWP WordPress dashboard. Check it out: you get 5 free sites and it brings them all into one dashboard to manage... and it's pretty cheap to upgrade, IMO.

    Chris,
    I noticed a problem when I created a robots file for a site in a sub-directory and then mapped it to its new domain. When I tried updating from the dashboard, I got the 404 that everyone is discussing. Once I reset the file from Network Admin it worked!

    Hope this helps

    - Kagen

  18. tribalNerd
    Member
    Plugin Author

    Posted 1 year ago #

    Hello phuynh,
    Thanks for the update.... let me know if you have any other issues.

    Hello kagnar,
    The recent version 0.3.1 should resolve the 404 issue. Let me know if you're still having issues.

    ~Chris

Topic Closed

This topic has been closed to new replies.
