I’m working with WordPress Multisite and have verified that the primary blog is set to allow crawlers in the privacy settings. Unfortunately, the generated robots.txt file still shows Disallow for all the sites. Any ideas why this would be the case and how to fix it?
WordPress doesn’t make a robots.txt file on its own. Edit it manually.
The thing is that I don’t have a manual robots.txt file. It’s definitely being generated by WordPress, with lines added by the Google Sitemap XML for Multisite plugin. I deactivated all plugins, but the file is still generated.
When I first set up the multisite, I had it set to private. I don’t recall ever changing this, but it’s now set to public and the generated robots.txt still disallows everything.
That’s contradictory. Either you have a file called robots.txt or you don’t. If you do, you edit it. If you don’t, then perhaps you’re referring to something else, like metadata?
Go look at your site via FTP. Is there a robots.txt file in there? Your first post says there is, your next post says it was generated by a plugin.
You can still edit it. Manually.
Hi, funnily enough I’m having the same problem: I’ve just set a site to ‘search engines allowed’ and uploaded a robots.txt to domain/wp-content/themes/theme, but when I go to domain.co.uk/robots.txt, my uploaded file isn’t there?
Because you uploaded it to the theme folder, not the root.
and uploaded a robots.txt to domain/wp-content/themes/theme
Upload it to /domain/
There isn’t a physical robots.txt file in the WordPress root, and it isn’t being generated by a plugin. It’s the virtual robots.txt that WordPress generates when your blog is set to private. Unfortunately, my blog is now set to public but is still generating the wrong robots file.
I’ve deactivated all plugins and confirmed that the file is still being created, which leads me to believe I must have done something wrong with the multisite install. I can’t create my own robots.txt to override it, because I depend on the Google XML Sitemaps plugin, which requires WordPress to generate the robots file. Any ideas?
Sorry, I can see the confusion in my first post. By “generated robots file” I meant the one created on the fly by WordPress. There is no physical robots.txt in my root directory.
Can you get to it at domain.com/robots.txt ? If indeed it IS virtual, it should still show up.
I’m really skeptical that it’s actually a robots.txt file, virtual or otherwise, and strongly suspect you mean something else but are using the familiar term. If there’s no physical file, how do you know it’s there? You understand why I’m a skeptic here? I have manually created robots.txt files for my servers, and I’ve used XML sitemap plugins before by simply adding the XML info to the file.
Now. How are you determining that your robots.txt file is there?
And if you give us the URL of the site, we can solve it in maybe five minutes.
Yes, it is available at http://brandedlocal.com/robots.txt.
You can tell it’s generated by WordPress and not a physical file because of the Sitemap line. Every blog in the network receives its own robots file with its own Sitemap line; that wouldn’t be the case with a static file.
Okay, that is most likely created by a plugin (I’d put money on your sitemap plugin), not WordPress. WordPress doesn’t make one, and it sure as heck doesn’t make one with the sitemap bit at the end.
The Disallow line is blank, and IIRC it has to be “Disallow: /” to actually disallow all.
A blank Disallow means allow all, basically. Read this: http://www.robotstxt.org/robotstxt.html
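For reference, the two forms look like this (a generic illustration, not the poster’s actual file):

```
# Blank Disallow: all crawlers may index everything
User-agent: *
Disallow:

# Disallow with a slash: all crawlers are blocked from the whole site
User-agent: *
Disallow: /
```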
WordPress does generate robots.txt, virtually, and any plugin can hook into it to add its own rules via the do_robotstxt action, which fires inside do_robots(). do_robots() is located in wp-includes/functions.php.
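As a rough sketch of how a plugin adds lines to that virtual file: do_robots() also passes its output through the core robots_txt filter, so a plugin (or a theme’s functions.php) can append a Sitemap line like this. The sitemap path below is just an illustrative placeholder, not a real file on the poster’s site:

```php
<?php
// Sketch: append a line to WordPress's virtual robots.txt output.
// 'robots_txt' is the core filter applied inside do_robots();
// $public reflects the blog's privacy/visibility setting.
add_filter( 'robots_txt', function ( $output, $public ) {
    if ( $public ) {
        // Placeholder sitemap URL for illustration only.
        $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";
    }
    return $output;
}, 10, 2 );
```

Since each blog in the network runs this with its own home_url(), every site gets its own Sitemap line, which matches the behavior described above.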
If you actually have a physical file there, WordPress doesn’t generate the virtual robots.txt; the real file takes precedence.
- The topic ‘robots.txt set to disallow, can't change’ is closed to new replies.