Viewing 8 replies - 1 through 8 (of 8 total)
  • (@lebaux)

    Hi Marc, have you physically created and uploaded a robots.txt file? This is not something TSF does automatically (unless something has changed). Could you share the URL where the file should be? Have you tried disabling other plugins, especially any security or protection ones? Have you set the right permissions on robots.txt via FTP, for example 777 (full access) just for testing?

    This problem seems unrelated to the plugin and is rather generic, so please also search Google for “404 error robots.txt wordpress”; you might find your answer even faster that way 🙂

    Thread Starter Marc

    (@arbolife)

    Hi @lebaux,

    Thanks for your answer. I didn’t know I had to create robots.txt myself; I thought it would be generated automatically. I first created an empty file with 777 permissions, but it only displayed an empty page, so nothing is being generated dynamically. For now I’ve entered content manually and it works, but I don’t think that’s how it’s supposed to work with WordPress and The SEO Framework.

    Here’s the address: https://www.arbolife.com/robots.txt

    Thanks,
    Marc

    Plugin Author Sybre Waaijer

    (@cybr)

    Hi guys,

    The SEO Framework does handle robots.txt, but it doesn’t initiate its output.
    WordPress tries to output a virtual robots.txt file, which TSF then takes over through filters.
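    Roughly, the takeover works like this. The sketch below is simplified and not TSF’s actual code: the `robots_txt` filter and its two arguments are WordPress core’s real hook, but the output lines are purely illustrative.

```php
<?php
// Simplified sketch of how a plugin can take over WordPress's virtual
// robots.txt. Core only serves this when no physical robots.txt file
// exists and the /robots.txt rewrite rule resolves correctly.
add_filter( 'robots_txt', function ( $output, $is_public ) {
	if ( ! $is_public ) {
		// Site discourages search engines; leave core's output alone.
		return $output;
	}
	// Illustrative directives, not TSF's exact output.
	$output  = "User-agent: *\n";
	$output .= "Disallow: /wp-admin/\n";
	$output .= "Allow: /wp-admin/admin-ajax.php\n";
	return $output;
}, 10, 2 );
```

    If the rewrite rule for /robots.txt is missing, WordPress never reaches this filter at all, which is why you see a plain 404 instead of the virtual file.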

    When the robots.txt file can’t be output, there is likely an issue with the permalinks. You can temporarily resolve this by going to your permalink settings and simply re-saving them.

    For a permanent resolution, I’m afraid you’d have to dig very deep into the code of all active plugins and your theme.

    Cheers!

    Thread Starter Marc

    (@arbolife)

    Hi Sybre,

    Re-saving the permalinks was the first thing I did, but it didn’t solve the problem; I still got a 404. I’ve now created the file manually with the content from your sample. If I remove it, the 404 error comes back.

    What should I do now?

    Thanks,
    Marc

    Plugin Author Sybre Waaijer

    (@cybr)

    Hi Marc,

    To reiterate, I’m afraid there’s a theme or plugin causing an issue.

    To find the cause, deactivate your plugins one by one, re-saving the permalinks after each, until robots.txt works again.

    Alternatively, if you’re willing to dive into the code, there is probably a plugin active that calls the function flush_rewrite_rules() more often than necessary.
    When that happens, all other rewrite rules are wiped before they can be registered, so, for instance, the rewrite rule for the robots.txt location never gets saved to the database.
    For more information, see my comments here: https://github.com/sybrew/the-seo-framework/issues/152#issuecomment-306053275
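    For illustration only (this is not taken from any of the plugins mentioned in this thread, and the rule itself is made up), flush_rewrite_rules() is expensive and should normally run only on plugin activation or deactivation, after the custom rules have been registered:

```php
<?php
// Sketch of the correct pattern: flush rewrite rules only on
// (de)activation, never on every request. The rewrite rule here
// is a hypothetical example.

// Register the rule on every request; registering is cheap.
add_action( 'init', function () {
	add_rewrite_rule( '^my-page/?$', 'index.php?pagename=my-page', 'top' );
} );

register_activation_hook( __FILE__, function () {
	// Ensure our rule exists before flushing, then rebuild the rules once.
	add_rewrite_rule( '^my-page/?$', 'index.php?pagename=my-page', 'top' );
	flush_rewrite_rules();
} );

register_deactivation_hook( __FILE__, 'flush_rewrite_rules' );

// Anti-pattern: calling flush_rewrite_rules() inside 'init' or another
// hook that fires on every page load. That rebuilds the rules constantly
// and can drop rules other plugins (or core, e.g. robots.txt) register.
```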

    Nevertheless, the manual file you’ve created is good. Its contents are unlikely to need changing anytime soon, so you’ll be fine with it for a long time.
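    For anyone reading along, a manual robots.txt that mirrors what WordPress would normally serve virtually looks roughly like this (illustrative defaults, not necessarily the exact sample Marc used):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```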

    Thread Starter Marc

    (@arbolife)

    Hi Sybre,

    I searched my code for flush_rewrite_rules() and found some plugins using it, but they’re all reputable plugins (WooCommerce, BuddyPress, GeoDirectory), so there’s no obvious culprit to start an investigation with. I’ll stick with the manual file. Thanks for your answers and support.

    Best,
    Marc

    Plugin Author Sybre Waaijer

    (@cybr)

    Hi Marc,

    That’s odd. As @lebaux said, are there any security plugins or .htaccess file tweaks that might interfere?

    Since a workaround is already in place, feel free to let this rest; just let me know! It will save you (and me) some time, and we can always pick this subject up again later whenever you feel like it. 🙂

    Thread Starter Marc

    (@arbolife)

    Hi Sybre,

    No, nothing weird in .htaccess. Let’s drop this topic for now as you and I probably have better things to do than troubleshoot this non-critical issue. I’m happy with a working manual version of the file.

    Cheers,
    Marc

  • The topic ‘404 on robots.txt’ is closed to new replies.