Support » Plugin: Elevate SEO » Robots.txt

  • Resolved (topic started by Radices)


    Robots.txt is not being created. Neither the Enhanced setting nor the template setting seems to do anything?

Viewing 10 replies - 1 through 10 (of 10 total)
  • Plugin Author Duane Storey



    Can you possibly share your website URL with me? What plugins do you have installed?


    Tried with no plugins active, but I run 35 plugins in production on the gt site. Changed the theme to Twenty Nineteen.
    The .htaccess file has been changed, as has wp-config.

    Also tried turning on the debug log. It doesn’t appear to be working either; I couldn’t find the file at first, and it’s not clear where it’s supposed to live. Eventually found it in the plugins folder. It would be nice if that info were in the help bubble.

    Manually created a robots.txt file to see if it would get written to, and it does not change at all upon saving options.

    PHP-FPM 7.3

    • This reply was modified 1 year, 9 months ago by Radices.
    Plugin Author Duane Storey



    It’s not a manual file; it’s generated on demand from within WordPress. Any request for a file that doesn’t exist ends up being processed by WordPress.
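    For context, the mechanism works like this: when /robots.txt doesn’t exist on disk, WordPress handles the request itself via do_robots(), which runs the robots_txt filter before printing the output. A plugin only has to hook that filter. A minimal sketch (this is illustrative, not Elevate’s actual code, and the extra Disallow rule is made up):

    ```php
    <?php
    // Sketch of hooking WordPress's virtual robots.txt.
    // do_robots() applies the 'robots_txt' filter to the generated output,
    // so no physical file is ever written.
    add_filter( 'robots_txt', function ( $output, $is_public ) {
        if ( $is_public ) {
            $output .= "Disallow: /example-private/\n"; // hypothetical rule for illustration
        }
        return $output;
    }, 10, 2 );
    ```

    This only works if the web server actually passes the request for the missing file through to WordPress’s index.php.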

    You can see it on my site –

    That file doesn’t exist, Elevate is just outputting it via a hook. So it’s strange that’s not working for you. Can you tell me a bit more about your hosting environment? Where is it hosted? Is it Apache or Nginx?


    One of the sites mentioned seems to be using Nginx behind Varnish. That’s a fairly complex setup, but it should still be possible to fetch robots.txt dynamically from WordPress.

    CentOS 7
    nginx > varnish > apache > php-fpm 7.3

    gtmanagement using WordPress 5.0.3
    jethosted using ClassicPress RC1

    So it looks like a permissions issue when hitting robots.txt from the backend.

    Plugin Author Duane Storey


    Are there any PHP errors being generated on the site?

    When I try now I get:

    You don’t have permission to access \@backend on this server.
    Additionally, a 400 Bad Request error was encountered while trying to use an ErrorDocument to handle the request.

    So I’m not sure if it’s because WordPress is throwing an error for robots.txt, or whether the Varnish config isn’t allowing the request through. I’m happy to debug this, just not sure where to start, since robots.txt works fine on all my normal WordPress test installations.

    I can’t find anything definitive in the logs. It looks like the requested URL is \@backend, though I can’t tell for certain whether that comes from nginx or WordPress. It appears to come from nginx, and may be SSL related.

    I can create a robots.txt in 30 seconds and it works fine, so I’m fighting the urge to jump down this rabbit hole 🙂 Thanks for showing me the hook; I was unaware of it.

    [root@srv1 ~]# varnishlog
    *   << BeReq    >> 71
    -   Begin          bereq 70 fetch
    -   Timestamp      Start: 1550778704.265987 0.000000 0.000000
    -   BereqMethod    GET
    -   BereqURL       \@backend
    -   BereqProtocol  HTTP/1.0
    -   BereqHeader    Host:

    OK, solved it … well, found the answer here:

    Added this code to the nginx .conf file for the domain (in /etc/nginx/conf.d/vhosts):

    # robots.txt fallback to index.php
    location = /robots.txt {
        # Some WordPress plugins generate the robots.txt file dynamically
        allow all;
        try_files $uri $uri/ /index.php?$args @robots;
        access_log off;
        log_not_found off;
    }

    # additional fallback if robots.txt doesn't exist
    # (named locations must sit at server level in nginx, not nested inside another location)
    location @robots {
        return 200 "User-agent: *\nDisallow: /wp-admin/\nAllow: /wp-admin/admin-ajax.php\n";
    }
    Plugin Author Duane Storey


    Great! Glad you found the solution – thanks for posting it for others.


    Just to be clear: that config ends up falling through to the @robots location and outputs those two lines whether I have a robots.txt file or not. It prevents the 400 error, but the plugin’s functionality (and WordPress’s own) is lost as far as robots.txt is concerned.
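    For anyone landing on this thread later: an untested variant that should let WordPress (and the plugin) answer first is to make index.php the final try_files fallback instead of the static @robots response. A sketch, assuming the vhost already routes /index.php to PHP-FPM:

    ```nginx
    # Sketch (untested): serve a physical robots.txt if one exists,
    # otherwise hand the request to WordPress so plugin hooks can run.
    location = /robots.txt {
        allow all;
        access_log off;
        log_not_found off;
        try_files $uri /index.php?$args;
    }
    ```

    Whether this avoids the \@backend 400 depends on the Varnish/backend wiring described above, so it may still need a static fallback in some setups.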

  • The topic ‘Robots.txt’ is closed to new replies.