• Resolved vicksterm (@vicksterm)


    I ticked the “Discourage search engines from indexing this site” box under Settings > Reading during development. Today we are ready to go live, so I unticked the box, but robots.txt still shows the disallow-all rule. I read online that uploading your own robots.txt will override the virtual file WordPress generates, so I uploaded one to the site root that reads:

    User-agent: *
    Disallow:

    yet the URL http://arteriors.wpengine.com/robots.txt still shows

    User-agent: *
    Disallow: /

    I cleared my browser cache and purged the WPEngine caches, and it still won’t change. The only plugin I can see that might interfere is Yoast SEO, but it seems to show the uploaded file, which allows all crawling.

    Any ideas what’s going on here and how I can get the file to update?
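    A quick way to rule out caching is to request the file directly with a script and print exactly what the server returns. This is a minimal Python sketch using only the standard library, assuming the URL quoted above; the nocache query string is only an assumption about how intermediate caches key requests, added to force a fresh fetch:

    import time
    import urllib.request

    URL = "http://arteriors.wpengine.com/robots.txt"  # URL from the post above

    # Throwaway query string so any intermediate cache treats this as a new
    # resource (assumes the cache keys on the full URL including the query).
    cache_buster = f"{URL}?nocache={int(time.time())}"

    req = urllib.request.Request(
        cache_buster,
        headers={"Cache-Control": "no-cache", "User-Agent": "robots-check/1.0"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        status = resp.status
        body = resp.read().decode("utf-8", errors="replace")

    # Print the status code and the robots.txt body the server actually served.
    print(f"HTTP {status}")
    print(body)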

  • Thread Starter vicksterm (@vicksterm)

    Found out that WPEngine enforces the disallow rule for all temporary URLs, but it serves the robots.txt file on the server once the site is live.
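
    A minimal sketch of how to confirm that behaviour: fetch robots.txt from both the temporary WPEngine URL and the live domain and see which one blocks crawling. The temporary URL is the one from the thread; the live-domain URL below is a placeholder, not taken from this thread.

    import urllib.request

    TEMPORARY_URL = "http://arteriors.wpengine.com/robots.txt"  # from the thread
    LIVE_URL = "https://www.example.com/robots.txt"  # placeholder for the live domain

    def fetch(url: str) -> str:
        """Return the robots.txt body served at the given URL."""
        req = urllib.request.Request(url, headers={"User-Agent": "robots-check/1.0"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")

    for label, url in (("temporary", TEMPORARY_URL), ("live", LIVE_URL)):
        body = fetch(url)
        # A bare "Disallow: /" line blocks all crawling; "Disallow:" with no
        # value (as in the uploaded file) allows everything.
        blocked = any(line.strip().lower() == "disallow: /" for line in body.splitlines())
        print(f"{label}: {'blocks all crawlers' if blocked else 'allows crawling'}")
        print(body.strip())
        print("-" * 40)

    If the behaviour described above holds, the temporary URL should report “blocks all crawlers” while the live domain shows the uploaded allow-all file.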

  • The topic ‘Robots.txt doesn't change after unticking Discourage search engines box’ is closed to new replies.