• Resolved rjrjrjrj

    (@rjrjrjrj)


    With this plugin installed and enabled, Lighthouse still reports that it was not able to download robots.txt.

Viewing 3 replies - 1 through 3 (of 3 total)
  • Plugin Author Marios Alexandrou

    (@marios-alexandrou)

    I don’t know what Lighthouse is. You can check the existence of a robots.txt file (virtual or physical) by going to the URL domain.com/robots.txt in your browser.

    Thread Starter rjrjrjrj

    (@rjrjrjrj)

    If you are in the web development business, you should learn what Lighthouse is.
    The problem here is not really related to any specific testing suite, but to the fact that even with your plugin enabled, robots.txt still returned a 404. After I disabled your plugin and created a physical file, it was served successfully.

    “Friendly” comment 🙂
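    The failing audit boils down to a plain HTTP check: request `/robots.txt` and see whether the server answers 200 or 404. A minimal Python sketch of that check, using a throwaway local test server as a stand-in for a real site (the server and its robots.txt content are hypothetical, not part of the plugin):

    ```python
    # Sketch of the check Lighthouse performs: fetch /robots.txt and see
    # whether the server answers HTTP 200 (file served) or 404 (missing).
    # The local server below is a hypothetical stand-in for a real site.
    import http.server
    import threading
    import urllib.error
    import urllib.request

    ROBOTS = b"User-agent: *\nDisallow: /private/\n"

    class Handler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/robots.txt":
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(ROBOTS)
            else:
                self.send_error(404)

        def log_message(self, *args):
            pass  # silence per-request logging

    def robots_txt_status(base_url):
        """Return the HTTP status code for <base_url>/robots.txt."""
        try:
            with urllib.request.urlopen(base_url + "/robots.txt") as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code

    # Start the stand-in server on an ephemeral port and run the check.
    server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    status = robots_txt_status(f"http://127.0.0.1:{server.server_address[1]}")
    print(status)  # 200 when robots.txt is served
    server.shutdown()
    ```

    A site where the virtual robots.txt works would return 200 here; the 404 Lighthouse reported means the request was never answered with a file.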

    Test your robots.txt with the robots.txt Tester 
    
    The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search. 
    https://support.google.com/webmasters/answer/6062598/
    
    You can submit a URL to the robots.txt Tester tool. The tool operates as Googlebot would to check your robots.txt file and verifies that your URL has been blocked properly.
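    For a local approximation of what the Tester does, Python's standard `urllib.robotparser` applies the same user-agent matching rules to a robots.txt file. The rules below are a hypothetical example blocking Googlebot-Image from a directory:

    ```python
    # Sketch of the robots.txt Tester's check: parse the rules, then ask
    # whether a given crawler may fetch a given URL. The rules and URLs
    # here are hypothetical examples.
    import urllib.robotparser

    rules = """\
    User-agent: Googlebot-Image
    Disallow: /photos/

    User-agent: *
    Allow: /
    """

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules.splitlines())

    # The image crawler is blocked from /photos/, other crawlers are not.
    print(parser.can_fetch("Googlebot-Image", "https://example.com/photos/cat.jpg"))  # False
    print(parser.can_fetch("Googlebot", "https://example.com/photos/cat.jpg"))        # True
    ```

    This mirrors the pass/fail answer the Tester gives for each URL and user agent you submit.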
  • The topic ‘Lighthouse still reports missing robots.txt’ is closed to new replies.