Support » Plugin: Virtual Robots.txt » bad robots.txt virtual

  • anniest

    (@anniest)



    I have had a big problem for some time: a bad plugin created a virtual robots.txt on my site;

    User-agent: *
    Crawl-delay: 10

    and I cannot get rid of it.

    I uploaded a physical robots.txt, but it has made no difference: Google still does not index my site.

    I just tried your plugin (and another one before it) and nothing changed. I saved your plugin's robots.txt content,

    but my robots.txt still shows

    User-agent: *
    Crawl-delay: 10

    What more can I do? For reference, a minimal example of a replacement file is sketched at the end of this post.

    Thanks
    https://www.artisanathai.com/
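
    A minimal permissive robots.txt placed at the web root (the same location WordPress serves the virtual one from) could simply allow everything and drop the Crawl-delay. The sitemap line is only an example and assumes a sitemap actually exists at that address:

    User-agent: *
    Disallow:

    Sitemap: https://www.artisanathai.com/sitemap.xml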

  • anniest

    (@anniest)

    I ended up putting a real robots.txt file in place, but I am not sure of the result.
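
    One way to check, assuming the physical file now sits at the web root, is to fetch the live robots.txt and confirm the Crawl-delay line is gone. A rough Python sketch (the URL is just this site's robots.txt):

    import urllib.request
    from urllib.robotparser import RobotFileParser

    ROBOTS_URL = "https://www.artisanathai.com/robots.txt"

    # Fetch the raw file to see exactly which version the server returns.
    with urllib.request.urlopen(ROBOTS_URL) as response:
        body = response.read().decode("utf-8", errors="replace")
    print(body)

    # A leftover "Crawl-delay" line means the old virtual rules are still being served.
    if "Crawl-delay" in body:
        print("The old virtual rules still seem to be active.")
    else:
        print("No Crawl-delay found; the physical file appears to be in use.")

    # Confirm that Googlebot is allowed to fetch the home page.
    parser = RobotFileParser(ROBOTS_URL)
    parser.read()
    print("Googlebot may fetch the home page:",
          parser.can_fetch("Googlebot", "https://www.artisanathai.com/"))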

  • The topic ‘bad robots.txt virtual’ is closed to new replies.