Support » Plugin: Virtual Robots.txt » bad robots.txt virtual

  • For some time I have had a big problem: a bad plugin created a virtual robots.txt for my site:

    User-agent: *
    Crawl-delay: 10

    and I cannot get rid of it.

    I uploaded a physical robots.txt, but it made no difference, and Google still does not index my site.

    I have just tried your plugin (and another one before it), and nothing changes. Even after saving your plugin's settings,

    the file served still shows

    User-agent: *
    Crawl-delay: 10

    What more can I do?

    Thanks
    https://www.artisanathai.com/
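
    The first step in a case like this is to check what robots.txt the site actually serves (fetch https://www.artisanathai.com/robots.txt in a browser) and see which directives appear. A minimal sketch of a directive check, assuming the served content reported in the post; the `parse_robots` helper is illustrative, not part of any plugin:

    ```python
    def parse_robots(text):
        """Group robots.txt directives by the User-agent they apply to."""
        rules, agent = {}, None
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()  # drop comments and blanks
            if not line or ":" not in line:
                continue
            key, value = (part.strip() for part in line.split(":", 1))
            if key.lower() == "user-agent":
                agent = value
                rules.setdefault(agent, {})
            elif agent is not None:
                rules[agent][key.lower()] = value
        return rules

    # The directives reported in the post above:
    served = "User-agent: *\nCrawl-delay: 10\n"
    print(parse_robots(served))  # → {'*': {'crawl-delay': '10'}}
    ```

    If the physical file's content differs from what the fetch returns, WordPress (or a plugin hooked into its virtual robots.txt) is still answering the request instead of the file on disk.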

Viewing 1 reply (of 1 total)
  • The topic ‘bad robots.txt virtual’ is closed to new replies.