• Resolved ultimateuser

    (@ultimateuser)


    Hi there,

    I use subsites to create portals for my clients. Basically I’ve set up several sites: domain.com/site1, domain.com/site2, etc.

    I would like to hide the complete content of these sites from Google and other search engines.

    I believe there is a way to create a robots.txt file and add some directives so search engines won’t crawl these subsites.

    Does anyone know the code and where to place the file on my server?

    Many thanks
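
    For what it’s worth, crawlers only read robots.txt from the root of the host (i.e. domain.com/robots.txt), so the file would go in the directory that serves domain.com itself, not inside each subsite folder. A minimal sketch using the example paths from the question (swap in the real subsite slugs):

        User-agent: *
        Disallow: /site1/
        Disallow: /site2/

    Keep in mind robots.txt is only a request: crawlers that honour it will skip those paths, but it doesn’t enforce anything.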

  • You can do this easily. Under the Settings menu you’ll find the option “Allow search engines to index this site”. Simply uncheck it.

    Thread Starter ultimateuser

    (@ultimateuser)

    Yes, I’ve done that already, but I wasn’t sure that would be sufficient.

    Will it be enough, or do I also need a robots.txt file?

    Moderator Ipstenu (Mika Epstein)

    (@ipstenu)

    🏳️‍🌈 Advisor and Activist

    That setting controls the robots.txt output.

    If search engines ignore it, there’s nothing you can do to stop ’em. But that should be enough.
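
    Roughly, what that checkbox does under the hood: when there is no physical robots.txt file, WordPress serves a generated one, and unchecking the box switches that output to block everything while also printing a robots noindex meta tag on the site’s pages. The generated robots.txt with the box unchecked looks like:

        User-agent: *
        Disallow: /

    Since crawlers only look for robots.txt at the domain root, it’s mainly that noindex meta tag doing the work for subdirectory subsites like domain.com/site1; the root robots.txt reflects the main site’s own setting.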

  • The topic ‘Hide subdomain from search engines’ is closed to new replies.