I'm doing a soft launch of my company's corporate blog, so it should be visible to normal visitors only. As I understand it, selecting the Privacy setting "I would like to block search engines, but allow normal visitors" causes "<meta name='robots' content='noindex,nofollow' />" to be generated inside the <head> </head> section of the site's source (provided wp_head is used), which tells search engine spiders to ignore the site.
Given that, do I still need to manually create a robots.txt file in the root folder, or will one be created automatically once that setting is turned on?
And how will robots.txt be affected once I decide to open the blog to search engines?
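For reference, if I did create robots.txt by hand, I'd expect it to block all crawlers with something like the following (a minimal sketch on my part, not necessarily what WordPress itself would output):

```
User-agent: *
Disallow: /
```

I'm mainly trying to work out whether this file and the noindex,nofollow meta tag would conflict with or duplicate each other.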