Support » Plugin: Yoast SEO » [Plugin: WordPress SEO by Yoast] website blocked by robots.txt

  • A few days ago, after the redesign of the site in question, Webmaster Tools told me: “Cannot communicate with the DNS server.”
    Server connectivity:
    “Googlebot was unable to access the URL because the request timed out or the site is blocking Google. Dynamic pages may take too long to load, or your site may be blocking Googlebot.” Tophost, in an open ticket, told me that everything was OK on their end.
    Webmaster Tools also told me that I had a robots.txt file blocking the site.
    Needless to say, there was no robots.txt file on the server, nor one generated by the WordPress site, but in order to try to work around the problem I had uploaded a robots.txt containing only an empty “Disallow:”.
    I should add that while building the site in WordPress I blocked indexing, and only re-enabled it at the end of the restyling, 5 days ago.
    In the end I uploaded a robots.txt file like this:
    User-agent: *

    Disallow: /wp-
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/
    Disallow: /cgi-bin/

    Allow: /wp-content/uploads/

    Disallow: /trackback/
    Disallow: /comments/

    Webmaster Tools has not yet accepted it.
    URLs are still reported as blocked.
    The sitemap says that everything is indexed: 38 out of 38.
    I use Yoast WordPress SEO with the sitemap feature enabled.
    I thought it was the indexing block in WordPress, but too many days have passed.
    Do you have any ideas or solutions?
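One way to check whether rules like the ones above actually block Googlebot is to test them locally with Python's standard `urllib.robotparser` before uploading. This is a minimal sketch; `example.com` is a placeholder for the site in question, and note that Python's parser applies the first matching rule while Google uses longest-match, so the `Allow:` line's effect can differ slightly.

```python
# Parse the robots.txt rules locally and ask whether Googlebot
# may fetch specific URLs, using Python's stdlib robots parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Disallow: /cgi-bin/
Allow: /wp-content/uploads/
Disallow: /trackback/
Disallow: /comments/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# An ordinary page outside the disallowed paths is crawlable:
print(rp.can_fetch("Googlebot", "https://example.com/my-page/"))   # True
# wp-admin is blocked by the "Disallow: /wp-" rule:
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))  # False
```

If the first check returned False for a normal page URL, the robots.txt itself would be the problem; if it returns True but Webmaster Tools still reports the site as blocked, the block is more likely cached rules or a server-side issue.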

  • The topic ‘[Plugin: WordPress SEO by Yoast] website blocked by robots.txt’ is closed to new replies.