• Resolved haltek18

    (@haltek18)


How do I unblock my website from Google within the theme? In my Reading settings in the backend, the box to discourage search engines from indexing the site is NOT checked, so I'm not sure what's happening.

It started happening November 30th. I'm using the Yoast SEO plugin to generate the sitemaps, but Google Search Console won't let me re-submit them; I get errors.

    The error is:

    “Sitemap can be read, but has errors
    General HTTP error
    1 instance
    We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.”

I also have a robots.txt file in my public_html folder that allows all files to be read.
    Not sure what’s happening! Please help.

    https://support.google.com/webmasters/answer/7489871?hl=en

Do I need to regenerate the sitemaps in Yoast SEO and re-submit them?
I'm not sure what else to do.

    My robots.txt file says this:

    User-agent: *
    Disallow:


Viewing 4 replies - 1 through 4 (of 4 total)
• We checked your robots.txt file at https://www.blenkerco.com/robots.txt and it currently contains the following directive, which instructs all user agents not to crawl the site:

    User-agent: * 
    Disallow: /

    We would recommend removing this directive if you want your site to be crawled. There is more information on using the robots.txt file here: https://yoast.com/ultimate-guide-robots-txt/
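As an aside, the practical difference between an empty `Disallow:` (nothing blocked) and `Disallow: /` (everything blocked) can be checked with Python's standard-library robots parser. This is just an illustrative sketch; the URL is a placeholder based on the site discussed in this thread:

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt: str, url: str = "https://www.blenkerco.com/") -> bool:
    """Parse a robots.txt body and report whether Googlebot may fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

allows_all = "User-agent: *\nDisallow:\n"    # empty Disallow: nothing is blocked
blocks_all = "User-agent: *\nDisallow: /\n"  # "/" blocks the entire site

print(googlebot_allowed(allows_all))  # True
print(googlebot_allowed(blocks_all))  # False
```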

    Regarding the sitemap, can you please let us know the exact sitemap URL you submitted in Google Search Console?

    Thread Starter haltek18

    (@haltek18)

    Hi,

    The odd thing is that what's written in my robots.txt file on the server is different from what's visible at the robots.txt URL, and what's visible is NOT what I want it to say. The file on my server contains this:

    # Rule 1
    User-agent: Googlebot
    Disallow: /nogooglebot/
    
    # Rule 2
    User-agent: *
    Allow: /

    I’m not sure WHY it’s saying this:

    User-agent: * 
    Disallow: /

    Is it possible that something is intercepting it, such as another plugin, my CDN, or something in my DNS settings?

    The sitemap paths I submitted are:

    /slides_category-sitemap.xml
    /testimonials_category-sitemap.xml
    /portfolio_category-sitemap.xml
    /post_tag-sitemap.xml
    /category-sitemap.xml
    /popup-sitemap.xml
    /portfolio_page-sitemap.xml
    /page-sitemap.xml
    /post-sitemap.xml
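A quick way to see which of the submitted sitemaps triggers the "General HTTP error" is to request each one and check both that it loads and that it parses as XML. This is a rough diagnostic sketch, not a definitive check; the base host is assumed from this thread and only a few of the paths above are shown:

```python
import urllib.request
import xml.etree.ElementTree as ET

def is_well_formed_xml(body: bytes) -> bool:
    """Return True if the response body parses as XML (a sitemap must)."""
    try:
        ET.fromstring(body)
        return True
    except ET.ParseError:
        return False

def check_sitemap(url: str) -> str:
    """Fetch a sitemap URL and summarise what a crawler would see."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
        return "ok" if is_well_formed_xml(body) else "fetched, but not valid XML"
    except Exception as exc:  # HTTP errors, timeouts, DNS failures, ...
        return type(exc).__name__

if __name__ == "__main__":
    base = "https://www.blenkerco.com"  # host assumed from this thread
    for path in ("/post-sitemap.xml", "/page-sitemap.xml", "/category-sitemap.xml"):
        print(path, check_sitemap(base + path))
```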

    Thread Starter haltek18

    (@haltek18)

    For example: http://www.blenkerco.com/robots.txt shows something completely different from blenkerco.com/robots.txt
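One way to confirm that the two hostnames really serve different files (a classic symptom of a CDN or edge cache answering for one of them) is to fetch robots.txt from both and compare the bodies. A minimal sketch, with the hostnames assumed from this thread and the live fetch left commented out:

```python
from urllib.request import urlopen

def fetch_robots(host: str) -> str:
    """Fetch the robots.txt actually served for a given host."""
    with urlopen(f"https://{host}/robots.txt", timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def same_rules(a: str, b: str) -> bool:
    """Compare two robots.txt bodies, ignoring blank lines and edge whitespace."""
    def norm(s: str) -> str:
        return "\n".join(ln.strip() for ln in s.splitlines() if ln.strip())
    return norm(a) == norm(b)

# If this prints False, the two hostnames serve different files:
# print(same_rules(fetch_robots("www.blenkerco.com"), fetch_robots("blenkerco.com")))
```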

    Thread Starter haltek18

    (@haltek18)

    Never mind. I found out that my CDN was serving that blocking directive in robots.txt. It's all resolved now.


The topic ‘Robots.txt file blocking my website?’ is closed to new replies.