• Resolved phineas137


    Is it normal that attachment pages now show a message saying the image can't be loaded because it contains errors?

    Is it normal that Google Search Console says the sitemap is being blocked by robots.txt?


  • Plugin Author Joost de Valk


    Being blocked by robots.txt is certainly not normal, and not something we have anything to do with, so you'll need to check your own robots.txt for that.

    As for the image that can't be loaded: that seems like an error. Do you have an example URL?
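    Joost's advice to check robots.txt can be verified outside Search Console. Below is a minimal sketch using Python's standard urllib.robotparser, assuming the hypothetical domain example.com and Yoast's default sitemap_index.xml path:

        from urllib.robotparser import RobotFileParser

        ROBOTS_URL = "https://example.com/robots.txt"          # assumed domain
        SITEMAP_URL = "https://example.com/sitemap_index.xml"  # Yoast's default sitemap name

        parser = RobotFileParser()
        parser.set_url(ROBOTS_URL)
        parser.read()  # fetches and parses the live robots.txt

        # Googlebot is the user agent Search Console reports on.
        if parser.can_fetch("Googlebot", SITEMAP_URL):
            print("Sitemap is allowed for Googlebot")
        else:
            print("Sitemap is blocked by robots.txt")

    If this prints "blocked", the offending Disallow rule is in the site's own robots.txt, not in anything the plugin generates.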

    Thread Starter phineas137


    Before this update I had used a temporary solution: blocking all the attachments through the robots.txt file and deleting them with Search Console. This morning I installed the update, edited the robots file, and used the option to notify Google of a new file so it would update. Even now the indexing error keeps appearing, but my live monitoring reports that Google is crawling attachments, so…
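    The thread doesn't show the actual rules used for that temporary block; one common way to write it looks like the sketch below, where both Disallow patterns are assumptions rather than phineas137's real file:

        User-agent: *
        # Block the uploaded media files themselves (WordPress's default upload directory)
        Disallow: /wp-content/uploads/
        # Block query-string attachment page URLs (Google supports the * wildcard)
        Disallow: /*attachment_id=

    Rules like these only stop compliant crawlers from fetching the URLs; already-indexed pages still have to be removed via Search Console, as described above.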

    This is an example of the error in the images. It only happens with some, not all; it may be the ones that I had manually de-indexed.

    A black page and a message saying the image can't be displayed because it contains errors.

    Thread Starter phineas137


    This morning I saw that Google's bots are crawling my site excessively, and it is overloading the server.


    Plugin Support Patrick Kuijpers


    In Google Search Console you can set a limit on the number of requests Google can make per second. Normally we suggest letting Google figure this out for itself, but if it really impacts your site, you might want to set a limit.

    You can set the limit by going into the Google Search Console property you want to set a limit for. Click the gear icon in the top right corner, select 'site settings', and then change the crawl rate setting to your liking.

  • The topic ‘Doubts’ is closed to new replies.