• Resolved happynaturaldog

    (@happynaturaldog)


    Hi there, I installed the Blackhole plugin and added the recommended text to the robots.txt file. Today I received a notice from Google Search Console saying “Indexed, though blocked by robots.txt.”

    What does this mean, and is it preventing Google from reporting/tracking my website?

    Thanks,
    Karen

    The page I need help with: [log in to see the link]
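    For context, the rule the plugin documentation asks you to add to robots.txt takes roughly this shape (the exact trap path shown here is an assumption for illustration; follow the plugin's own setup instructions):

    ```
    # Tell well-behaved crawlers to avoid the trap URL;
    # bots that ignore this rule and visit it get blocked.
    User-agent: *
    Disallow: /?blackhole
    ```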

Viewing 4 replies - 1 through 4 (of 4 total)
  • Plugin Author Jeff Starr

    (@specialk)

    It usually means that the site has some sort of cache plugin active. So that would be the first thing to check. As explained in the documentation, specifically *page caching* prevents Blackhole from working properly.

    Plugin Author Jeff Starr

    (@specialk)

    Hey Karen, just wanted to follow up with this. It’s been a while with no reply, so I hope the issue is resolved? Or if there is anything I can do to help, please let me know. Thank you.

    Thread Starter happynaturaldog

    (@happynaturaldog)

    Hi Jeff,

    Thanks for following up. I contacted WP Rocket (where the page caching is occurring) and it sounds like if I disable the page caching the speed of my website will be negatively affected, so I may need to choose one or the other. I’m not versed in the technical side of things, but here is what they explained to me:
    “To be compatible with page caching, the plugin would need to rely on a “cache-friendly” method such as JavaScript/AJAX, instead of relying on PHP”

    I’m assuming Black Hole uses PHP (whatever that is haha!) and that plus the WP Rocket page caching is causing the Google Search Console error?

    Thanks again,
    Karen

    Plugin Author Jeff Starr

    (@specialk)

    Hi Karen,

    Thanks for following up. For this:

    “I’m assuming Black Hole uses PHP (whatever that is haha!) and that plus the WP Rocket page caching is causing the Google Search Console error?”

    Yes, as explained on the plugin settings page, the plugin home page, documentation, support forum, and elsewhere, Blackhole is not (yet) compatible with page caching. Why? Because page caching effectively disables PHP, so that dynamic plugins like Blackhole are not able to work properly.
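    The effect of page caching described above can be sketched in a few lines (a simulation with assumed names, not the plugin's actual code): once a rendered page is stored, later requests are served straight from the cache, so the dynamic handler that would log a trap hit never runs again.

    ```javascript
    // Simulated page cache in front of a dynamic handler.
    const cache = new Map();
    const trapHits = [];

    // Stand-in for Blackhole's server-side PHP logic:
    // record which visitor hit the trap, then render the page.
    function dynamicHandler(path, visitor) {
      trapHits.push(visitor);
      return `<html>page for ${path}</html>`;
    }

    function serve(path, visitor) {
      // Cache hit: static HTML is returned and no dynamic code runs.
      if (cache.has(path)) return cache.get(path);
      const html = dynamicHandler(path, visitor);
      cache.set(path, html);
      return html;
    }

    serve("/?blackhole", "bot-1"); // first hit runs the handler and is logged
    serve("/?blackhole", "bot-2"); // served from cache: bot-2 is never recorded
    ```

    Only the first visitor is ever recorded; every later bot gets the cached HTML, which is exactly why the trap stops working under page caching.
    
    
    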

    And for this:

    “To be compatible with page caching, the plugin would need to rely on a “cache-friendly” method such as JavaScript/AJAX, instead of relying on PHP”

    I am working on a possible solution to the page-cache dilemma. In the meantime, there are several good caching plugins that provide a “late init” feature that enables compatibility with Blackhole. Here is a post that explains more.
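    The “cache-friendly” approach WP Rocket describes would look something like this sketch: the cached HTML stays static, and a small script in the visitor's browser reports the trap hit to a dynamic endpoint instead. The endpoint path and function names here are assumptions for illustration, not the plugin's real API.

    ```javascript
    // Build the URL for a hypothetical trap-reporting endpoint.
    // Because this request is made from the browser, serving the page
    // itself from a static cache does not prevent the hit being logged.
    function buildTrapBeaconUrl(baseUrl, visitorId) {
      const url = new URL("/wp-json/blackhole/v1/trap", baseUrl);
      url.searchParams.set("visitor", visitorId);
      return url.toString();
    }

    // In a browser, fire the beacon when the trap page loads.
    if (typeof document !== "undefined") {
      navigator.sendBeacon(buildTrapBeaconUrl(window.location.origin, "anon"));
    }
    ```
    
    
    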

    I hope this helps. Let me know if any further questions, feedback, etc. Glad to help anytime.

  • The topic ‘Google Search Console Blocked’ is closed to new replies.