• Resolved Anonymous User 14978628

    (@anonymized-14978628)


    Hey Jeff,

    On your website you list full compatibility with WP Rocket and say that no special setup is required.

    I can’t get myself banned when visiting the blackhole link once pages are cached, but I can get banned if I visit an uncached page and then visit the blackhole link.

    Is there something I’m missing to get this set up properly? I’ve added /?blackhole in the WP Rocket exclusion section, so not sure what else to do…

  • Plugin Author Jeff Starr

    (@specialk)

    Hi Marty,

    I’ll ask the dev who did the reporting/testing with WP Rocket to see if there is anything special that is required. I know that with other caching plugins, the problem is with page caching. Blackhole uses the WP init hook to check requests. But when pages are cached, WP/hooks generally are not called, so plugins like Blackhole cannot do their work.
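
    Roughly, the idea looks something like the sketch below. To be clear, this is a simplified illustration only, not the actual Blackhole code; the function name and the blackhole query variable are just placeholders:

    ```php
    <?php
    // Simplified illustration of an init-hook request check.
    // NOT the actual Blackhole code; names here are placeholders.
    add_action( 'init', 'myplugin_check_request' );

    function myplugin_check_request() {
        // Hypothetical honeypot trigger: the hidden link points at ?blackhole=...
        if ( isset( $_GET['blackhole'] ) ) {
            // Deny the request (the real plugin also logs/bans the visitor).
            status_header( 403 );
            nocache_headers();
            wp_die( 'Access denied.' );
        }
    }
    ```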

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    Ok, please let me know what they say. The issue only occurs with cached pages; on non-cached pages I can get banned. Just tested on my local install and was able to replicate the problem.

    I was going by this compatibility list:

    https://plugin-planet.com/blackhole-pro-cache-plugins/

    Plugin Author Jeff Starr

    (@specialk)

    Okay so it turns out that WP Rocket’s page caching and Blackhole are incompatible. All other WP Rocket features work fine with Blackhole. I have updated the compatibility list accordingly. Thank you for reporting this, and apologies for any inconvenience.

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    Hey Jeff,

    Got a reply back from WP Rocket support. Here’s what they said in case it’s of any help:

    > The plugin requires I add the following as a URL exclusion: ?blackhole= or ?blackhole. But any time I try to add something with ? in the WP Rocket exclusions, it disappears from the Never Cache box when I save it.

    By default, WP Rocket does not cache URLs that contain query strings; that means a URL that looks like this:
    https://example.com/?blackhole=1234567890

    would be excluded from WP Rocket’s caching automatically — unless you specifically tell WP Rocket that you want to cache those URLs:
    http://docs.wp-rocket.me/article/971-caching-query-strings

    > Blackhole uses the WP init hook to check requests. But when pages are cached, WP/hooks generally are not called

    WP Rocket — like most caching plugins — eliminates all PHP/MySQL processes (which includes WordPress hooks) from a page and creates static HTML output instead.

    So if Blackhole doesn’t add the hidden link to your pages until it detects that a bot has requested the page, then that could explain why the link isn’t present — but it wouldn’t really explain this:

    > The issue is that once I visit the blackhole links created by the plugin on cached pages, I can’t get banned like I’m supposed to.

    Since those links shouldn’t be cached in the first place, I’m struggling to see how WP Rocket could be preventing them from working properly when visited.

    Could you give me your thoughts on WP Super Cache? Is it safe to use this plugin?

    Plugin Author Jeff Starr

    (@specialk)

    Thanks for sharing that information. I believe I’ve already explained several times now why page caching (like with WP Rocket) is incompatible with Blackhole; here it is again just in case:

    “There are many types of cache plugins. They provide all sorts of different caching mechanisms and features. All caching features work great with Blackhole except for “page caching”. With page caching, the required init hook may not be fired, which means that plugins like Blackhole for Bad Bots are not able to check the request to see if it should be blocked. Fortunately, two of the most popular caching plugins provide settings that enable full compatibility with Blackhole.”

    This is taken from: https://plugin-planet.com/blackhole-pro-cache-plugins/

    So it’s not about the hidden link, but rather about the fact that the Blackhole script has no way to check the request when page caching is enabled, because PHP/MySQL is not available while the caching plugin is serving static HTML copies of your WP pages. There is simply no way to check each request to see whether or not it should be blocked. Think of it this way: WordPress plugins do not work when WordPress is not available, and when you use page caching, that is essentially the case.
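
    If it helps to visualize it, here is a generic sketch of what a page-cache drop-in does very early in the load process, before any plugin code runs. This is illustration only, not WP Rocket’s actual code, and the cache path is made up:

    ```php
    <?php
    // wp-content/advanced-cache.php -- generic illustration of a page-cache
    // drop-in, NOT WP Rocket's actual code. WordPress includes this file very
    // early (when WP_CACHE is true), before any plugins are loaded.

    // Hypothetical location for stored static copies of pages.
    $cache_file = WP_CONTENT_DIR . '/cache/pages/' . md5( $_SERVER['REQUEST_URI'] ) . '.html';

    if ( 'GET' === $_SERVER['REQUEST_METHOD'] && is_readable( $cache_file ) ) {
        // Serve the static copy and stop right here: plugins never load,
        // the 'init' hook never fires, so Blackhole never sees the request.
        readfile( $cache_file );
        exit;
    }
    // Otherwise WordPress continues loading and plugins get their hooks as usual.
    ```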

    As for WP Super Cache, all I am at liberty to say is that it has been tested and confirmed to be compatible when “Late init” is enabled.

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    Hey Jeff,

    Sure, I understood why Blackhole doesn’t work with page caching 😉 Just thought it may have been helpful to post the response.

    Plugin Author Jeff Starr

    (@specialk)

    Absolutely it is helpful and appreciated. Just want to make sure that everyone is on the same page with the underlying issue. I am keeping my eyes and ears open for any possible solutions for page-cache plugins like WP Rocket that don’t provide any sort of “late init” functionality (as WP Super Cache, W3 Total Cache, et al. do).

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    Sorry, just out of interest. Does the manual way to use Blackhole differ from the plugin way in terms of compatibility with caching plugins?

    Plugin Author Jeff Starr

    (@specialk)

    Great idea, but no: both the PHP script and the WP plugin rely on dynamically checking the incoming request (to see whether it should be blocked or not). That’s just not possible when PHP is not available (i.e., PHP functionality requires PHP in order to work).
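
    For reference, the standalone approach boils down to something like the rough sketch below (not the actual Perishable Press script; the ban-list file is hypothetical). Either way, it has to execute PHP on every request, which is exactly what full page caching takes away:

    ```php
    <?php
    // Rough sketch of the standalone (non-plugin) approach -- NOT the actual
    // Blackhole script. It would be included before any output on each page.

    $banned_file = __DIR__ . '/blackhole-banned.txt'; // hypothetical ban list
    $visitor_ip  = isset( $_SERVER['REMOTE_ADDR'] ) ? $_SERVER['REMOTE_ADDR'] : '';

    $banned = is_readable( $banned_file )
        ? file( $banned_file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES )
        : array();

    if ( in_array( $visitor_ip, $banned, true ) ) {
        header( 'HTTP/1.1 403 Forbidden' );
        exit( 'Access denied.' );
    }
    // Visiting the honeypot URL would append the visitor IP to the ban list.
    ```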

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    Yeah, just sort of figured that out. When testing Super Cache I noticed it has two caching methods: one that uses PHP and the other that uses mod_rewrite. The latter obviously doesn’t work with Blackhole, for the reason you just specified.

    Do you know if there is that much of a performance hit using PHP as opposed to mod_rewrite when using PHP 7?

    Plugin Author Jeff Starr

    (@specialk)

    In general, it’s best to handle traffic at the server level; that prevents resources from being used for PHP, MySQL, etc.

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    That’s the biggest drawback of using Blackhole, as you have to sacrifice site speed for security.

    I couldn’t find a Blackhole link in the source on Perishable Press. Are you using Blackhole on your website? It seems quite odd not to be using your own plugin!

    Plugin Author Jeff Starr

    (@specialk)

    > That’s the biggest drawback of using Blackhole, as you have to sacrifice site speed for security.

    Only if your site is slow and requires a page-caching plugin. Also the converse could be argued: the biggest drawback of page caching is that it renders many plugins useless.

    > I couldn’t find a Blackhole link in the source on Perishable Press. Are you using Blackhole on your website? It seems quite odd not to be using your own plugin!

    I handcraft my site security based on context and traffic. If you browse thru the Perishable Press archives, you will find that I’ve developed hundreds of security plugins and techniques over the past 12+ years. It would be unnecessary and silly to implement all of them on any one site.

    Thread Starter Anonymous User 14978628

    (@anonymized-14978628)

    Is it at all feasible for Blackhole to work without PHP being required? I know it isn’t now; I’m just wondering if it’s at all possible sometime in the future?

    The incompatibility with page caching plugins really limits the potential users of this plugin, which is a shame because it’s a really good and effective idea!

    Plugin Author Jeff Starr

    (@specialk)

    Yeah, tell me about it. I wish page caching plugins were never necessary. In any case, when you ask if it’s “feasible for Blackhole to work without PHP being required”, you are essentially asking if it is possible to dynamically block bad bots using static HTML content (which is what page caching plugins generate). And as discussed previously, the answer to that question is “no”, not possible. HTML is static, which means it displays content and that’s it. For anything dynamic, like checking for bad bots, you need PHP or some other scripting or programming language.

  • The topic ‘WP Rocket’ is closed to new replies.