WordPress.org


BulletProof Security
[resolved] Blocking comment spam (25 posts)

  1. Bunzer
    Member
    Posted 1 year ago #

    I installed BPS and it seemed to be working well - until today, when I noticed a spam comment from a previously blocked IP.

    Everything else seems to be okay in that section - user agents are being blocked correctly. I added myself to the comment spam blacklist, and was still able to post.

    I haven't changed anything in that block, apart from adding a few more ranges (and myself). Any suggestions as to how to debug this problem?

    http://wordpress.org/extend/plugins/bulletproof-security/

  2. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Check your root .htaccess file to make sure the IP blocking htaccess code is actually in your root .htaccess file.

    With the addition of the new Custom Code text areas/text boxes, you can now (and should) add all your IP blocking code to the...

    CUSTOM CODE BOTTOM HOTLINKING/FORBID COMMENT SPAMMERS/BLOCK BOTS/BLOCK IP/REDIRECT CODE: Add miscellaneous code here
    ONLY add valid htaccess code below or text commented out with a pound sign #

    ...text area/text box, save your custom code, go to the Security Modes page, click the Create secure.htaccess File AutoMagic button and activate Root folder BulletProof Mode again.

    By adding all of your custom IP blocking code to this Custom Code text box, you can keep building on it, since it is saved permanently. Just repeat the steps above each time you edit your custom code.

  3. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    For folks who have a BuddyPress site and are getting hit hard by comment spammers/comment spammer registrations like we once were (1,500+ per day): the solutions in the link below now allow only 1-2 comment spammer registrations per day.

    http://forum.ait-pro.com/forums/topic/buddypress-spam-registration-buddypress-anti-spam-registration/

  4. Bunzer
    Member
    Posted 1 year ago #

    Everything is in place. It's what I have in the last block which seems to be the problem. I think the problem may be with my implementation of the Files/FilesMatch containers. Here is what it looks like (edited for brevity).

    <FilesMatch "^(wp-comments-post\.php)">
    Order Allow,Deny
    Deny from 46.119.35.
    Deny from 46.119.45.
    -etc-
    Allow from all
    </FilesMatch>

    BrowserMatch ^-?$ badrobot
    BrowserMatch Ahrefs badrobot
    -etc-
    <FilesMatch ".*">
    Order Allow,Deny
    Deny from env=badrobot
    Deny from 5.45.202.0/24
    -etc-
    Allow from all
    </FilesMatch>

    <FilesMatch "(robots\.txt|favicon\.ico|403\.php)">
    Order Allow,Deny
    Allow from all
    </FilesMatch>

    I removed the last section and the problem continued, so I'm guessing it's the middle section which is cancelling out the first.

    I have done it this way because I wanted a neat way of issuing the BPS custom 403 even though access was denied (causing a double 403).

    Any alternative methods suggested would be greatly accepted, as I'm not great at this stuff.
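
    For reference, "block everything except these files" can be written in a single container with a negative lookahead, since FilesMatch uses PCRE regular expressions. This is only a sketch, untested on any particular host; the file names and Deny rules are taken from the post above:

    ```apache
    # Sketch only: deny bad robots and listed IP ranges for every file
    # EXCEPT robots.txt, favicon.ico and the 403.php template.
    # FilesMatch matches with PCRE, so a negative lookahead is valid here.
    <FilesMatch "^(?!robots\.txt$|favicon\.ico$|403\.php$)">
    Order Allow,Deny
    Deny from env=badrobot
    Deny from 5.45.202.0/24
    Allow from all
    </FilesMatch>
    ```

    Whether this behaves well alongside other merged Files/FilesMatch sections would still need testing on the site in question.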

  5. Bunzer
    Member
    Posted 1 year ago #

    I think I got it working...

    <FilesMatch "(robots\.txt|403\.php)$">
    Order Allow,Deny
    Allow from all
    </FilesMatch>

    <FilesMatch "^(wp-comments-post\.php)">
    Order Allow,Deny
    Deny from 46.119.35.
    -etc-
    Allow from all
    </FilesMatch>

    BrowserMatch ^-?$ badrobot
    BrowserMatch Ahrefs badrobot
    -etc-
    Order Allow,Deny
    Deny from env=badrobot
    Deny from 5.45.202.0/24
    -etc-
    Allow from all

  6. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Search engines need to access robots.txt, and the 403.php template needs to be accessible to browsers so 403 errors can be processed. You can just delete that code: it allows access to everyone and blocks no one, so it is the same as doing nothing.

  7. Bunzer
    Member
    Posted 1 year ago #

    Without that bit of code, I was getting a default 403 rather than the custom page, because trying to display the custom 403 gave another 403! :-D

  8. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    ErrorDocument 403 is an htaccess redirect directive that should point to your 403.php template file. If that is not working correctly then something is interfering with that directive.

    Example of what you should see in your root .htaccess file:
    ErrorDocument 403 /wp-content/plugins/bulletproof-security/403.php

    points to the BPS 403.php template file.

  9. Bunzer
    Member
    Posted 1 year ago #

    I saw that, and I fully understand what you're saying.

    What I'm saying is that, by blocking an IP or user agent from the whole website, that visitor cannot load the 403.php you described, so the server issues a double 403 - i.e. it issues the custom 403, which is itself blocked, causing another 403.

    I added the above code so that even a blocked computer could at least access robots.txt and the custom 403 page.

  10. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    You should not have to use the code below, so I do not understand what exactly is going wrong on your particular site. What happens when you comment out this code for testing?

    <FilesMatch "(robots\.txt|403\.php)$">
    Order Allow,Deny
    Allow from all
    </FilesMatch>
  11. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    What is normally supposed to happen is this.

    All 403 Forbidden errors are redirected by the ErrorDocument directive when the 403 error occurs. This should only generate 1 403 error in your Security Log.

  12. Bunzer
    Member
    Posted 1 year ago #

    It may be something I added to the last block. I haven't got any more time this weekend, but I can have a go on Monday, to check.

  13. Bunzer
    Member
    Posted 1 year ago #

    This is what I get with the above code commented out...

    Forbidden

    You don't have permission to access /yorkshire/cawood/ on this server.

    Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

  14. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Wow very strange. Did you click the AutoMagic buttons before activating BulletProof Modes?

  15. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Oh wait a minute this code does not look valid.

    <FilesMatch ".*">
    Order Allow,Deny
    Deny from env=badrobot
    Deny from 5.45.202.0/24
    -etc-
    Allow from all
    </FilesMatch>

    You actually need to do something like this.

    <FilesMatch "\.(php|js|css)$">
    Order Allow,Deny
    Deny from env=badrobot
    Deny from 5.45.202.0/24
    -etc-
    Allow from all
    </FilesMatch>
  16. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Also, I have run into problems using CIDR IP blocking (.0/24) and have found that just adding the trailing dot (.) is more reliable across different hosts. On some web hosts, adding a CIDR range causes 403 errors.

    Deny from 5.45.202.
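
    For reference, the two syntaxes cover the same addresses here. A minimal sketch using the range from the post above (pick whichever form your host handles reliably):

    ```apache
    # Both Deny lines below cover the same 256 addresses
    # (5.45.202.0 through 5.45.202.255).
    # The partial-IP form works on whole-octet boundaries only;
    # CIDR also allows other prefix lengths (e.g. /20) where supported.
    Order Allow,Deny
    Deny from 5.45.202.
    # equivalent, in CIDR notation:
    # Deny from 5.45.202.0/24
    Allow from all
    ```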

  17. Bunzer
    Member
    Posted 1 year ago #

    I don't think there's a problem with CIDR notation. I checked the Apache 2.2 documentation. All my blocking rules seem to work under test, so I'm happy it's all good now.

    I think I am introducing the custom 403 block by filling out the badbots section with blanket rules. Ideally, I wanted to say "block everything except robots.txt and 403.php", but you can't do negatives in the Files/FilesMatch container, as far as I know.

    Apparently, FilesMatch ".*" is equivalent to no FilesMatch container at all, so I removed it. Adding the separate FilesMatch for the two file exclusions is enough to stop them being blocked by the following blanket rules. I tested this, and I get a nice 403 in a black-bordered box.

    If you are interested, I can provide the code I settled upon. It might be worth adding it to your default, as it gives a good starting point for blanket blocking harvesters and IP ranges.

  18. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Here's the Apache Core link for 2.4 that I frequently use, since it has all the directives nicely indexed and linked on the right-hand side of the page. A bookmark keeper. ;)
    http://httpd.apache.org/docs/current/mod/core.html

    IMPORTANT NOTE: The Context section of the directive description states where the particular directive can be used. See the Context link below for what these mean: server config, virtual host, directory, .htaccess.
    http://httpd.apache.org/docs/current/mod/directive-dict.html#Context

    Always interested in what folks have come up with for their personal custom htaccess code so either use pastebin if you want to post/share it in the WP forum or if you want to post it in the BPS Forum then you can post all the code in a new forum topic.
    http://forum.ait-pro.com/read-me-first/

  19. MickeyRoush
    Member
    Posted 1 year ago #

    While searching for something else I came across this thread.

    Bunzer wrote:

    but you can't do negatives in the Files/FilesMatch container, as far as I know.

    You can use negative assertion, but why would you when you can whitelist like this:

    Order Allow,Deny
    <FilesMatch "^(robots\.txt|403\.php)$">
    Allow from all
    </FilesMatch>

    That would deny HTTP access to everything but those files listed.

    First, all Allow directives are evaluated; at least one must match, or the request is rejected. Next, all Deny directives are evaluated. If any matches, the request is rejected. Last, any requests which do not match an Allow or a Deny directive are denied by default.

    http://httpd.apache.org/docs/2.2/en/mod/mod_authz_host.html#order

    But that was more or less designed to be used on a per directory basis and probably not in the root of a site.

    If I were going to try to stop spam, whether on login, registration, and/or commenting, I would put in place rules that account for headers normally sent by humans and not by bots (the more the better).

    This is a basic one, but can be made stronger and configured better to your users/setup.

    # AntiSpam for Comments
    RewriteCond %{HTTP_REFERER} !^https?://([^.]+\.)?example\.com/ [NC,OR]
    RewriteCond %{THE_REQUEST} !HTTP/1\.1$ [NC,OR]
    RewriteCond %{HTTP:Connection} !^keep-alive$ [NC,OR]
    RewriteCond %{HTTP:Accept-Encoding} !^gzip [NC,OR]
    RewriteCond %{HTTP:Accept-Language} ^.?$ [OR]
    RewriteCond %{HTTP_USER_AGENT} ^(.{0,49}|.{299,})$ [OR]
    RewriteCond %{HTTP_ACCEPT} ^.?$
    RewriteRule wp-comments-post\.php http://example.com/ [R=301,L,NS]

    (Where example.com is your site.)

    Again, these are basics for normal headers. There are circumstances where these are too strong; you'd have to test for yourself. For example, any user agent shorter than 50 characters or longer than 299 is sent to your home page. You could actually make the Language header English-only if you want.

    RewriteCond %{HTTP:Accept-Language} !en [NC,OR]

    To me, any server protocol below 1.1 is way out of date, although some setups, like schools and proxies, are configured that way, so that may need to be taken into consideration as well.

    Can all of this be spoofed? Why yes. But not many bots account for all. Requiring a cookie would be a good idea as well.
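
    The cookie idea can be sketched in mod_rewrite as well. This is only an illustration: the cookie name and value (human_check=1) are made up, the cookie would have to be set elsewhere (e.g. by JavaScript on the comment form), and any bot that executes JavaScript will pass the check:

    ```apache
    # Sketch: refuse comment POSTs that do not carry a cookie assumed to be
    # set by the comment form's JavaScript. Cookie name/value are
    # illustrative only, not part of any plugin.
    RewriteEngine On
    RewriteCond %{REQUEST_METHOD} POST
    RewriteCond %{HTTP:Cookie} !human_check=1 [NC]
    RewriteRule ^wp-comments-post\.php$ - [F,L]
    ```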

  20. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    I have experimented with the HTTP header condition below (is not HTTP/1.1) in the past and it has not worked correctly for me. I have not picked apart exactly why, but I have some logical ideas to go on. When I have some time I will look into it more deeply.

    RewriteCond %{THE_REQUEST} !HTTP/1\.1$ [NC,OR]

    Using the matching condition (is HTTP/1.0) for bad bots, proxies, etc. is very effective on the other hand.

    # Protect wp-login.php from Brute Force Login Attacks based on Server Protocol
    # All legitimate humans and bots should be using Server Protocol HTTP/1.1
    RewriteCond %{REQUEST_URI} ^/wp-login\.php$
    RewriteCond %{THE_REQUEST} HTTP/1\.0
    RewriteRule ^(.*)$ - [F,L]
  21. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    This approach is for those folks who DO NOT allow anyone else to log into their websites, i.e. development or testing sites, or sites where users are not allowed to register and comment.

    # Protect wp-login.php from Brute Force Login Attacks
    <FilesMatch "^(wp-login\.php)">
    Order Allow,Deny
    # Add your website domain name
    Allow from example.com
    # Add your website/Server IP Address
    Allow from 69.200.95.1
    # Add your Public IP Address using 2 or 3 octets so that if/when
    # your IP address changes it will still be in your subnet range. If you
    # have a static IP address then use all 4 octets.
    # Examples: 2 octets: 65.100. 3 octets: 65.100.50. 4 octets: 65.100.50.1
    Allow from 65.100.50.
    </FilesMatch>
  22. MickeyRoush
    Member
    Posted 1 year ago #

    I believe that THE_REQUEST is basically this:

    REQUEST_METHOD
    REQUEST_URI (not decoded)
    SERVER_PROTOCOL

    In that order. So basically this:

    RewriteCond %{THE_REQUEST} !HTTP/1\.1$ [NC,OR]

    Means not SERVER_PROTOCOL HTTP/1.1

    I believe I got that from jdMorgan. I can't remember at the moment, but I've been using it for quite a long time. It may be safer to just use:
    RewriteCond %{THE_REQUEST} HTTP/1\.0
    or:
    RewriteCond %{SERVER_PROTOCOL} HTTP/1\.0

    I imagine the latter would be slightly faster.

    The thing I like about using certain HTTP headers is that you can also block junk like this from commenting or whatever (though it will block some mobile and satellite users as well):

    RewriteCond %{HTTP:X_FORWARDED_FOR} !^$ [OR]
    RewriteCond %{HTTP:VIA} !^$ [OR]

    So:

    RewriteCond %{HTTP:X_FORWARDED_FOR} !^$ [OR]
    RewriteCond %{HTTP:VIA} !^$ [OR]
    RewriteCond %{HTTP_REFERER} !^https?://([^.]+\.)?example\.com/ [NC,OR]
    RewriteCond %{THE_REQUEST} !HTTP/1\.1$ [NC,OR]
    RewriteCond %{HTTP:Connection} !^keep-alive$ [NC,OR]
    RewriteCond %{HTTP:Accept-Encoding} !^gzip [NC,OR]
    RewriteCond %{HTTP:Accept-Language} ^.?$ [OR]
    RewriteCond %{HTTP_USER_AGENT} ^(.{0,49}|.{299,})$ [OR]
    RewriteCond %{HTTP_ACCEPT} ^.?$
    RewriteRule wp-comments-post\.php http://example.com/ [R=301,L,NS]
  23. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Yeah, I'm sure the condition match works either way, so my guess is that I forgot to use the starting slash in this URI rule: ^wp-login\.php$ and then fixed it later. ;)

    Yep, the whole mobile scene is booming now and tricky and changing very fast so for now I am putting any mobile or similar .htaccess code on hold until things stabilize.

  24. AITpro
    Member
    Plugin Author

    Posted 1 year ago #

    Resolved.

  25. Bunzer
    Member
    Posted 1 year ago #

    Uninstalled. Too much work to maintain.

Topic Closed

This topic has been closed to new replies.
