Support » Plugin: Blackhole for Bad Bots » Recent changes by Google in rel nofollow attribute treatment

  • Resolved n381

    (@n381)


    Hello, I’ve been using your plugin as additional protection against unwanted traffic, and it worked fine in the past — until recently, when one of my websites suffered a huge drop in the Google index, with 90% of its pages excluded.

    I wanted to ask: Google’s change, active since March 1st, treats “nofollow” links as a “hint” rather than a strict rule, which means those links may now be *partly followed, and “wanted” crawlers could end up trapped/blocked. Would that mean BHBB’s trap method is no longer operational, leaving it only with the robots.txt check of crawler rule abidance?

    *

    On Tuesday 10 September Google announced that it was changing the way it views the “nofollow” link attribute.
    When it comes to ‘nofollow’ links, there is no more ‘one rule fits all’. Instead, the search engine will be looking for ‘hints’ for link attribution.
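    For context, Blackhole-style traps work by publishing a hidden link that rule-abiding crawlers are told to skip; anything that requests the trap URL anyway gets blocked. A minimal sketch of the robots.txt side of that idea (the path shown is illustrative, not necessarily the plugin’s actual trap URL):

```
# robots.txt — compliant bots must never request the trap URL
User-agent: *
Disallow: /trap/
```

    The concern raised above is that if “nofollow” on the trap link is now only a hint, a well-behaved crawler might still follow it; the robots.txt Disallow remains the hard rule that good bots are expected to obey.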

    • This topic was modified 2 months, 2 weeks ago by n381.
Viewing 4 replies - 1 through 4 (of 4 total)
  • Plugin Author Jeff Starr

    (@specialk)

    Google no longer honoring nofollow links (after years of them forcing it upon everyone) has no effect on this particular case. Why? Because the plugin excludes/whitelists all Google-related bots, so they are never blocked.

    More likely what is happening is that you have some sort of page-caching plugin or service that is preventing Blackhole from operating correctly. That would be the first thing to check, along with this post for more information.
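    The whitelisting Jeff describes can be sketched roughly as follows; the plugin itself is written in PHP, and the bot names below are illustrative, not its actual list:

```python
# Sketch of a user-agent whitelist check before blocking a trapped
# visitor. Names are illustrative, not the plugin's real whitelist.
WHITELISTED_BOTS = ("googlebot", "bingbot", "yandex", "duckduckbot")

def should_block(user_agent: str) -> bool:
    """Return True if a visitor that hit the trap should be blocked,
    i.e. its user agent matches none of the whitelisted bots."""
    ua = user_agent.lower()
    return not any(bot in ua for bot in WHITELISTED_BOTS)

print(should_block("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # False
print(should_block("EvilScraper/1.0"))                          # True
```

    Because the check happens before any block is recorded, a whitelisted crawler can follow the trap link freely without being penalized — which is why a nofollow policy change on Google’s side would not affect Googlebot here.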

    Plugin Author Jeff Starr

    (@specialk)

    Hi @n381, just wanted to follow up with this. It’s been a few days; I hope the issue is resolved? Or if there is anything I can do to help, please let me know. Thank you.

    n381

    (@n381)

    Hey Jeff, and thank you for clarifying. I can confirm that the WP version of Blackhole for Bad Bots is fully functioning, and the active websites currently have no indexing problems. In addition, I have to ask: is the same set of ok-bot exclusions/whitelisting also provided in the standalone version downloadable from Perishable Press?

    One specific non-WP e-shop site I’m working with had huge problems starting with the March 1st introduction of Google’s new “nofollow as hint” (partly-follow) practice, and for weeks I’ve been observing occasional drops and blocking of some of the Google bots/services. For example, that particular website was accessible with above-average scores in Google PageSpeed Insights, while the Google Mobile-Friendly Test reported the page as inaccessible.
    Google Webmaster Tools has been giving varying results: pages would sit in the index for days, then drop out again with a 500 (server inaccessible) error notification.
    I should also mention that during the whole trial period the e-shop has been under daily probing by bots, from AWS to CN bots, content scrapers, and hacking botnets.

    Thank you

    • This reply was modified 2 months ago by n381.
    Plugin Author Jeff Starr

    (@specialk)

    Hi @n381,

    Thanks for the related info re: Google et al. Very interesting.

    For this:

    “In addition, I have to ask: is the same set of ok-bot exclusions/whitelisting also provided in the standalone version downloadable from Perishable Press?”

    Yes, the standalone/PHP version of Blackhole includes the same set of whitelisted bots.

    Let me know if I can provide any further infos, glad to help anytime.
