Support » Plugin: The SEO Framework » Robots causing too many script executions

  • Hi SEO Framework team!

    I would like to ask for assistance regarding bots.

    I am currently hosted by FastComet, and their monitoring shows very high resource usage and script executions caused by bots, which is pushing my site over its resource limits.

    Here’s a link: https://prnt.sc/nef747

    I have implemented bad-bot denial via siteguarding.com and recently saw that the script executions have indeed decreased. I am looking for better plugins for this as well.

    I saw that All in One SEO has this feature. Does The SEO Framework have the same? I only found Homepage Settings – Robots. How can I limit bots effectively using The SEO Framework?

    The main settings show this warning: “Warning: No public site should ever disable indexing or following for the homepage.”

    The page I need help with: [log in to see the link]

Viewing 6 replies - 1 through 6 (of 6 total)
  • Plugin Author Sybre Waaijer

    (@cybr)

    Hello again πŸ™‚

    I remember we spoke before about how to help Google and other bots understand your site better. Looking at your request, I believe this paid off; however, I see that your site isn’t handling it all too well (or Autoptimize is experiencing a bug?).

    The crawl rate should slow down after the bots learn your site’s structure. Ultimately, though, it boils down to the hosting configuration not being up to the task. So, I don’t think you should discourage bots. Instead, I recommend contacting your hosting provider to see whether they can resolve it by upgrading or tweaking PHP and Apache.

    Now, bots often ignore the rate-limiting requests emitted by your site, which is why I haven’t included that feature. However, you can control this easily via Bing’s and Google’s webmaster tools:
    1. https://support.google.com/webmasters/answer/48620?hl=en
    2. https://www.bing.com/webmaster/help/crawl-control-55a30302
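    As an aside, Bingbot also honors a `Crawl-delay` directive in robots.txt (Googlebot ignores it, so Google’s crawl rate must be set in Search Console as linked above). A minimal sketch, assuming your robots.txt sits at the site root; the 10-second delay is an illustrative value, not a recommendation:

    ```
    # robots.txt — Crawl-delay is honored by Bingbot (seconds between requests);
    # Googlebot ignores this directive, so use Search Console for Google instead.
    User-agent: bingbot
    Crawl-delay: 10
    ```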

    I hope this helps πŸ™‚ Cheers!

    Hi Sybre!

    I’ve always been a bit iffy on Autoptimize, even before, but reinstalled it since it’s the plugin they (FastComet) recommended. Do you recommend a plugin which works well with The SEO Framework? I used to have WP-Optimize.

    Currently my lineup is

    Autoptimize
    WP Fastest Cache
    CDN from Cloudflare
    SEO Framework
    Heartbeat Control by WP Rocket

    The end goal is to reduce those damn script executions.

    The reason I posted another topic was that they also suggested I install All In One SEO, which has a bad bot blocker, instead of The SEO Framework; that, of course, is not an option for me (and I didn’t do it). I am now considering other bad bot blocker options.

    I just set my crawl rate to the lowest possible setting for Google. As for Bing, I don’t even have an account there and don’t use any of their services at all. Maybe I’ll just block them?

    I hope this works.

    I see that your site isn’t handling this all too well (or Autoptimize is experiencing a bug?).

    @cybr it looks like a page cache serving a cached page that references an Autoptimized file which is no longer in cache, because the AO cache was purged — https://www.swimbikerun.ph/?bust-that-cache=yes-please-do does work. @cdeguzman you might want to purge the FastComet and/or Cloudflare caches πŸ™‚

    Plugin Author Sybre Waaijer

    (@cybr)

    Thanks for dropping by @optimizingmatters πŸ™‚

    @cdeguzman I recommend consulting with your hosting provider on identifying and blocking the bad bots. They know how to do that on their servers, so you won’t have to mess with plugins.

    If you wish to engage in this, without help from your host, please refer to this tutorial:
    https://perishablepress.com/block-bad-bots/#bbb04
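    For reference, the linked tutorial blocks bad bots at the Apache level by matching the User-Agent header in .htaccess. A minimal sketch in that spirit — the bot names here are illustrative examples only, not a vetted blocklist, and this assumes mod_rewrite is available:

    ```apache
    # Return 403 Forbidden to requests whose User-Agent matches listed bots.
    # Bot names are examples only; build your own list from your access logs.
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|MJ12bot|SemrushBot) [NC]
    RewriteRule .* - [F,L]
    </IfModule>
    ```

    Blocking at the server level this way is cheaper than a plugin, since the request is rejected before PHP (and thus any script execution) is invoked.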

    Now, since you’re connected with CloudFlare, it’s good to point out that they already do most of this for you, and they have tools to control this further.
    https://support.cloudflare.com/hc/en-us/articles/200171416-How-do-I-block-bots-and-crawlers-

    Hi @cybr and @optimizingmatters

    Thank you very much for the inputs.

    As of now:

    1. I have lowered Google’s crawl rate and set it to the minimum.
    2. I have purged the Cloudflare cache.
    3. Should I purge the Autoptimize cache?
    4. Here’s a screenshot of the bots crawling my site:

    https://prnt.sc/nf7cjh

    As for Bing, would you recommend that I block it entirely? I don’t use Bing and don’t even have an account there. I also doubt that clients here in the Philippines use Bing.

    FastComet (my hosting) just recommended that I block ALL bots before they mark the issue as resolved, which, of course, I’m trying not to do. They already took down my site once for exceeding the script execution limit, but have given me until the end of the month to fix it.

    Plugin Author Sybre Waaijer

    (@cybr)

    Hello again!

    You’d want to clear the Autoptimize cache first, and then Cloudflare’s. Then try visiting your site from another browser (while logged out), so you can confirm whether the caching issue is resolved.

    The main offending bots are from Google, Bing, Facebook, and Ahrefs. All of these bots (except the last) are necessary for your site to perform well on social networks and search engines; are you sure you wish to stick with your current hosting provider if they can’t even handle these?

    Nevertheless, you may wish to set your Cloudflare caching level to “Standard/Aggressive”, which includes query strings. You’d also want to look into their static HTML caching, which should significantly reduce the load on your hosting.
    https://support.cloudflare.com/hc/en-us/articles/200172256-How-do-I-cache-static-HTML-

    Moreover, I recommend signing up for Bing’s Webmaster Tools if you wish to control their crawling speed. Bing’s data is used by many search engines, including DuckDuckGo and Yahoo! Search. So, no, don’t block Bingbot! πŸ™‚

  • The topic ‘Robots causing too many script executions’ is closed to new replies.