PageSpeed Insights
-
Hello, there is an error with the plugin when you block all the bots. The “SEO” score in Google PageSpeed Insights automatically goes down, citing this directive source: “/robots.txt:57:0”.
The page I need help with: [log in to see the link]
-
Hi @metajohn
It seems the robots.txt isn’t directly blocking Google’s indexing bots. I haven’t been able to replicate the issue on other websites, but it could be related to the way the bot list is being output, which might affect it.
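To illustrate what “the way the list is being output” could mean: a plugin like this typically renders one robots.txt group per blocked bot. The sketch below is purely hypothetical (it is not the plugin’s actual code; `AI_BOTS` and `render_block` are made-up names), but it shows the well-formed shape such output should have — one `User-agent` line followed by its own `Disallow` line per group.

```python
# Hypothetical sketch (NOT the plugin's real code): render a bot block list
# as one robots.txt group per user agent. Malformed grouping (e.g. many
# User-agent lines sharing a single Disallow) is one way generated output
# could trip up PageSpeed Insights' robots.txt audit.
AI_BOTS = ["GPTBot", "CCbot", "ClaudeBot", "PerplexityBot"]

def render_block(bots):
    """Return a robots.txt fragment with one User-agent/Disallow group per bot."""
    groups = [f"User-agent: {bot}\nDisallow: /" for bot in bots]
    return "\n\n".join(groups)

print(render_block(AI_BOTS))
```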
I’ve pushed an update to address this. Could you please let me know if it works for you now?
Thanks,
Daan
-
Hi Daan. Thank you for your answer. Now the error has changed.
Please check https://spain2100.com/ in https://pagespeed.web.dev/analysis/https-spain2100-com/i44sz7yvov?form_factor=mobile
Now it’s:
Blocking Directive Source: /robots.txt:6:0
It’s directly related to this plugin, because before installation the SEO score was 100. If you can fix it, just answer me and I’ll do the steps you need 😉
Great job anyway!
-
Hi @metajohn
Thanks for the update! I noticed that the issue was likely due to missing Allow rules in the robots.txt. I’ll be pushing an update soon that should fix this.
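The effect of explicit Allow rules can be seen with a small local check. This is a minimal sketch using Python’s standard `urllib.robotparser` (the rules and URL below are example values, not the site’s actual file): with an explicit `Allow: /` group, Googlebot’s access is stated rather than merely implied, while the AI-bot groups still disallow everything.

```python
# Minimal sketch of the "missing Allow rules" point, using Python's
# standard urllib.robotparser. Example rules only.
import urllib.robotparser

rules = """\
User-agent: Googlebot
Allow: /

User-agent: GPTBot
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/"))  # True
print(rp.can_fetch("GPTBot", "https://example.com/"))     # False
```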
Could you please try again after the update and let me know if everything works as expected? If there are still any issues, don’t hesitate to reach out!
Thanks for your patience, and I appreciate the feedback!
Best regards,
Daan
-
Sure, when you launch a new update I will be right here.
Hi @metajohn, that’s a pity. However, I might have solved the problem. I had included DuckDuckBot in the list, as it was unclear whether it gathers data for AI training or not. Your test doesn’t mention a specific indexing bot being blocked, but on another website this was the case (strange how https://pagespeed.web.dev/ handles this).
I also made some other changes that might improve the plugin.
Anyway, to avoid releasing too many versions before there is a solution, I uploaded the plugin here: https://codesurf.ie/ai-scrape-protect.zip. Please note: I didn’t change the version number yet.
Could you test and let me know?
Thanks, Daan
PS: I’m on the road for the rest of the day, so I might respond tomorrow.
-
Hello Daan. New update: the SEO score is still at 69 after I installed your zip.
Blocking Directive Source
I have been using MANUS AI PLUS to try to help you… This is what I get:
# Optimized robots.txt version for Spain2100.com
# This version resolves the "Blocking Directive Source" error while maintaining anti-scraping protection

# Allow access to legitimate Google tools
User-agent: Googlebot
Allow: /

User-agent: Googlebot-Image
Allow: /

User-agent: Googlebot-News
Allow: /

User-agent: Google-PageSpeed
Allow: /

User-agent: Lighthouse
Allow: /

User-agent: Google-Site-Verification
Allow: /

# General configuration for other bots
User-agent: *
# No general restrictions - allows normal indexing

# START AI Scrape Protect block - Optimized version
# ---------------------------
# Block only the most problematic AI bots
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCbot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: BardBot
Disallow: /

User-agent: OAI-SearchBot
Disallow: /

User-agent: OpenAIContentCrawler
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: Diffbot
Disallow: /

User-agent: Meta-ExternalAgent
Disallow: /

User-agent: Meta-ExternalFetcher
Disallow: /

User-agent: Grok
Disallow: /

User-agent: GrokAI
Disallow: /

User-agent: XAI
Disallow: /

User-agent: XBot
Disallow: /

# ---------------------------
# END AI Scrape Protect block

# Sitemaps
Sitemap: https://spain2100.com/sitemap.xml
Sitemap: https://spain2100.com/sitemap.html

Also this is an analytic document about your plugin with the SEO PROBLEM:
https://manus.im/share/file/e8cd678c-73fc-4266-b2ce-84368de9d43c
-
Nice, you used AI to tackle the problem 😉
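As a side note, a robots.txt along these lines can be sanity-checked locally before deploying it. This sketch uses Python’s standard `urllib.robotparser` with an abbreviated version of the rules and a placeholder URL: per-bot groups win over the catch-all `*` group, so the AI bots are blocked while Googlebot and unlisted crawlers stay allowed.

```python
# Quick local sanity check of an abbreviated version of the robots.txt
# above, using Python's standard urllib.robotparser. URL is a placeholder.
import urllib.robotparser

robots_txt = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow:

User-agent: GPTBot
Disallow: /

User-agent: CCbot
Disallow: /

User-agent: PerplexityBot
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Collect the bots that are denied access under these rules.
blocked = [bot for bot in ("Googlebot", "GPTBot", "CCbot", "PerplexityBot")
           if not rp.can_fetch(bot, "https://example.com/page")]
print(blocked)  # ['GPTBot', 'CCbot', 'PerplexityBot']
```

Note the empty `Disallow:` under `User-agent: *`: it means "no restrictions", so a crawler not named in any group (Bingbot, for instance) remains free to fetch everything.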
Thanks for your effort. I’ll push an update soon.
-
100% Fixed.
-
That’s great!
The topic ‘PageSpeed Insights’ is closed to new replies.