The site is 4 years old. Recently Google started giving warnings about crawl errors. I checked the logs: all bots (Google, Yahoo, Bing, and the rest) get a 503 return code.
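In case it helps anyone reproduce the log check, here is a minimal Python sketch, assuming an Apache "combined" access log; the log path and the bot-name pattern are placeholders to adjust for your setup:

```python
# Scan the access log for bot requests and tally their status codes.
# Assumptions: Apache "combined" log format; log path and bot names are placeholders.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # placeholder path
BOT_PATTERN = re.compile(r"googlebot|bingbot|slurp|yandex", re.IGNORECASE)
# Combined format: ... "METHOD /path HTTP/x.x" STATUS SIZE "REFERER" "USER-AGENT"
LINE_PATTERN = re.compile(r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$')

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_PATTERN.search(line)
        if match and BOT_PATTERN.search(match.group("ua")):
            counts[(match.group("status"), match.group("path"))] += 1

# Print the most frequent (status, path) pairs for bot requests;
# a site-wide block shows up as 503 on almost every path.
for (status, path), n in counts.most_common(20):
    print(f"{status}  {n:5d}  {path}")
```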
The robots.txt and .htaccess haven’t been changed for a long time.
The site is fully operational, can be reached at any time, and the daily number of visitors remains the same.
Using online tools like "View a Web Page as Googlebot," I see the same thing: "server does not respond, HTTP return code: 503".
There must be some script blocking bots, but I can't find it.
No new plugins were installed recently, and disabling all plugins doesn't help; the 503 for bots doesn't go away.
Interestingly, when I put the site URL into a spider spoofer, it reads the main page fine, but any other page or directory on the site returns 503, which is exactly what I see in the access logs.
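The same comparison can be scripted instead of relying on an online spoofer. A minimal sketch, with placeholder URLs standing in for my homepage and an inner page, and example user-agent strings:

```python
# Fetch the homepage and an inner page with a browser-like UA and a
# Googlebot-like UA, and compare status codes.
import urllib.error
import urllib.request

# Placeholder URLs; swap in the real homepage and a real inner page.
PAGES = ["https://example.com/", "https://example.com/some-inner-page/"]

# Example user-agent strings: one browser-like, one Googlebot-like.
AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for url in PAGES:
    for name, ua in AGENTS.items():
        req = urllib.request.Request(url, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as err:
            status = err.code  # a 503 raises HTTPError rather than returning normally
        print(f"{name:10s} {url} -> {status}")
```

If the pattern matches the logs, the homepage should come back 200 for both user agents while the inner page comes back 503 for the bot one.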
Thanks in advance.