WordPress.org Forums

WP Super Cache
Googlebot can't index after caching enabled (5 posts)

  1. lzr0
    Member
    Posted 2 years ago #

    Hi,
    After I enabled caching with the most recommended settings (use PHP, cache rebuild, no CDN support, preload mode 60 min), I found my newly uploaded page could not be crawled by Google. Webmaster Tools showed a "Redirect error" for that page under crawl errors. I waited a week and the page still was not indexed. Then I tried to run "Fetch as Googlebot" and it returned an error. I noticed that in the supercache folder the pages appeared as folders rather than HTML files (I wonder if that is normal for WP Super Cache, since WP adds a trailing slash to URLs)?
    Anyway, I disabled caching and right away re-ran "Fetch as Googlebot" successfully. The next day I saw that the page was in the Google index. All the while, the page was always accessible to regular visitors, which sounds weird!
    I am using WP 3.3.1 with the Suffusion theme, hosted on GoDaddy (Linux).
    I wonder if anyone has any ideas. Thanks.

    http://wordpress.org/extend/plugins/wp-super-cache/

  2. Donncha O Caoimh
    Member
    Plugin Author

    Posted 2 years ago #

    That redirect error is really weird. I have read of other users who had odd redirect errors, but only when using mod_rewrite mode, and it's very rare.

    Supercache cached files are created in the way you describe - folders with index.html and index.html.gz in them.
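    To illustrate that layout, here is a rough Python sketch of how a permalink maps onto a supercache file. The cache root and the helper name are hypothetical; only the folder-with-index.html structure is from the plugin:

```python
import os

# WP Super Cache stores each cached page as a directory containing
# index.html (and a gzipped twin, index.html.gz).
# Hypothetical cache root -- substitute your own site's path.
CACHE_ROOT = "wp-content/cache/supercache/example.com"

def cached_file_for(permalink):
    """Map a permalink like /my-new-page/ to its supercache file (illustrative helper)."""
    return os.path.join(CACHE_ROOT, permalink.strip("/"), "index.html")

print(cached_file_for("/my-new-page/"))
# wp-content/cache/supercache/example.com/my-new-page/index.html
```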

    Have you tried accessing your blog as Googlebot? You can change your user agent in Firefox and Chrome, or even use telnet to spoof it. Use Google to find out how.
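    As a minimal sketch in Python, you can spoof the user agent like this (the URL is a placeholder and `build_googlebot_request` is just an illustrative helper; the UA string is the one Google publishes for its desktop crawler):

```python
import urllib.request

# Placeholder URL -- replace with your own page.
URL = "http://example.com/my-new-page/"

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_googlebot_request(url):
    """Build a request that identifies itself as Googlebot (illustrative helper)."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = build_googlebot_request(URL)
# urllib normalises header names, so look the header up as "User-agent".
print(req.get_header("User-agent"))
```

    To actually perform the fetch, pass the request to `urllib.request.urlopen(req)` and see whether the response redirects or errors where a normal browser fetch does not.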

  3. Joe Siegler
    Member
    Posted 2 years ago #

    I'd use a plugin that generates a Google sitemap and submit it through the Google Webmasters program. That way everything is indexed, and you could probably block Googlebot via robots.txt and save your server the stress. :)

  4. Donncha O Caoimh
    Member
    Plugin Author

    Posted 2 years ago #

    OOOH no! Don't block Googlebot, it still needs to visit the pages even with a sitemap!
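    If in doubt, you can check what a robots.txt actually blocks with Python's standard `urllib.robotparser`; this sketch feeds it an example file directly, so no network is involved:

```python
from urllib.robotparser import RobotFileParser

# Would this robots.txt let Googlebot crawl a page?
# parse() accepts the file's lines directly.
rp = RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /",          # this is exactly the mistake to avoid
])

print(rp.can_fetch("Googlebot", "http://example.com/my-new-page/"))
# False -- Googlebot is shut out, so nothing gets indexed
```

    With `Disallow:` left empty (or the rule removed), `can_fetch` returns True and the sitemap can do its job.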

  5. lzr0
    Member
    Posted 2 years ago #

    Yes, you can't block Googlebot.
    After I let it index the site, I re-enabled the plugin. I'll see if there are new error reports. Maybe the redirect error was caused by the server; I've learned GoDaddy does this sometimes. When I try to access the site as Googlebot from third-party tools, it is accessed normally.

Topic Closed

This topic has been closed to new replies.
