Since getting Super Cache going (0.6.6), I am seeing a lot of 403s in the logs for what look like pretty legitimate requests. Many of these are the result of search engine queries, e.g. from Google (also Googlebot crawls). This is easy enough to reproduce:
$ curl -i -L http://www.stjames.com/
HTTP/1.1 403 Forbidden
Date: Tue, 05 Aug 2008 12:51:23 GMT
Server: Apache
Vary: Accept-Encoding
Content-Length: 269
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
You don't have permission to access /wp-content/cache/supercache/www.stjames.com//index.html on this server.
</body></html>
The referenced file is readable and writable by Apache. Explicitly requesting any page by name works fine, however; e.g. appending 'index.php' to the URL gets a 200.
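For what it's worth, a 403 on a world-readable file can still come from a parent directory missing the execute (traverse) bit, so it may be worth checking mode bits along the whole path, not just on the file. A sketch (the docroot and Apache user below are assumptions; adjust for your install):

```shell
# Print the mode bits for every component of the cache path; each
# parent directory needs x for the Apache user to traverse it.
# (docroot /var/www/html and user "apache" are assumptions)
sudo -u apache namei -m /var/www/html/wp-content/cache/supercache/www.stjames.com/index.html
```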
The exact same request from Firefox looks fine:
18.104.22.168 - - [05/Aug/2008:08:55:56 -0400] "GET / HTTP/1.1" 200 3293
22.214.171.124 - - [05/Aug/2008:08:55:56 -0400] "GET / HTTP/1.1" 200 3293 "-" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:126.96.36.199) Gecko/20061201 Firefox/188.8.131.52 (Ubuntu-feisty)"
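Since the Firefox request succeeds where plain curl fails, one way to narrow this down is to replay the curl request with the browser's User-Agent (string copied from the access-log line above) and see whether the 403 tracks the User-Agent header:

```shell
# Same request as before, but sending Firefox's User-Agent string.
# If this returns 200 while plain curl returns 403, whatever is
# rejecting the request is keying on the User-Agent.
curl -i -L -A "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:126.96.36.199) Gecko/20061201 Firefox/188.8.131.52 (Ubuntu-feisty)" \
    http://www.stjames.com/
```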
I have removed all the rejected UA strings and removed index.php from the rejected filenames, but neither has helped. Ideas?