That went well. Let me rephrase, I guess.
Using a cache, how many pages per second should I expect from a web host? Should a readfile() call on a shared web host take less than half a second?
This probably won’t help, but I had a similar situation only with different software. Kept getting a warning email from my host about resources (over 600 members with 200 on the site at any given time). I finally got tired of it and purchased my own server. Did a cPanel to cPanel move over to my server, and after that I never once saw my resources skyrocket from that software. My host had kept saying it was the software, but it wasn’t… it was their server.
I can’t help but feel that in your situation it’s the same thing. Granted I have no proof… it’s just a feeling. Sorry I couldn’t help you though.
I received a notice from Dreamhost yesterday too! I use WP-Cache as well and don’t see how this all of a sudden became a problem on their server. Traffic across all my sites has increased less than 10% in the last month, so how did my site suddenly jump way over their limit? Doesn’t make sense to me. Let me know what you find out, ColdForged.
Eeeek. And I’m just about to set up shop at DH!
Sounds like the server’s disk isn’t keeping up. Could be other sites also hitting the disk a lot, one site that’s pounding the disk while otherwise using few CPU cycles, etc.
Also note that the typical benchmarking times are raw timestamp start-to-finish, thus not necessarily representative of actual CPU usage (CPU shouldn’t be 100% committed to you during a 1s period!).
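The gap between elapsed time and actual CPU time is easy to demonstrate. A minimal sketch (Python rather than PHP, purely for illustration; the half-second is an arbitrary stand-in for time spent blocked on disk or other tenants):

```python
# Wall-clock vs CPU time: sleeping consumes elapsed time but almost no CPU,
# so a raw start-to-finish timestamp overstates the CPU actually used.
import time

wall_start = time.time()
cpu_start = time.process_time()
time.sleep(0.5)  # stands in for waiting on disk, other sites, etc.
wall = time.time() - wall_start
cpu = time.process_time() - cpu_start

print(f"elapsed: {wall:.2f}s, CPU actually used: {cpu:.4f}s")
```

The elapsed figure comes out near half a second while the CPU figure is close to zero, which is the point above: a 1s benchmark window doesn’t mean 1s of CPU was committed to you.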
-d
It’s great they give us all the bandwidth and increase the allowance each month. But what good is it if we can’t use it because CPU usage is too high?
I’m using a little over 1/3 of my bandwidth, yet they tell me I used 60 minutes of CPU yesterday, which is above their 30-40 minute limit.
I asked DreamHost if this problem just crept up, and they said it started on the 30th. I’ve disabled the new caching in the beta version to see if that solves the problem.
Have you found anything, ColdForged?
Not really. They asked for a minimal example, so I gave them one (http://www.coldforged.org/cachetest/cachetest.php) that simply performs readfile() on one of the files from my cache, bracketed with PEAR::Benchmark calls. Still takes at least half a second. We’ll see what they say.
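For anyone who wants to try something similar on their own host, here’s a rough analogue of that test (Python rather than PHP/PEAR::Benchmark; the file path and size are made up, not ColdForged’s actual cache file):

```python
# Time a single sequential read of a cached-page-sized file, roughly what
# PHP's readfile() does. On a healthy disk this should take milliseconds,
# nowhere near half a second.
import os
import time

path = "/tmp/cachetest.html"           # hypothetical stand-in for a cache file
with open(path, "wb") as f:
    f.write(b"x" * 100_000)            # ~100 KB, about a cached page

start = time.perf_counter()
with open(path, "rb") as f:
    data = f.read()                    # the readfile() equivalent
elapsed = time.perf_counter() - start

print(f"read {len(data)} bytes in {elapsed * 1000:.2f} ms")
os.remove(path)
```

If even a raw read like this takes hundreds of milliseconds, the bottleneck is the server’s I/O, not the caching plugin.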
After turning off caching I didn’t see any changes. DH’s “resources” output is useless. All it tells me is that php is doing all the work.
I’ve been “escalated to Level 2”.
You’d think that given the majority of sites should be non-dynamic, their file-access should be highly optimized. Sounds like the file cache isn’t big enough, the machine doesn’t have enough memory, and/or the drives aren’t well optimized for raw read speed. Dunno.
Still watching this thread, do let us know as things progress…
I’m still hovering around 55-60 minutes a day. I’ve asked them if they could tell me which files spiked in CPU usage starting on the 30th. I’m pretty sure I did an SVN upgrade to 6 blogs at some point that day.
Overselling does take its toll. Time to find a VPS and/or dedicated server provider.
Or, a shared hosting provider that doesn’t oversell its servers.
Well, they wrote me back and told me something was missing from their instructions to get the log to show actual file names. They want me to add
#!/usr/local/bin/php
to every php script on my sites. I just had to chuckle when I read the email. I emailed them back and said, “You’re kidding right? I have hundreds of php scripts on each of my sites.”
That’s insane. First, in theory it should only be needed in the ‘root’ PHP files (like index.php, wp-rss2.php, etc.) that start the PHP process, so the number is more limited. But it’s still silly, because second: why can’t they update the global PHP binary to include whatever they need for logging?? Obviously, if they’re overselling and running into performance issues, any tiny logging overhead would be offset by the INSTANT ability to see the ACTUAL php files that are the issue. HOWEVER, if only the root php files are tracked, it won’t tell much (I don’t know whether that’s the case — I know nothing of the internals of the PHP processor!).
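If anyone does go down this road, the line presumably only needs to be prepended to the entry-point scripts. A sketch of the idea (Python just for illustration; the directory is a made-up example, and the filenames are the ‘root’ scripts mentioned above):

```python
# Hypothetical: prepend the host's interpreter line to entry-point PHP
# scripts only, skipping files that already have one. The site directory
# and stub contents are invented for this sketch.
import os

SHEBANG = "#!/usr/local/bin/php\n"
root_scripts = ["index.php", "wp-rss2.php"]   # entry points, not includes
site_dir = "/tmp/example_site"

os.makedirs(site_dir, exist_ok=True)
for name in root_scripts:                      # create stand-in files
    with open(os.path.join(site_dir, name), "w") as f:
        f.write("<?php echo 'hello'; ?>\n")

for name in root_scripts:
    path = os.path.join(site_dir, name)
    with open(path) as f:
        body = f.read()
    if not body.startswith("#!"):              # don't double-prepend
        with open(path, "w") as f:
            f.write(SHEBANG + body)
```

Even so, doing this across hundreds of scripts by hand is as absurd as the email made it sound.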
Amazing we haven’t seen a DH guy here yet… last time a DH issue came up, they were pretty quick to jump in, given their ‘anointed’ status as a WP host.
-d