Without W3 Total Cache installed, the average load time of my web site's home page is 4.4 sec per Pingdom. With W3 installed, it averages 5.4 sec. The difference in speed is also noticeable when browsing the web site with and without the plugin.
After taking steps to improve site speed to 4.4 seconds, I was looking to W3 to be the icing on the cake, since it would minify, compress, and cache.
All of the diagnostics appear to show the plugin is working, and my Google Page Speed score improved from 92 to 95. (However, WebPageTest showed no difference, with a score of 89.) Both of these tests give me a very poor score for caching.
I’ve tried a number of settings, including the defaults, but to no avail. I also tested another plugin (Quick Cache), which did show an improvement in speed but not in scores, since it doesn’t have built-in minify and compress like W3.
I’ve tried it on both a site that I use as a CMS and a site that is set up as a simple blog.
So, what else should I try doing to resolve the issue?
If it can’t be resolved, should I stick with W3 for the better SEO scores from minify and compress, or should I uninstall W3 for a faster site?
The website is http://TourGuideTim.com
Did you turn on minification? As I recall it defaults to manual, which means you have to add in the files yourself, or switch to auto.
Seeing a similar effect. Deactivated W3TC in light of this morning’s warning from WordPress.org about the security problem. Page Speed score dropped from 93 to 91%. YSlow dropped from 80 to 78%. (This site generally scores lower because of an inefficiently coded theme, which for corporate/commercial reasons we’re prepared to live with.) Yet actual page loading has remained in the 2.7 – 3.5 sec. range. Second loading is typically below 2 sec, and on average, everything “feels” a tad snappier and more responsive without W3TC.
W3TC’s self-hosted CDN has consistently added 20% to loading times, in addition to dropping some media library images. And minify has never worked properly in auto or manual modes. We continue using the wp-minify plugin — which does work and is relatively easy to set up.
Not complaining. The developer has obviously put a lot of effort into W3TC, and it probably improves performance on many sites. But not necessarily every site. We too have put in quite a few hours fiddling and tweaking, with little or no pay-off. Bit of an eye opener.
Thanks for the replies.
Auto minify of HTML and CSS seems to work fine for me. When I look at the page source, I can see it has been minified. (BTW, I did try WP-Minify before installing W3TC, but it rendered pages without any styling. I didn’t take the time to figure out why, since I knew minify was also a feature in W3TC. I’ve read a number of people who said they preferred to activate WP-Minify and deactivate that feature in W3.)
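For anyone who wants a quicker sanity check than eyeballing the page source, one rough heuristic is to measure how much of the HTML is spent on indentation and blank lines. This is just a sketch of my own; the 8% threshold is an arbitrary assumption, not anything W3TC documents:

```python
def looks_minified(html: str, max_whitespace_ratio: float = 0.08) -> bool:
    """Rough heuristic: minified HTML has very little leading
    indentation and few blank lines relative to its total size."""
    if not html:
        return False
    lines = html.splitlines()
    # Characters spent on leading indentation, plus one per blank line.
    wasted = sum(len(line) - len(line.lstrip()) for line in lines)
    wasted += sum(1 for line in lines if not line.strip())
    return wasted / len(html) <= max_whitespace_ratio

pretty = "<html>\n  <body>\n    <p>hi</p>\n  </body>\n</html>\n"
mini = "<html><body><p>hi</p></body></html>"
print(looks_minified(pretty), looks_minified(mini))  # → False True
```

You could paste your own page source into it; it won’t catch everything (comments, long variable names), but it separates “pretty-printed” from “minified” output well enough for a spot check.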
I’m not using a CDN so that wouldn’t be the reason for my slow down either.
Ditto the observations above. I completely removed the plugin, all .htaccess remnants, and anything in MySQL that suggested a relationship to it. I also enabled plain old gzip in the WP directory, and my blog is now faster than ever. Hosting is on GoDaddy, if that helps.
lheintzman – Very true 🙂 I find it works best on VPS or dedicated servers and not very well on Shared Hosts (which is what most people have).
Thanks documaker. I may try the same as I too am on GoDaddy.
However, in an attempt to make sure I’m using W3TC to the fullest, I’m still tinkering around.
Based on Pingdom tests, I’m now running < 10% slower with the plugin installed after changing minify to manual from automatic on the general tab and then working to move most js files on the minify tab out of the header.
It’s still a bummer to put in all this work with W3TC and have it run slower than with the site un-minified, un-compressed, and un-cached. However, with the tinkering, the difference in speed when browsing the site with and without the plugin is no longer noticeable. I’m holding out hope that future plugin upgrades will help improve speed.
Based on various SEO tests, there are two areas with extremely low scores. If anyone has suggestions, I’d appreciate it.
1. Leverage Browser Caching of Static Assets: Despite using the default expiration of 1 year for media, JS, and CSS in W3TC, my fail reports list all of those items, which are hosted on my site, as having no max-age or Expires header.
2. Use HTML Compression: I have enabled compression in W3TC, but most tests say my pages do not appear to be compressed. SeoSiteCheckUp provided a more detailed explanation, saying that I had ‘page compression’ but not ‘page-size compression’. I haven’t figured out the difference, but I can say that in the Pingdom tests the HTML load size is big (indicating to me that it is not compressed) and causes by far the biggest delay in loading the page.
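Since the online checkers disagree with each other, it can help to inspect the response headers yourself. Here is a small sketch of a header audit; the header names are standard HTTP, but the pass/fail logic is my own guess at what these SEO tools look for:

```python
def audit_headers(headers: dict) -> dict:
    """Report whether a response advertises gzip-style compression
    and browser caching, based on standard HTTP headers."""
    # Normalize header names, since servers vary the capitalization.
    h = {k.lower(): v for k, v in headers.items()}
    compressed = h.get("content-encoding", "").lower() in ("gzip", "deflate", "br")
    cached = "max-age" in h.get("cache-control", "").lower() or "expires" in h
    return {"compressed": compressed, "cached": cached}

# Example: headers as a browser might see them from a tuned server.
sample = {
    "Content-Encoding": "gzip",
    "Cache-Control": "public, max-age=31536000",
}
print(audit_headers(sample))  # → {'compressed': True, 'cached': True}
```

To feed it live data, `urllib.request.urlopen` on your page (with an `Accept-Encoding: gzip` request header set) will hand you the real response headers via `.headers.items()`.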
Leverage Browser Caching of Static Assets
THAT may be an issue with your server, and not W3TC. Mine was sending back weird results for a while and in the end, I had to manually force it in my .htaccess AFTER making some serious server tweaks.
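For anyone wanting to force it manually the same way, a generic mod_expires block along these lines is the usual approach. To be clear, this is a standard Apache snippet, not the exact rules W3TC writes, and it assumes your host has mod_expires available:

```apache
# Requires mod_expires; the IfModule wrapper avoids a 500 error
# on hosts where the module is not loaded.
<IfModule mod_expires.c>
  ExpiresActive On
  # Far-future expiry for static assets, matching a 1-year setting.
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType text/css   "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```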
Use HTML Compression
It looks compressed to me on spot checks, but … “page-size compression” is gzip, and again, your server may not be set up right.
Thanks for the insight Ipstenu. I guess the key is to reach the point where I can have a dedicated server.
I’ll leave this thread open a little longer to see if there are any additional suggestions for those of us on shared servers.
Hi Documaker. I was giving another stab at searching for some answers to help make W3TC work for me when I came across your new post on the step-by-step process you took to remove W3TC and create your own gzip files. Thank You for taking the time to share the details!
For those that come across this post, you can see his steps and videos here: http://www.justoutsourcing.com/wp/2011/06/w3-total-cache-might-slow-down-wordpress/
Just to let everyone know, I’m giving WP Super Cache a try. Godaddy suddenly decided my gzip code was ‘incompatible.’ >:-[ Still a smoother ride without Total Cache, tho! 🙂
I’m testing out that concept (i.e. doin’ it manually) on my own servers right now cause … what’s the fun in life if you don’t experiment. 😀 Sitespeed remains in the high 80s/low 90s, and http://www.whatsmyip.org/http_compression/ reports my gzip is on.
Now to minify some more 😉
FWIW, I tossed this in my .htaccess to handle gzip as well as expires etc.
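(The snippet itself didn’t survive the thread, but a typical combined block for this job looks something like the following. It’s a generic example assuming Apache with mod_deflate and mod_expires enabled, not necessarily what was actually used:)

```apache
# Gzip text-based responses; images are already compressed.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
# Blanket far-future caching; override per-type if needed.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 1 month"
</IfModule>
```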
I can understand your frustration with a slow loading site and having problems with a plugin designed to make it faster. It looks like your W3 Total Cache settings are correct for your hosting environment as I’m getting a 96 score on Google Page Speed. With most shared hosting it’s best to try and optimize the enhanced disk cache and minify settings to get the best results. I wrote an article on how to set up W3 Total Cache on shared hosting that lays out most of the best practices.
GoDaddy is pretty well known for slow WordPress sites. Their slow disk speeds, shared networked database servers, and overcrowded servers are why they are so slow. See this thread for some history. I also ran a reverse IP check on your domain name, and it revealed that you share your IP address with 1,700 other sites, which probably means there are around three times that many sites hosted on the same server.
It’s kind of hard to expect any decent performance on such crowded servers.
Thank you for the info and links c3mdigital! I’ve implemented your recommendations except for the CDN. There is a new feature since your posting on the Browser Cache page called “Prevent caching of objects after settings change”. I’m wondering if I should leave that unchecked.
Are there any shared hosting providers that you can recommend?
I assume a Page Speed score of 96 is pretty good. However, the Labs section of Google Webmaster Tools says my site is slower than 65% of sites because it takes 4 sec to load my page. There seems to be a disconnect. Is it true that 65% of sites load in less than 4 sec? I thought I was doing pretty well, considering my Page Speed score.
For others looking to speed up their site, I suggest running tools.pingdom.com to see the order and time it takes to load each element on your page. For example, I found my header background and logo were a combined size of 360kb. So I figured out how to combine these into one image of 68kb, which has been a big help.
It’s natural to try wringing out the last ounce of performance, and there are a number of online services to help. For sure, they can point out weaknesses; they’re most useful when general patterns emerge or the advice is consistent from one to another. But they can also make you nuts.
Optimization is far from an empirical science. These are machine-driven tools. And sometimes, just a wee bit stupid. It’s not unusual to get conflicting, contradictory reports, based on theoretical “best practices” — which may or may not be appropriate for your content and audience — and can vary with the time of day, amount of traffic, how much or which area of your site was scanned, etc. etc. Even repeating the same test back-to-back, within seconds, can produce different results. You need to take it all with a grain of salt, remembering that standards are not always well-defined, or even publicly disclosed. If you use those same tools to evaluate high-profile, well-respected sites, you might be amazed at how poorly some of them score, yet they are unquestionably successful at attracting an audience.
SEO practices are a case in point. Few if any people outside Google know exactly how the rating algorithm works, only that it seems to be evolving. Google has said recently that loading speed is one criterion — possibly one of hundreds — and perhaps not the most important. In other words, you could spend significant hours shaving micro-seconds off your load times, only to find it has little impact on where your site is listed following a search with Google, Bing, Yahoo, etc. The prevailing wisdom is that content is still king. The goal is still to create a good experience for readers. And ultimately, humans can evaluate that experience a lot better than machines.
You’ll have to take Google’s word that 4 sec. is slower than 65% of sites. But don’t overlook the warning that the results depend on how many data points a robot happens to see during its visit. On our site, the Google Lab number varies from less than 1 sec to more than 12 — without any programming changes. (We’re a news organization, so text and images are always turning over, while format, page size, bandwidth demands, etc. tend to remain pretty constant.)
You might also pay attention to what others have said about differences in the hosting environment. Apache-Linux setups are not all created equal. Some either facilitate or prevent various enhancement techniques, and a good host is crucial to overall performance. In fact, picking the host might be the single most important decision you make. Not to slam GoDaddy — though I certainly wouldn’t be alone — but to get the results you have with this supplier, you are doing very well. Or you’ve been unusually lucky.
FWIW, took a look at tourguidetim.com. We’re a couple thousand miles from your server — important for a travel promoter — and your site looks fine. Clean, attractive layout, easy navigation, readable copy, and what seems like interesting, useful information. Don’t know how it stacks up as a marketing exercise or where it lands in the search engines, but if the material is on-point and can stand out against competition, you’re probably doing fairly well. When it comes to fine-tuning the code, I wouldn’t be surprised if you’re reaching a point of diminishing returns.
- The topic ‘Web sites about 20% slower with W3 Total Cache’ is closed to new replies.