It's natural to try wringing out the last ounce of performance, and there are a number of online services to help. For sure, they can point out weaknesses; they're most useful when general patterns emerge or the advice is consistent from one to another. But they can also make you nuts.
Optimization is far from an exact science. These are machine-driven tools, and sometimes just a wee bit stupid. It's not unusual to get conflicting, contradictory reports based on theoretical "best practices" -- which may or may not be appropriate for your content and audience -- and the results can vary with the time of day, the amount of traffic, and how much or which area of your site was scanned. Even repeating the same test back-to-back, within seconds, can produce different results. Take it all with a grain of salt, remembering that the standards are not always well-defined, or even publicly disclosed. If you run those same tools against high-profile, well-respected sites, you might be amazed at how poorly some of them score -- yet they are unquestionably successful at attracting an audience.
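If you want to see that spread for yourself, it's easy enough to measure from your own machine. Here's a minimal sketch -- just a rough probe, and it only times the raw HTML fetch, not a full page render with images and scripts. The URL is a placeholder you'd swap for your own page. Run it a few times in a row and watch the numbers bounce around even when nothing on the site has changed:

    # Rough timing probe: fetch the same page several times and compare.
    # The URL is a placeholder -- point it at your own site.
    import time
    import urllib.request

    URL = "https://www.example.com/"

    for i in range(5):
        start = time.perf_counter()
        with urllib.request.urlopen(URL) as resp:
            body = resp.read()  # force the full transfer
        elapsed = time.perf_counter() - start
        print(f"run {i + 1}: {elapsed:.3f} s for {len(body):,} bytes")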
SEO practices are a case in point. Few if any people outside Google know exactly how the ranking algorithm works, only that it seems to be constantly evolving. Google has said recently that loading speed is one criterion -- possibly one of hundreds -- and perhaps not the most important one. In other words, you could spend long hours shaving microseconds off your load times, only to find it has little impact on where your site lands in a search on Google, Bing, Yahoo, etc. The prevailing wisdom is that content is still king. The goal is still to create a good experience for readers, and ultimately, humans can evaluate that experience a lot better than machines.
You'll have to take Google's word that 4 sec. is slower than 65% of sites. But don't overlook the warning that the results depend on how many data points a robot happens to see during its visit. On our site, the Google Lab number varies from less than 1 second to more than 12 -- with no programming changes whatsoever. (We're a news organization, so text and images are always turning over, while format, page size, bandwidth demands, etc. tend to remain pretty constant.)
You might also pay attention to what others have said about differences in the hosting environment. Apache-Linux setups are not all created equal. Some either facilitate or prevent various enhancement techniques, and a good host is crucial to overall performance. In fact, picking the host might be the single most important decision you make. Not to slam GoDaddy -- though I certainly wouldn't be alone -- but to get the results you have with this supplier, you are doing very well. Or you've been unusually lucky.
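One quick sanity check on the hosting side: look at what the server actually sends back. The sketch below (again, the URL is a placeholder) asks for a page with compression enabled and prints a few response headers -- enough to tell whether the host's setup is honoring things like mod_deflate compression and expiry headers at all:

    # Quick header check: is the host actually serving compressed,
    # cacheable responses? The URL is a placeholder.
    import urllib.request

    URL = "https://www.example.com/"

    req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        for name in ("Content-Encoding", "Cache-Control", "Expires", "Server"):
            print(f"{name}: {resp.headers.get(name, '(not set)')}")

If Content-Encoding comes back unset, the host isn't compressing your pages no matter what the optimization tools tell you to do.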
FWIW, took a look at tourguidetim.com. We're a couple thousand miles from your server -- important for a travel promoter -- and your site looks fine. Clean, attractive layout, easy navigation, readable copy, and what seems like interesting, useful information. Don't know how it stacks up as a marketing exercise or where it lands in the search engines, but if the material is on-point and can stand out against competition, you're probably doing fairly well. When it comes to fine-tuning the code, I wouldn't be surprised if you're reaching a point of diminishing returns.