There are hacks that do each. I haven’t seen any that will do both. It shouldn’t take much to combine them if that is what you are asking (and might cut down on your database queries).
As for “bringing search engines to one’s archive files”, they shouldn’t have any problem following the links.
I have a top-posts thing that is somewhat similar — displays the first 3 top posts, then another 3 randomly from the top 30 or so. (Those numbers are all configurable…)
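The selection logic is simple enough to sketch. This is just an illustration in Python, not the actual hack (which would be PHP); the function and parameter names are mine, with the defaults from above:

```python
import random

def featured_posts(posts_by_rank, top_n=3, random_n=3, pool_size=30):
    """Take the first `top_n` posts as-is, then draw `random_n`
    more at random from the rest of the top `pool_size`."""
    head = posts_by_rank[:top_n]
    pool = posts_by_rank[top_n:pool_size]
    extras = random.sample(pool, min(random_n, len(pool)))
    return head + extras
```

All three numbers are parameters, so the mix of fixed and random picks stays configurable.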
I’d have to think about how to do this efficiently. How far randomly into the past do you want to look? 100 posts good enough?
-d
CHAITGEAR
“How far randomly into the past do you want to look? 100 posts good enough?”
Actually, more would be better – I have a photo log and it would be nice if some of the older pictures came up every now and again. Maybe one could define an “offset”, so that the number of posts considered stays restricted while the whole archive remains reachable?
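One way that offset idea could work – purely a sketch in Python, not the real hack, and the names are made up:

```python
import random

def random_archive_posts(all_posts, count=5, offset=100):
    """Skip the newest `offset` posts, then draw `count` at random
    from the rest of the archive, so older entries resurface."""
    older = all_posts[offset:]
    if len(older) < count:
        older = all_posts  # archive smaller than the offset: use everything
    return random.sample(older, min(count, len(older)))
```

With `offset=0` it behaves like the plain “last N posts” version, so one parameter covers both cases.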
“As for “bringing search engines to one’s archive files”, they shouldn’t have any problem following the links.”
I hope so. I converted my photolog from MT a while ago and at the same time added a photo tips and tricks section (on the same level, with even fewer incoming links).
So far, Google has fully indexed the photo tips and given them a PageRank of 5. It has only indexed the first page of the photo log, however (the other pages have not even been cached yet), and did not assign it a PageRank.
Of course, this may change with the next update…
I’m using redirects to produce (hopefully) spider-friendly addresses. In any case, it can take a while until search engines reach the “bottom layers” of a site – the more deep links there are, the easier it is.
The random post from the archives would essentially give search engines a new deep link each time they visit – and add variety for the humans of course!
Laurenz