I should have added that the best way to cache a large WP site is probably using Varnish.
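To make that concrete, here is a minimal Varnish VCL sketch for putting Varnish in front of a WordPress backend. The backend host/port and the cookie patterns are assumptions for illustration; a production config needs more rules (purging, static assets, etc.):

```
# Minimal Varnish sketch (VCL 4.0) -- assumes WordPress is
# listening on 127.0.0.1:8080 behind Varnish.
vcl 4.0;

backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Never cache the admin area or logged-in users.
    if (req.url ~ "wp-(admin|login)" ||
        req.http.Cookie ~ "wordpress_logged_in") {
        return (pass);
    }
    # Strip cookies on everything else so pages are cacheable.
    unset req.http.Cookie;
}
```

The key idea is that WordPress sets cookies liberally, and any cookie normally makes a response uncacheable, so you pass logged-in traffic through and strip cookies from everyone else.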
Still, caching and recaching your pages takes time and is database-intensive. The common practice among webmasters is to make changes to the site, clear the cache, and wait for the cache to rebuild over time. In reality, the cache will never rebuild fully before the next time you make changes and clear it again. So, all too often, Google ends up doing your recaching for you: the site is slow for Google while seeming fast to you, because you mostly browse pages that are already cached. Serving Google an uncached page is no good. Google will record the page speed as too slow and you'll never get any SEO traction.
If you need to build the cache before Google hits your pages, always crawl your site remotely before Google and Bing do. You can do this with a free crawler tool such as Integrity Link Crawler for Mac or Link Sleuth for PC.
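If you'd rather script the warm-up crawl than use a desktop tool, a minimal sketch is to walk your sitemap and request every page once so each hit populates the cache. The sitemap URL below is a placeholder; substitute your own site:

```python
# Sketch: warm a site's page cache by requesting every URL listed
# in its sitemap.xml before search engine crawlers arrive.
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace (see sitemaps.org protocol).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Extract the page URLs (<loc> elements) from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

def warm_cache(sitemap_url):
    """Fetch every page in the sitemap so the cache gets built."""
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = urls_from_sitemap(resp.read())
    for url in urls:
        urllib.request.urlopen(url).read()  # each hit populates the cache

# Example (placeholder domain):
# warm_cache("https://example.com/sitemap.xml")
```

Run this on a schedule (e.g. from cron right after you clear the cache) so the cache is rebuilt on your timetable rather than Google's.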
If your site is still slow, you'll need to re-engineer it: fewer hits to the database, smaller pages, more efficient pages, and so on. If you really want to build a directory with millions of pages, you're going to need help far beyond what caching can provide. Running a large directory is very expensive; we spend over $100,000 per month on development. There are no shortcuts to large directories.