Viewing 7 replies - 1 through 7 (of 7 total)
  • Thread Starter Ryan Hellyer

    (@ryanhellyer)

    Deleted.

    I came up with a dumb solution that would never work.

    Thread Starter Ryan Hellyer

    (@ryanhellyer)

    I’m wondering now if I need to find some way to push my files to Amazon S3 first. That way I can pull them from a separate location that can’t be interfered with by the domain mapping.

    Thread Starter Ryan Hellyer

    (@ryanhellyer)

    I may be at risk of looking like an idiot for talking to myself, but in case someone else finds this and is stuck on the same issue …

    I had a eureka moment overnight. The simplest solution I can think of is to rewrite the URLs on the pages to point to the blog.dir folder. The CDN would then pull everything from that one folder (perhaps with a rewrite to hide the ugly “wp-content/blog.dir/” path). That should theoretically solve the problem.

    I also considered shunting files to Amazon S3 via s3cmd, but I was concerned about what would happen if CloudFront cached a file before it found its way to S3, so I haven’t implemented that. There would presumably be a window between a file landing on the live server and s3cmd syncing it up to S3, and if someone visited a page during that window, it could cause image load failures.
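    For anyone attempting the same thing, a rough nginx sketch of that blog.dir rewrite idea might look like the following. The path, URL prefix, and hard-coded site ID are all assumptions, not from this thread; a real multisite install would need to resolve the site ID per mapped domain, which is the hard part.

```nginx
# Hypothetical sketch: serve uploads from the blog.dir folder under a
# clean URL, hiding the "wp-content/blog.dir/" path from the public.
# "1" is an example site ID; each mapped domain would need its own.
location ~ ^/uploads/(.+)$ {
    alias /var/www/wp-content/blog.dir/1/files/$1;
}
```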

    Moderator Ipstenu (Mika Epstein)

    (@ipstenu)

    🏳️‍🌈 Advisor and Activist

    I’m fairly sure most people are setting up a CloudFront distribution per site, actually :/ Same reason caching gets set up per site. It’s one of those things that has to be customized for each separate site.

    Thread Starter Ryan Hellyer

    (@ryanhellyer)

    No need to do it per site if I can do it across all of them in a seamless fashion, though.

    I’m still having issues trying to do it with the blog.dir folder directly, as I unfortunately suck at rewrites (particularly in NGINX).

    Thread Starter Ryan Hellyer

    (@ryanhellyer)

    I seem to have it sorted now. I’m syncing with Amazon S3 via the s3cmd command-line utility. I run a cron job once per minute to do the syncing, but I should also be able to force a sync every time a file is uploaded, via an action hook.
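    A minimal sketch of that per-minute sync. The paths, bucket name, and flags here are assumptions for illustration, not the exact commands from this setup:

```shell
#!/bin/sh
# Hypothetical sketch: push the shared multisite uploads folder to S3
# with public-read ACLs, removing files deleted locally.
UPLOADS=/var/www/wp-content/blog.dir
BUCKET=s3://example-bucket

# The command the cron job runs once per minute.
SYNC="s3cmd sync --acl-public --delete-removed $UPLOADS/ $BUCKET/"

# Example crontab entry (crontab -e): run the sync every minute.
echo "* * * * * $SYNC"
```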

    I then point a CloudFront distribution at the S3 bucket, and point a domain at that CloudFront distribution.

    I then wrote a custom plugin to rewrite the URLs on each page, so that everything points at the new CDN URLs.

    The only thing missing is a redirect from the /files/ URLs to the fancy new URLs. I haven’t figured out how to do that yet, but it shouldn’t be too difficult; just a matter of working out how rewrites work in NGINX.
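    That redirect could be sketched in nginx roughly as follows. The CDN hostname is a placeholder, not the real one from this setup:

```nginx
# Hypothetical sketch: redirect legacy /files/ URLs to the
# CloudFront-backed CDN domain.
location ~ ^/files/(.+)$ {
    return 301 http://cdn.example.com/$1;
}
```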

    The final result can be seen at http://ryanhellyer.net/. I’m manually syncing up the wp-includes and wp-admin folders now, so there are a couple of smileys and whatnot missing, but it’s otherwise fully functional. I also appear to have no browser caching, presumably due to a mistake in how I set up S3.
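    One possible cause of the missing browser caching: S3 only serves a Cache-Control header if one was attached at upload time. s3cmd can do this with its --add-header option. A sketch, with the bucket, path, and max-age values as assumptions:

```shell
#!/bin/sh
# Hypothetical fix: attach a Cache-Control header on upload so that
# S3 serves it back and CloudFront/browsers cache the files.
UPLOADS=/var/www/wp-content/blog.dir
BUCKET=s3://example-bucket

# One week of browser caching; tune max-age to taste.
CMD="s3cmd sync --acl-public --add-header='Cache-Control: max-age=604800' $UPLOADS/ $BUCKET/"
echo "$CMD"
```

Note that headers only apply to objects uploaded after the change, so existing objects would need to be re-uploaded (or modified) to pick it up.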

    For what it’s worth, the next release of W3TC supports network-wide policies for CDN. It’s worth checking out and has been in beta for some time. Reach out if you’re interested.

  • The topic ‘Domain mapping and pull CDN's’ is closed to new replies.