• Using WordPress to create a research library, we have a lot of large uploads which we would like to have reside remotely in an Amazon CloudFront directory.

    The W3 Total Cache plugin can be set up to mirror the local directory structure, but we want remote storage as an alternative (to relieve space on the local server) and to have the uploads reside in the root of the remote directory (e.g., docs.domain.org); they are currently all in the root of the wp-content/uploads folder, not in separate month folders.

    Another plugin, “Amazon s3 simple upload form”, works nicely but is separate from WordPress’s upload form in the post-editing window.

    OUR WISH: Add Amazon Cloudfront login info (and desired rights for uploaded files, i.e., World Read) to WordPress’s Media Settings for Uploading.

Viewing 10 replies - 1 through 10 (of 10 total)
  • Moderator Ipstenu (Mika Epstein)

    (@ipstenu)

    πŸ³οΈβ€πŸŒˆ Advisor and Activist

    This will probably never be included in core for myriad reasons. But there are a lot of plugins: http://wordpress.org/extend/plugins/tags/cloudfront

    Thread Starter ericr23

    (@ericr23)

    Thanks — I didn’t find the Tan Tan Amazon S3 plugin when I searched before. It almost does exactly what we want. It seamlessly uploads media in a post to an S3 directory instead of the local uploads folder.

    It does not, however, link to the Cloudfront access to that directory.

    And it reproduces the entire local path (blog/wp-content/uploads) rather than just putting the files in the root of the specified S3 directory.

    The plugin is only at version 0.4, though.

    Thread Starter ericr23

    (@ericr23)

    In other words:

    Instead of S3bucket/blog/wp-content/uploads/file, we would like the location to be S3bucket/file.

    And further, we’d like the link to be blg.domain.org/file, which is the CNAME for the Cloudfront for the S3bucket.

    Moderator Ipstenu (Mika Epstein)

    (@ipstenu)

    πŸ³οΈβ€πŸŒˆ Advisor and Activist

    That’s in your media settings.

    http://www.techtipsgeek.com/host-images-wordpress-blog-subdomain-better-speed/6897/

    Basically, change “Store uploads in this folder” to /file/ and then “Full URL path to files” to http://blg.domain.org/file

    You may need “Store uploads in this folder” to be S3bucket/file/ or even /public_html/S3bucket/file/, but that’s all possible.
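    Under the hood, those two Media Settings fields are stored as the WordPress options upload_path and upload_url_path, so the same change can be made programmatically. A minimal sketch (the option names are WordPress’s own; the values are illustrative, taken from the example above):

```php
<?php
// Sketch only: set the two options behind Settings → Media.
// "Store uploads in this folder" (relative to ABSPATH, or absolute):
update_option( 'upload_path', 'file' );
// "Full URL path to files" — the URL visitors will see:
update_option( 'upload_url_path', 'http://blg.domain.org/file' );
```

    Note that this only changes where WordPress writes files on its own filesystem and which URL it prints; it does not by itself transfer anything to S3, which is the hitch raised below.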

    Thread Starter ericr23

    (@ericr23)

    The hitch there is, I think, that you would need to make the remote target directory world-writable.

    Another hitch is that, I believe, uploading to Amazon S3 requires its own protocol (an authenticated REST API) rather than an ordinary filesystem write.
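    To illustrate that hitch (this is generic S3 Signature Version 2 signing, not code from any plugin mentioned here, and all credential values are placeholders): an S3 upload is an HTTP PUT whose Authorization header is an HMAC computed from the request details and your secret key, which is why WordPress’s plain “write to this folder” setting cannot reach a bucket on its own.

```php
<?php
// Sketch: the headers an S3 REST PUT needs (Signature v2 era).
$accessKey = 'AKIA_PLACEHOLDER';   // hypothetical credentials
$secretKey = 'SECRET_PLACEHOLDER';
$bucket    = 'doc.domain.org';
$file      = 'file.jpg';
$type      = 'image/jpeg';
$date      = gmdate( 'D, d M Y H:i:s \G\M\T' );

// String-to-sign for a PUT with a public-read ACL ("World Read"):
// VERB \n Content-MD5 \n Content-Type \n Date \n amz-headers + resource
$stringToSign = "PUT\n\n$type\n$date\nx-amz-acl:public-read\n/$bucket/$file";
$signature    = base64_encode( hash_hmac( 'sha1', $stringToSign, $secretKey, true ) );

$headers = array(
    "Date: $date",
    "Content-Type: $type",
    "x-amz-acl: public-read",
    "Authorization: AWS $accessKey:$signature",
);
// These headers would accompany a PUT to
// http://doc.domain.org.s3.amazonaws.com/file.jpg (e.g., via cURL).
```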

    Moderator Ipstenu (Mika Epstein)

    (@ipstenu)

    πŸ³οΈβ€πŸŒˆ Advisor and Activist

    You may be wandering into custom plugin territory :/

    Thread Starter ericr23

    (@ericr23)

    I altered one line in the Tan Tan Amazon S3 plugin file “class-plugin.php”:

    $prefix = substr($parts['path'], 1) .'/';

    to

    $prefix = '';

    This nullifies the URL path fetched via wp_upload_dir(); i.e., “blogdir/wp-content/uploads” becomes “”.

    I named a bucket doc.domain.org, per the instructions for virtual hosting, and added “doc.domain.org IN CNAME doc.domain.org.s3.amazonaws.com.” to the DNS records. And I set that bucket up in the plugin’s settings with virtual hosting checked.

    The resulting upload in a post is at “doc.domain.org/file.jpg”.

    Then to use the Cloudfront distribution, I added “docs IN CNAME oxoxox.cloudfront.net” to the DNS. Having done that, the original link doesn’t work, but all I have to do is change “doc” to “docs”, which in time I shall automate, maybe by further tinkering in this plugin.
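    One way to automate that “doc” → “docs” swap without further edits to the plugin’s internals would be an ordinary WordPress filter. A hypothetical sketch (the hostnames are the ones from this thread; the function name is made up):

```php
<?php
// Hypothetical sketch: rewrite attachment URLs from the S3 bucket
// host (doc.domain.org) to the CloudFront CNAME (docs.domain.org).
function my_cloudfront_url( $text ) {
    return str_replace( '//doc.domain.org/', '//docs.domain.org/', $text );
}
// Covers URLs WordPress generates for attachments:
add_filter( 'wp_get_attachment_url', 'my_cloudfront_url' );
// Covers links already pasted into post bodies, rewritten on output:
add_filter( 'the_content', 'my_cloudfront_url' );
```

    This keeps the bucket-host link as the stored value, so the CloudFront CNAME can be changed later in one place.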

    Moderator Ipstenu (Mika Epstein)

    (@ipstenu)

    πŸ³οΈβ€πŸŒˆ Advisor and Activist

    For your own safety, I suggest changing the version number (or name) on the plugin so that if it’s updated, you don’t accidentally upgrade and shoot yourself in the foot. Basically, fork the plugin πŸ™‚

    No, I’ve never done that.

    Thread Starter ericr23

    (@ericr23)

    Thanks — I always update manually from my hard drive where customized files are clearly indicated. (I’m also saved from anyone inadvertently automatically upgrading, because access is only by SFTP and private key.)

    Thread Starter ericr23

    (@ericr23)

    I named a bucket doc.domain.org, per the instructions for virtual hosting, and added “doc.domain.org IN CNAME doc.domain.org.s3.amazonaws.com.” to the DNS records. And I set that bucket up in the plugin’s settings with virtual hosting checked.

    The resulting upload in a post is at “doc.domain.org/file.jpg”.

    Then to use the Cloudfront distribution, I added “docs IN CNAME oxoxox.cloudfront.net” to the DNS. Having done that, the original link doesn’t work, but all I have to do is change “doc” to “docs”, which in time I shall automate, maybe by further tinkering in this plugin.

    Corrections:

    Add “doc IN CNAME doc.domain.org.s3.amazonaws.com.” to the DNS records. Then the link works; changing “doc” to “docs” is required only to use the CloudFront distribution.
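    Taken together with the correction, the zone entries described in this thread would look roughly like the following (the CloudFront hostname is the redacted placeholder from above; only the second record is needed once links point at the distribution):

```
doc   IN CNAME doc.domain.org.s3.amazonaws.com.
docs  IN CNAME oxoxox.cloudfront.net.
```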


The topic ‘Cloudfront or other secure remote location for uploads’ is closed to new replies.