[Plugin: BackWPup] Amazon S3 problem solution
Hello Daniel,
First of all thank you for your continued development of this plugin. It is perhaps the most comprehensive free backup plugin.
The problem:
———–
At some point in the past the integration with Amazon S3 broke (for me). I think the problem started with version 1.7.8 (S3 SDK v1.3.5), but I am not sure about the exact version. I can reproduce the problem in the current version (v2.1.4).
I traced the problem down: it occurs when the backup file is above a certain size (25 MB).
The symptom is that the job log says the following and remains stuck there:
“Uploading to Amazon S3”

The solution:
———–
My solution has been to increase the “chunk size” so that it is greater than the size of my backup file. I increased the chunk size from 25 MB to 60 MB.
Basically I prevent the Amazon uploader from trying to split the backup file into multiple chunks for upload. As long as I can force the upload to happen in one chunk, the upload is successful.

It would be awesome if you could find a proper fix for this. Pending a fix, it would be nice to have a setting for configuring the Amazon S3 chunk size.

These are the two lines I changed to make it work for me:
File: backwpup/job/dest_s3.php
< need_free_memory(26214400*1.1);
> need_free_memory(62914560*1.1);

< $result=$s3->create_mpu_object($STATIC['JOB']['awsBucket'], $STATIC['JOB']['awsdir'].$STATIC['backupfile'], array('fileUpload' => $STATIC['JOB']['backupdir'].$STATIC['backupfile'], 'acl' => AmazonS3::ACL_PRIVATE, 'storage' => $storage, 'partSize' => 26214400, 'curlopts' => $curlops));
> $result=$s3->create_mpu_object($STATIC['JOB']['awsBucket'], $STATIC['JOB']['awsdir'].$STATIC['backupfile'], array('fileUpload' => $STATIC['JOB']['backupdir'].$STATIC['backupfile'], 'acl' => AmazonS3::ACL_PRIVATE, 'storage' => $storage, 'partSize' => 62914560, 'curlopts' => $curlops));

Regards
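For anyone who wants the same workaround without a number hard-coded twice, here is a minimal sketch of how the chunk size could be made configurable instead. This is only an illustration, not part of the plugin: the constant name BACKWPUP_S3_PART_SIZE is my own invention, and need_free_memory() / create_mpu_object() are the existing plugin and SDK functions referenced in the diff above.

```php
<?php
// Sketch only: allow overriding the S3 multipart chunk size from wp-config.php.
// BACKWPUP_S3_PART_SIZE is a hypothetical constant, not a real plugin setting.
if ( ! defined( 'BACKWPUP_S3_PART_SIZE' ) ) {
	define( 'BACKWPUP_S3_PART_SIZE', 26214400 ); // default: 25 MB, as in dest_s3.php
}

$partsize = BACKWPUP_S3_PART_SIZE;

// In dest_s3.php, both hard-coded 26214400 values would then become:
//   need_free_memory( $partsize * 1.1 );
//   ... 'partSize' => $partsize, ...
```

With this in place, a site whose backup is larger than 25 MB could set the constant to e.g. 62914560 (60 MB) in wp-config.php to force a single-chunk upload, without editing the plugin file on every update.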