Duplicator - WordPress Migration Plugin » Large website backup issue question

  • [Resolved] Alex


    Hi guys, we maintain a news website that is about 2-3GB in size. I was hoping that with the new update Duplicator would be able to handle the backup, since it now uses a chunking method.

    Around the 1GB mark the service times out and breaks the build. I was wondering if anyone else has had any success backing up large websites.

    I love this plugin and find that it’s the best on the market that I have tried, and I have tried them all!

    Let me know!

  • Plugin Author Cory Lamle


    Hey Atjuch,

    The 0.5 series doesn't support chunking, unfortunately. There are still several large hurdles that will have to be ironed out before chunking is supported. However, there are two new options for users on shared hosts who don't have much control over their server environment. These new options will not work on hosts that kill the process no matter what, but they will help cut down on build times for large databases and keep the connection alive on servers that need a periodic buffer of data.

    1. On the settings page, if your host allows mysqldump, you can build your database using this option, which is very useful for large databases. On a 100MB database, for example, mysqldump gives roughly 100x the performance of the PHP-based build, and the larger your database the bigger the gain you will see from this setting. (See the first sketch after this list.)
    2. The second option is the 'Archive Flush'. This performs periodic buffer dumps for servers that don't have a hard timeout limit but do require a periodic flush of data in order to keep the connection alive and running. (See the second sketch after this list.)
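
    For anyone curious what the mysqldump route looks like under the hood, here is a minimal PHP sketch of shelling out to mysqldump using the database constants from wp-config.php. This is just an illustration of the technique, not Duplicator's actual code; the output path and the fallback behavior are made-up examples.

        <?php
        // Build the database portion of a package by shelling out to mysqldump.
        // DB_HOST, DB_USER, DB_PASSWORD, and DB_NAME come from wp-config.php;
        // the output path is a hypothetical example.
        $out = '/tmp/database.sql';

        $cmd = sprintf(
            'mysqldump --host=%s --user=%s --password=%s %s > %s 2>&1',
            escapeshellarg(DB_HOST),
            escapeshellarg(DB_USER),
            escapeshellarg(DB_PASSWORD),
            escapeshellarg(DB_NAME),
            escapeshellarg($out)
        );

        exec($cmd, $output, $status);
        if ($status !== 0) {
            // Fall back to the slower PHP-based export if mysqldump is unavailable.
            error_log('mysqldump failed: ' . implode("\n", $output));
        }

    And the 'Archive Flush' idea boils down to trickling a little output to the client on a schedule, so the connection never looks idle. Again, a rough sketch of the general pattern rather than the plugin's internals; the archive path and flush interval are arbitrary.

        <?php
        // Periodically flush a byte of output while archiving, so hosts or
        // proxies that drop quiet connections keep this one alive.
        $zip = new ZipArchive();
        $zip->open('/tmp/package.zip', ZipArchive::CREATE);

        $count = 0;
        $iter  = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator(ABSPATH, FilesystemIterator::SKIP_DOTS)
        );
        foreach ($iter as $file) {
            $zip->addFile($file->getPathname());
            if (++$count % 100 === 0) {
                echo '.';      // send a tiny bit of data to the client
                @ob_flush();   // flush PHP's output buffer if one is active
                flush();       // push the data through to the web server
            }
        }
        $zip->close();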

    Also remember that simply excluding the larger files (i.e. large zip files, movie files, etc.) will allow your package to complete in most cases on a decent hosting provider. You can then manually move the larger files over as needed.
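
    A rough PHP sketch of what that kind of exclusion filter looks like; the extension list, size cap, and uploads path here are arbitrary examples, not the plugin's actual settings:

        <?php
        // Decide whether a file should be left out of the package. The
        // extension list and the 50MB cap are arbitrary examples.
        function should_exclude($path) {
            $skipExt = array('zip', 'mov', 'mp4', 'avi');
            $maxSize = 50 * 1024 * 1024; // 50MB

            $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
            return in_array($ext, $skipExt, true) || filesize($path) > $maxSize;
        }

        // Example: note the excluded files while walking the uploads folder,
        // keeping a list to copy over manually after the migration.
        $deferred = array();
        foreach (glob('/var/www/html/wp-content/uploads/*') as $path) {
            if (is_file($path) && should_exclude($path)) {
                $deferred[] = $path; // move these by FTP/SSH later
            }
        }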


  • The topic ‘Large website backup issue question’ is closed to new replies.