Hi guys, we maintain a news website that is about 2-3GB in total file size. I was hoping that, with the new update, Duplicator would be able to handle the backup, since it now uses the chunking method.
Around the 1GB mark the service times out and breaks the build. I was wondering if anyone else has had success backing up large websites.
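For context, my guess is that it's hitting PHP's limits rather than anything in the plugin itself. These are the php.ini directives I'd suspect are involved (just a sketch of common values, not my host's confirmed settings; yours may differ):

    ; php.ini limits that could kill a long-running backup build
    ; (common shared-hosting values, shown here as an assumption)
    max_execution_time = 30    ; seconds before PHP stops a running script
    memory_limit = 128M        ; per-request memory ceiling
    post_max_size = 64M        ; largest POST body PHP will accept
    upload_max_filesize = 64M  ; largest single file upload

If raising these is the usual fix for a site this size, I'd love to hear what values worked for you.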
I love this plugin and find it's the best on the market, and I have tried them all!
Let me know!