Optimizing Laravel Backup for Large Websites: Best Practices for Handling Heavy Loads #1749
iwasherefirst2 asked this question in General (Unanswered)
Hello Spatie/Laravel Backup Community,
I'm reaching out for advice on optimizing backups for a large website using your package. My site is hosted on a VPS with 250 GB of free space, and the site itself occupies about 10 GB spread across a large number of files.
I've set up weekly backups to S3 using a cron job that archives everything on the server before transferring the archive. However, this process is extremely resource-intensive: it takes about an hour and noticeably slows the site down while it runs. The job is currently scheduled for 3:00 AM on Mondays, but because my audience is international, there is no quiet window in which that slowdown goes unnoticed.
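For context, my current schedule is roughly equivalent to the following (a simplified sketch of my setup; the `backup:clean` time is just an example):

```php
<?php

// app/Console/Kernel.php (simplified) — roughly what my weekly backup looks like.
// spatie/laravel-backup zips the application files and database dump on the VPS
// first, then uploads the archive to the S3 disk configured in config/backup.php.

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule): void
    {
        // Full backup (files + database) every Monday at 03:00.
        $schedule->command('backup:run')->weeklyOn(1, '03:00');

        // Prune old archives afterwards according to the retention policy.
        $schedule->command('backup:clean')->weeklyOn(1, '04:30');
    }
}
```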
Additionally, I frequently encounter backup failures with errors like: "The following parts had errors: - Part 27: Error executing 'UploadPart': AWS HTTP error: cURL error 56: OpenSSL SSL_read: Connection reset by peer, errno 104..."
This leads me to question my initial setup. Would it be more efficient to store my website's data directly on S3, thereby reducing the load on my VPS during backups? If so, should I consider using two separate S3 storage solutions from different providers for my live data and backups?
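To make that second question concrete, I'm picturing something like this in `config/filesystems.php` (a hypothetical sketch: the disk names, env keys, and the second provider's endpoint are placeholders, not my actual config):

```php
<?php

// config/filesystems.php (excerpt) — hypothetical split between live files
// and backups, each on a different S3-compatible provider.

return [
    'disks' => [
        // Live site files (uploads, media) stored directly on S3, so they
        // would no longer need to be archived from the VPS every week.
        'media' => [
            'driver' => 's3',
            'key'    => env('AWS_ACCESS_KEY_ID'),
            'secret' => env('AWS_SECRET_ACCESS_KEY'),
            'region' => env('AWS_DEFAULT_REGION'),
            'bucket' => env('AWS_MEDIA_BUCKET'),
        ],

        // A second, independent S3-compatible provider used only as the
        // backup destination disk referenced in config/backup.php.
        'backups' => [
            'driver'   => 's3',
            'key'      => env('BACKUP_ACCESS_KEY_ID'),
            'secret'   => env('BACKUP_SECRET_ACCESS_KEY'),
            'region'   => env('BACKUP_REGION'),
            'bucket'   => env('BACKUP_BUCKET'),
            'endpoint' => env('BACKUP_ENDPOINT'),
        ],
    ],
];
```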
I'm looking for insights or best practices on handling backups for websites with large or numerous files. Any guidance or suggestions on how to streamline this process without overburdening my server would be greatly appreciated.
Thank you in advance for your help!