[BUG] Duplicati docker container taking 13GB of space... #70
Comments
Thanks for opening your first issue here! Be sure to follow the relevant issue templates, or risk having this issue marked as invalid.
You've likely configured a backup job to store files within the container that aren't getting removed.
Thanks for the mega fast reply. You can see it has indeed backed everything up to the cloud (all 500+GB), so I don't get why it would be storing files inside the container (except for the DB). This is the only job I have; it should not be backing up anything locally at all. Everything should be sent to the cloud, and that /source/ folder is another share outside of the container. I just noticed that I need to change the backup retention, but that should be on the cloud host side, not on the local side. Can you tell me where/how it's storing things inside the container? It only has a max of 13GB of space (out of my 25GB total limit).
Temporary files, maybe, as Duplicati encrypts / zips the files before they get sent.
Any way to remove these automatically so this doesn't happen?
You would need to check the Duplicati documentation. It's only a guess, mind; you'd need to look through the config to make sure there's no rogue location where it's dumping files inside the container.
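For anyone hitting the same thing, a quick way to narrow this down is to look at the container's writable layer directly. This is a generic Docker sketch rather than anything Duplicati-specific, and the container name duplicati is only an assumption (yours may differ):

```sh
# Size of each container's writable layer (the part that counts against the image limit)
docker ps --size

# Files added or changed inside the container relative to the base image
docker diff duplicati

# Disk usage of the likely culprits inside the container
docker exec duplicati sh -c 'du -sh /tmp /config 2>/dev/null'
```

Whichever path shows up as multiple gigabytes in `docker diff` / `du` is where the rogue files are landing.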
Because you probably set the chunk size to that.
I didn't set any "chunk" size anywhere. The file storage size is meant to be 50MB on the cloud end. As for the rest of the settings, you saw them all; everything else is just set to defaults. I still don't understand why it would need to create temporary 1.9GB files inside the container for anything. Other than the DB it shouldn't be storing anything in the container, and it even has use of a /tmp/ directory outside of the container for all the encryption etc. Anyway, thanks for helping me narrow down the issue.
Ok, it seems this happens when enabling the enable-http-buffering option. It's supposed to potentially speed up performance, but for whatever reason it starts making these 1.9GB files in the /tmp location inside the image. It's disabled by default though, so I am just not going to use it. No problems so far with it off: I have 3 jobs running to 2 different clouds and the largest file in the /tmp location is 4.8MB.
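If you do want to keep that option enabled, one workaround is to bind-mount the container's /tmp to a host path so that any buffering/staging files land on the array instead of in the image's writable layer. A rough sketch only; the host paths and image tag are illustrative, and an Unraid template would express the same thing through its volume mappings:

```sh
docker run -d \
  --name=duplicati \
  -p 8200:8200 \
  -e PUID=1000 -e PGID=1000 -e TZ=Etc/UTC \
  -v /mnt/user/appdata/duplicati:/config \
  -v /mnt/user/backups:/source \
  -v /mnt/user/tmp/duplicati:/tmp \
  lscr.io/linuxserver/duplicati:latest
# With /tmp bind-mounted to the host, large temporary files no longer grow the container itself.
```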
Is there an existing issue for this?
Current Behavior
Duplicati version: 2.0.7.1_beta_2023-05-25
I have looked for reasons for this but cannot find any.
After installing Duplicati and running a single backup to OneDrive Cloud (about 576GB in size), my container grew to 13GB in size.
I have a flat limit of 25GB for my docker images. Even with a bunch of other services, nothing else takes even 10% as much space as Duplicati does.
Checking the container settings via Unraid, it should not be saving anything "into" the image that increases the size of the container. The dir for temp files is on my storage array (not inside the /appdata folder).
Due to the lack of free space, other containers I have are now failing to load.
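As a way to double-check that the temp directory really is mapped out of the image, the container's mounts can be listed directly. This is a generic Docker check; the container name duplicati is assumed:

```sh
# List every host path mounted into the container; anything Duplicati writes
# outside these destinations ends up in the image's writable layer.
docker inspect -f '{{ range .Mounts }}{{ .Source }} -> {{ .Destination }}{{ "\n" }}{{ end }}' duplicati
```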
Expected Behavior
The Duplicati container should be no more than a few hundred megabytes in size.
Steps To Reproduce
Environment
CPU architecture
x86-64
Docker creation
Container logs