Fails to upload large files if password is supplied #14
Does the file appear in the app after that? You mentioned the file is encrypted, but is it encrypted on the server while not appearing in the app, or does it essentially work but just display a message? I'll look into the problem tomorrow, as it's almost 2 AM here.
Thanks, man. I've been doing some further testing. It turns out a file of 5125M works but a file of 5150M fails! If I upload an 8-gig file with no password, it puts up a dialog saying that it failed. Nothing was listed in the View Files screen, but the file is shown in the Admin Dashboard. Comparing the file in the file drop area against the source file shows no differences.
I tried reproducing the issue on both the newer version I'm working on and version 1.2.3, with a 10GB file and a 5GB file, and everything seems to be working fine. It might be something specific to your setup or files. If you can, please send me the app logs. If you're okay with it, you could also send me the files that are failing, or if they're public, just share the link where I can download them.
Here are the files I used. Note that they will be automatically deleted in 7 days.
Here's the log file.
Well, the log doesn't show anything unusual. I tried using your files, and it's still working fine for me. The last thing to check is the logs from the browser. Could it be that you don't have enough space on the disk to save the file? I haven't added a check for that yet, but I assume it would have triggered an error in the system logs.
Are you running a Docker container? I'm not sure what "the logs from the browser" would be; stdout/stderr has nothing in it. I have 6TB, so I have enough space, and as I said before, the file is uploaded completely. It's just that an error is reported. I ran it under Firefox, and when the error happened I looked at the debugger and saw:
Setting a breakpoint at the console.log(xhr.responseText) yields:
Looking at xhr.status I see a 504 - gateway timeout. Could this be because I'm doing this from my Synology using a reverse proxy?
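For what it's worth, a 504 from a reverse proxy usually means the proxy gave up waiting on the upstream response, not that the upload itself failed. A hedged sketch of the Nginx directives typically involved (the values and the `location` block are illustrative, not a known-good QuickDrop config):

```nginx
# Illustrative reverse-proxy settings for long-running uploads.
# Tune the values to your file sizes and connection speed.
location / {
    proxy_pass http://localhost:8487;    # QuickDrop port mentioned in this thread
    proxy_read_timeout 3600s;            # how long Nginx waits for the upstream response
    proxy_send_timeout 3600s;            # how long Nginx waits while sending the request body
    proxy_request_buffering off;         # stream the upload instead of spooling it to disk first
}
```

This would be consistent with the symptom above: the file lands on the server intact, but the browser sees a 504 because the proxy timed out before the app finished responding.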
It might be something with the reverse proxy setup, but other Synology users seem to have it working without any extra configuration. I'm not super familiar with the Synology ecosystem, so I'm not sure what's causing it in your case. I'll leave this open and tag it so someone else who knows more might be able to help. You could also try again after the 1.3 update comes out. Most of the code has been refactored, so if it's an issue with the app, there's a good chance it's fixed, and the new version also has a lot more logging to help with debugging stuff like this. If you're up for it, you can pull the dev branch and test it right now. Just a heads-up, though: it's not quite ready yet, and there are a few known bugs, including a problem with passwords (if you set the password "123", it saves it as "123,123").
It turns out that using QuickDrop without the reverse proxy works (i.e. http://localhost:8487). Could you point me to some of those Synology users that have it working? I suspect they are not using a reverse proxy at all. I came to QuickDrop because I had issues with NextCloud when uploading large files. Perhaps it's the same misconfiguration of the reverse proxy that is causing the problem with NextCloud too. I would love to learn what I'm doing wrong with the reverse proxy config. |
I can't really point you to anyone specific, but I've talked with some people on Reddit when I made posts about the app. Regarding the reverse proxy, I looked into it, and there might be a file size limit. If that's the issue, it's not easy to change, but it can be done. Sorry for the late reply; I've been sick for the last week.
I found a client_max_body_size setting. I would think that using a reverse proxy would be a common way for people to use QuickDrop, and if it cannot upload large files reliably through one, then I'd consider that a real defect. Perhaps some debugging statements around where the error is happening would help. As I said before, the file does get uploaded; it's just that the error is reported.
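For anyone following along: client_max_body_size is the Nginx directive that caps the request body, and it defaults to 1 MB (exceeding it normally yields a 413, not a 504, but it is still worth ruling out for multi-gigabyte uploads). A minimal, illustrative setting:

```nginx
# Illustrative: raise the request-body cap in the server or
# location block that fronts QuickDrop; 0 disables the check.
client_max_body_size 10g;
```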
I am using Nginx on my instance as well, so if it's that, it must be a configuration issue. Try the new 1.3.0 version, which uses chunked uploads. Instead of a single large upload, it breaks the file into 10 MB chunks. Maybe this will solve the problem. |
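The chunked-upload scheme described above can be sketched in browser-side JavaScript. This is a hypothetical illustration, not QuickDrop's actual code: the function names, form fields, and endpoint are all assumptions; only the 10 MB chunk size comes from the comment above.

```javascript
// Sketch of a chunked upload: split a file into fixed-size pieces
// and send each one separately. Names and endpoint are illustrative.

const CHUNK_SIZE = 10 * 1024 * 1024; // 10 MB, as described for 1.3.0

// Compute [start, end) byte ranges covering a file of `size` bytes.
function chunkRanges(size, chunkSize = CHUNK_SIZE) {
  const ranges = [];
  for (let start = 0; start < size; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, size)]);
  }
  return ranges;
}

// In the browser, each range maps to file.slice(start, end); the server
// reassembles the pieces once the last chunk arrives (endpoint is a guess).
async function uploadInChunks(file, url) {
  const ranges = chunkRanges(file.size);
  for (let i = 0; i < ranges.length; i++) {
    const [start, end] = ranges[i];
    const form = new FormData();
    form.append("chunk", file.slice(start, end));
    form.append("index", String(i));
    form.append("total", String(ranges.length));
    const res = await fetch(url, { method: "POST", body: form });
    if (!res.ok) throw new Error(`Chunk ${i} failed: ${res.status}`);
  }
}

console.log(chunkRanges(25 * 1024 * 1024).length); // 3 chunks for a 25 MB file
```

One upside of this design for the reverse-proxy problem above: each request is small and short-lived, so per-request body-size limits and response timeouts are far less likely to trip.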
Just tried it. Got "Chunk upload failed. Please try again". Here's the tail end of the log file:
I notice that nothing is put into files until the end. Also, the file that's deposited is complete:
BTW, it might be good to display the version number somewhere in the Web UI. |
It appears at the end because it uses the tmp folder for the chunks. There should have been a few more lines of logs for the upload; it's strange that they are missing. Look in the console of the browser, it should have logged the error. And yeah, maybe I should put the version somewhere visible.
This is what should have been there in the logs as well |
Not sure if this is helpful but:
To reproduce:
QuickDrop running in a Docker container on Synology