Increase max upload filesize #5

Open
alexgleason opened this issue Feb 5, 2018 · 2 comments
Comments

@alexgleason (Contributor)

So I'm making some dummy files like so:

fallocate -l 100M dummy

-l means "length", which you can specify in human-readable terms like 10M, 5G, etc.

So far I tried uploading:

  • 10MB - pass
  • 20MB - pass
  • 100MB - fail

I'm not sure how large the files the TIER team needs to upload will be. I also need to investigate where the limit is coming from, i.e. Heroku, Django, gunicorn, or AWS.

@alexgleason (Contributor, Author)

Okay, I figured out why this is happening. Heroku has no max upload size, but it does have a 30-second timeout on all requests: if a request does not complete within 30 seconds, Heroku terminates it.
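
That also fits the test results above. As a rough back-of-the-envelope check (illustrative only, not tied to any particular connection), here's the sustained upstream bandwidth each file size would need to finish inside the 30-second window:

    # Minimum sustained upload speed needed to finish inside
    # Heroku's 30-second request window (rough, illustrative numbers).
    WINDOW_SECONDS = 30

    for size_mb in (10, 20, 100):
        mb_per_s = size_mb / WINDOW_SECONDS
        mbit_per_s = mb_per_s * 8
        print(f"{size_mb:>4} MB -> needs >= {mb_per_s:.1f} MB/s (~{mbit_per_s:.0f} Mbit/s) upstream")

100 MB works out to roughly 27 Mbit/s of sustained upstream bandwidth, which is more than many connections can hold for half a minute, so the 10 MB and 20 MB uploads passing while the 100 MB one fails is consistent with the timeout.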

@alexgleason (Contributor, Author)

There are 3 options for fixing this issue. They all require a lot of work:

  1. Migrate the site away from Heroku to a self-hosted platform.
  2. Modify Wagtail's code so that it uploads the file directly to storage (e.g. S3) rather than passing it through the web request. This would have to be approved by the Wagtail team, tested, and merged.
  3. Use a third-party desktop app to upload the large files directly to S3, and write a script to force Wagtail to update the documents section periodically. I think this is the worst option because it requires writing custom code anyway (we might as well do option 2 in that case) and it's not elegant at all.

Option 1 would work. I think Heroku is restrictive and moving away from it would open the path for limitless possibilities. It might also be cheaper. It would require someone monitoring it, but I think I could charge you to monitor it and it would still be cheaper than Heroku. There'd be less of a safety net though; if I got hit by a bus you'd need to find a new sysadmin.

Option 2 is appealing because it would theoretically fix this problem once and for all for Wagtail users on Heroku. However, it would take me longer to do, and I'm waiting to hear back from the Wagtail team about whether Heroku-specific code is something they want to add.
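
To make the direct-upload idea concrete, the server-side piece would roughly amount to handing the browser a presigned S3 POST so the file goes straight to the bucket and never has to squeeze through Heroku's 30-second window. This is only a sketch of the boto3 side (the bucket name, key layout, and size limit are placeholders); the hard part is the Wagtail integration, which is what would need upstream approval:

    import uuid
    import boto3

    def make_direct_upload_params(filename, bucket="tier-documents"):
        """Return the URL and form fields a browser can use to POST a file straight to S3."""
        key = f"documents/{uuid.uuid4()}/{filename}"
        s3 = boto3.client("s3")
        return s3.generate_presigned_post(
            Bucket=bucket,
            Key=key,
            Conditions=[["content-length-range", 0, 500 * 1024 * 1024]],  # allow up to 500 MB
            ExpiresIn=3600,  # presigned form stays valid for an hour
        )

The browser POSTs the file to the returned URL with the returned fields, and the only request that hits the app afterwards is a small one recording the new key, which finishes well inside the timeout.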

Option 3 is... possible. But after thinking it through, I think it's the worst option.
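
For completeness, the "script to force Wagtail to update the documents section" in option 3 could be as small as a management command that lists the bucket and registers any keys Wagtail doesn't know about yet. A sketch only: it assumes django-storages already points Wagtail's document storage at the same bucket, the bucket name and prefix are placeholders, the import path may differ between Wagtail versions, and pagination of the listing is omitted:

    import os
    import boto3
    from django.core.management.base import BaseCommand
    from wagtail.documents.models import Document  # path may differ by Wagtail version

    class Command(BaseCommand):
        help = "Register S3 objects under documents/ as Wagtail documents"

        def handle(self, *args, **options):
            s3 = boto3.client("s3")
            resp = s3.list_objects_v2(Bucket="tier-documents", Prefix="documents/")
            for obj in resp.get("Contents", []):
                key = obj["Key"]
                # FileField accepts an existing storage key, so nothing is re-uploaded here.
                doc, created = Document.objects.get_or_create(
                    file=key,
                    defaults={"title": os.path.basename(key)},
                )
                if created:
                    self.stdout.write(f"Registered {key}")

Running that on a schedule would keep the documents section in sync, but the actual uploading still happens in a separate desktop client, which is part of what makes this option feel inelegant.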


Of course, we could forgo all of this and upload the big files to Dropbox, then put links to them in our pages. It's the safest workaround, but it still comes with problems. Which fields are "big file" fields? Can we make that call about every field? What if there's an exception to the rule?
