Currently, all processed files (especially images) are regenerated on every request.

That's fine when only a couple of images need to be rendered on a page, but it becomes a problem when a page requires 20-30 images to be generated at once: they are returned one after another, even on a powerful VPS.

In my opinion, Exfile should have a caching ability, something like:
```elixir
config :exfile, Exfile,
  cache: %{
    size: "10GB",
    path: "/tmp/exfile-other" # default is `/tmp/exfile-cache-#{env}`
  }
```
All processed images would be stored under `path`, each marked with its last-access time, and a supervisor would monitor the total size of the `path` folder. If it exceeds the limit, the supervisor would delete the oldest unused images; see the sketch below. @keichan34 WDYT?
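To make the proposal concrete, here is a minimal sketch of what such a janitor process could look like. Everything in it is hypothetical: Exfile has no `Exfile.CacheJanitor` module today, and the `:path` / `:max_bytes` options are illustrative names, not real configuration keys.

```elixir
# Hypothetical sketch -- Exfile has no such module; all names are illustrative.
defmodule Exfile.CacheJanitor do
  @moduledoc """
  Periodically sweeps the cache folder and deletes the least recently
  used files once the folder grows past `:max_bytes`.
  """
  use GenServer

  @sweep_interval :timer.minutes(5)

  def start_link(opts) do
    GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  end

  @impl true
  def init(opts) do
    schedule_sweep()
    {:ok, %{path: Keyword.fetch!(opts, :path), max_bytes: Keyword.fetch!(opts, :max_bytes)}}
  end

  @impl true
  def handle_info(:sweep, state) do
    sweep(state.path, state.max_bytes)
    schedule_sweep()
    {:noreply, state}
  end

  defp schedule_sweep, do: Process.send_after(self(), :sweep, @sweep_interval)

  defp sweep(path, max_bytes) do
    # Collect {file, size, last_access} for every regular file in the cache.
    # Uses atime as the "last used" mark; on noatime mounts a sidecar
    # timestamp would be needed instead.
    files =
      Path.wildcard(Path.join(path, "**"))
      |> Enum.filter(&File.regular?/1)
      |> Enum.map(fn file ->
        %File.Stat{size: size, atime: atime} = File.stat!(file, time: :posix)
        {file, size, atime}
      end)

    excess = Enum.reduce(files, 0, fn {_f, size, _a}, acc -> acc + size end) - max_bytes

    if excess > 0 do
      # Delete oldest-used files first until we are back under the limit.
      files
      |> Enum.sort_by(fn {_f, _s, atime} -> atime end)
      |> Enum.reduce_while(excess, fn {file, size, _atime}, to_free ->
        File.rm(file)
        if to_free - size > 0, do: {:cont, to_free - size}, else: {:halt, :done}
      end)
    end

    :ok
  end
end
```

It would be started under the supervision tree with the values from the config above, e.g. `Exfile.CacheJanitor.start_link(path: "/tmp/exfile-other", max_bytes: 10 * 1024 * 1024 * 1024)`.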
Since files are fingerprinted and the processing inputs are in the URL, the requests can be cached by a normal HTTP cache (nginx and/or any pull-based CDN like CloudFront, CloudFlare, Fastly, etc.). I don't think it's in the scope of Exfile to cache outputs when an HTTP cache can do it just as well.
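For illustration, a minimal nginx `proxy_cache` setup for that approach could look like the following. This is not something Exfile ships; the upstream address, the `/attachments/` prefix, and the cache sizing are all assumptions about a particular app's setup.

```nginx
# Hypothetical sketch; upstream address and path prefix are assumptions.
# In the http {} block:
proxy_cache_path /var/cache/nginx/exfile levels=1:2 keys_zone=exfile:10m
                 max_size=10g inactive=30d use_temp_path=off;

server {
  listen 80;

  # Assuming the app serves Exfile files under /attachments/.
  location /attachments/ {
    proxy_pass http://127.0.0.1:4000;
    proxy_cache exfile;
    proxy_cache_valid 200 30d;            # fingerprinted URLs never go stale
    add_header X-Cache-Status $upstream_cache_status;
  }

  location / {
    proxy_pass http://127.0.0.1:4000;
  }
}
```

Because the URLs are fingerprinted, a long `proxy_cache_valid` is safe: changing the file or the processing parameters produces a different URL, so a stale entry is never served.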