
Repeated reads of some files cause suboptimal performance on slow or remote filesystems #179

Open
sourcefrog opened this issue Aug 13, 2022 · 0 comments

sourcefrog (Owner) commented Aug 13, 2022

To do: trace file IO, investigate any case where an archive file is avoidably read more than once, and fix it.

Perhaps: emit a warning if the same file is read more than once.
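
For illustration, a minimal sketch of that tracing-plus-warning idea, as a standalone example rather than this project's actual IO layer: the `ReadTracer` type and its `read` method are hypothetical names, and the warning is printed to stderr rather than going through any real logging setup.

```rust
use std::collections::HashMap;
use std::fs;
use std::io;
use std::path::{Path, PathBuf};
use std::sync::Mutex;

/// Counts how many times each file is read, so avoidable repeated
/// reads show up as warnings during a run.
struct ReadTracer {
    counts: Mutex<HashMap<PathBuf, usize>>,
}

impl ReadTracer {
    fn new() -> Self {
        ReadTracer {
            counts: Mutex::new(HashMap::new()),
        }
    }

    /// Read a whole file, recording the access; warn on the second
    /// and later reads of the same path.
    fn read(&self, path: &Path) -> io::Result<Vec<u8>> {
        {
            let mut counts = self.counts.lock().unwrap();
            let n = counts.entry(path.to_owned()).or_insert(0);
            *n += 1;
            if *n > 1 {
                eprintln!("warning: {} read {} times", path.display(), *n);
            }
        } // drop the lock before doing the (possibly slow) IO
        fs::read(path)
    }
}

fn main() -> io::Result<()> {
    let tracer = ReadTracer::new();
    let _first = tracer.read(Path::new("Cargo.toml"))?;
    let _second = tracer.read(Path::new("Cargo.toml"))?; // emits a warning
    Ok(())
}
```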

#175, just contributed by @WolverinDEV, fixes one pathological case of repeatedly re-reading files.

However, there are other cases where a small file is read repeatedly in a way that is cheap on a local filesystem (where it will be in the OS cache) but might be very slow remotely. This is definitely worth fixing; I think I have fixed some of these cases in the sftp branch, but there are probably more.
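
One way to make those repeated small reads cheap everywhere, not just locally, is to cache file contents in memory after the first read, which is safe assuming the files in question are small and immutable once written. A hypothetical sketch, not taken from the codebase:

```rust
use std::collections::HashMap;
use std::fs;
use std::io;
use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};

/// Caches the contents of small, immutable files so each one is
/// fetched from the (possibly remote) filesystem at most once.
struct SmallFileCache {
    cache: Mutex<HashMap<PathBuf, Arc<Vec<u8>>>>,
}

impl SmallFileCache {
    fn new() -> Self {
        SmallFileCache {
            cache: Mutex::new(HashMap::new()),
        }
    }

    fn read(&self, path: &Path) -> io::Result<Arc<Vec<u8>>> {
        if let Some(bytes) = self.cache.lock().unwrap().get(path) {
            return Ok(Arc::clone(bytes));
        }
        // First access: pay the (possibly slow) remote read once.
        let bytes = Arc::new(fs::read(path)?);
        self.cache
            .lock()
            .unwrap()
            .insert(path.to_owned(), Arc::clone(&bytes));
        Ok(bytes)
    }
}

fn main() -> io::Result<()> {
    let cache = SmallFileCache::new();
    let first = cache.read(Path::new("Cargo.toml"))?;
    let second = cache.read(Path::new("Cargo.toml"))?; // served from cache
    assert!(Arc::ptr_eq(&first, &second));
    Ok(())
}
```

In this sketch two threads racing on the same uncached path could both hit the filesystem once; that is acceptable for a cache of immutable files, since both reads return the same bytes.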

Originally posted by @sourcefrog in #177 (comment)
