If I have one big zip file containing 10,000 files, can ruffus iterate (build the pipeline) treating the files within the zip as the atomic units, or do I need to unzip first and only then iterate?
I did something similar. Use a split task that scans the container file and creates an empty placeholder for every compressed file to extract, followed by a transform task that takes the name of a placeholder and actually does the unzipping. That will parallelize the decompression.
But if that's all you want, ruffus is overkill; concurrent.futures would be easier.