My team uses cloudpathlib.S3Path objects in a script that runs at large scale in the cloud.
We encountered a throttling issue (too many read accesses to the same bucket), and I wondered if there's some retry mechanism for such cases (e.g., using exponential backoff).
Digging into the code, I couldn't find anything. I also found no related issues. Is this a known issue? Or do you handle it somehow?
thanks
P.S. An elegant solution could be to use the retry package with its retry decorator.
Given that cloudpathlib is for connecting to other systems over networks, it makes sense that dealing with retries is relevant to using it. However, I'm not sure if it makes sense to bake it into cloudpathlib directly, vs. letting the user have the flexibility to handle it however they want. I'm open to discussion here though.
For example, you could wrap your cloudpathlib operations in a function and apply any retry framework to that function.
Also, rather than the retry package, which looks like it hasn't been touched in 8 years, I recommend you check out either tenacity or stamina, which are fairly popular and also have easy-to-use decorators.
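To illustrate the wrapping approach, here is a minimal sketch of an exponential-backoff retry decorator using only the standard library (tenacity and stamina give you this and more out of the box). The `read_blob` usage at the bottom is hypothetical; catching boto3's `ClientError` is an assumption about how S3 throttling surfaces in your script:

```python
import random
import time
from functools import wraps


def retry_with_backoff(max_attempts=5, base_delay=0.5, exceptions=(Exception,)):
    """Retry the wrapped function with exponential backoff plus jitter."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == max_attempts - 1:
                        raise  # out of attempts: let the error propagate
                    # Sleep base_delay * 2^attempt, with a little jitter so
                    # many workers don't all retry in lockstep.
                    time.sleep(base_delay * 2 ** attempt
                               + random.uniform(0, base_delay))
        return wrapper
    return decorator


# Hypothetical usage wrapping a cloudpathlib read (ClientError is what
# boto3 typically raises on throttling; names here are illustrative):
#
# from botocore.exceptions import ClientError
#
# @retry_with_backoff(max_attempts=5, exceptions=(ClientError,))
# def read_blob(path):
#     return path.read_text()
```

With tenacity, the same idea is roughly `@retry(wait=wait_exponential(), stop=stop_after_attempt(5))` on the wrapping function, without writing the loop yourself.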