
s3-store: Part number increments by 2 causing 10 000 limit to be hit sooner, and wrong part numbers #688

Closed
2 tasks done
jesse-small opened this issue Dec 19, 2024 · 2 comments · Fixed by #689

Comments

@jesse-small
Initial checklist

  • I understand this is a bug report and questions should be posted in the Community Forum
  • I searched issues and couldn’t find anything (or linked relevant results below)

Steps to reproduce

  1. Upload a large file, such as 200 GB
  2. Choose an appropriate part size (32 MB)
  3. Enable logs so that part number is shown after each upload

Expected behavior

The upload should complete without issue, as 32 MB parts are sufficient for 200 GB without hitting the 10,000-part limit.

Actual behavior

Before the upload finishes, S3 returns the error "InvalidArgument: Part number must be an integer between 1 and 10000, inclusive" because our part number has reached 10002.

Looking through the code, I can see that the partNumber is actually incremented twice per chunk, which is why we hit the part-number limit early, and why the error mentions 10002.

The first place we increment the part number:

```js
const nextPartNumber = partNumber + 1
```

Then, before we upload the chunk, we increment the part number again:

```js
const partNumber = currentPartNumber + 1
```

@fenos
Collaborator

fenos commented Dec 19, 2024

Hey @jesse-small glad you found this too!
I was investigating #686 and found the same issue! 😃

Made a PR that fixes this issue #689

@jesse-small (Author)

@fenos awesome! Yeah, I guess not as many people are uploading massive files, so they aren't hitting that limit as frequently. Great to see a PR fixing the issue!
