Update PyTorch to 2.4.1 #268
This file was deleted.
@@ -0,0 +1 @@
# Extra dependencies for CPU-only
@@ -1,2 +1,3 @@
# Extra dependencies for NVIDIA CUDA
flash-attn>=2.4.0
bitsandbytes>=0.43.1
@@ -0,0 +1,11 @@
# Extra dependencies for Intel Gaudi / Habana Labs HPU devices
# Habana Labs 1.17.1 has PyTorch 2.3.1a0+gitxxx pre-release
torch>=2.3.1a0,<2.4.0
# Habana Labs frameworks
habana-torch-plugin>=1.17.1
habana_gpu_migration>=1.17.1
# additional Habana Labs packages (installed, but not used)
#habana-media-loader
#habana-pyhlml
#habana_quantization_toolkit
#habana-torch-dataloader

Review comment: I think Christian's PR merged and changed these things as well; I don't want to overwrite whatever he updated. Would it be possible to check which is the correct source of truth?

Reply: Let's keep
@@ -0,0 +1 @@
# Extra dependencies for Apple MPS (Metal Performance Shaders)
@@ -0,0 +1,2 @@
# Extra dependencies for AMD ROCm
flash-attn>=2.6.2,<2.7.0

Review comment: What's the reason to limit the upper bound of
Review comment: Should this min version here be bumped to 2.4.1?
Reply: When Gaudi 1.18 is supported, yes.
Reply: Gaudi software 1.18.0 ships PyTorch 2.4.0a0, and the 1.18.x series will not get a newer version.
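The point above can be checked mechanically. This is a quick sketch using the third-party `packaging` library (assumed available; it is not a dependency declared in this diff) to confirm that Gaudi's 2.4.0a0 build would not satisfy the HPU pin `torch>=2.3.1a0,<2.4.0`, because PEP 440 does not let an exclusive upper bound like `<2.4.0` admit pre-releases of 2.4.0 itself:

```python
from packaging.specifiers import SpecifierSet

# The range pinned in the HPU requirements file.
spec = SpecifierSet(">=2.3.1a0,<2.4.0")

# Habana Labs 1.17.1 ships 2.3.1a0 -> inside the range.
print(spec.contains("2.3.1a0", prereleases=True))  # True

# Gaudi 1.18.0 ships 2.4.0a0 -> rejected: per PEP 440,
# "<2.4.0" excludes pre-releases of 2.4.0.
print(spec.contains("2.4.0a0", prereleases=True))  # False
```

So supporting Gaudi 1.18 would require widening the upper bound, not just waiting for a newer Habana build.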
Review comment: Why are you changing the version range for Torch? I don't see any changes to Python code or other dependencies. There is no apparent reason why instructlab-training should no longer work with Torch 2.3.x or 2.5.x. I would prefer to keep the version ranges for dependencies of instructlab-training as open as possible and only restrict upper versions in the instructlab package.
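The split proposed in the comment above (open ranges in the library, tight pins in the consuming application) can be illustrated with the `packaging` library; the version numbers below are hypothetical, not taken from either package:

```python
from packaging.specifiers import SpecifierSet

# Hypothetical library constraint: lower bound only, stays open.
library = SpecifierSet(">=2.3.0")

# Hypothetical application pin: the upper bound is restricted here instead.
application = SpecifierSet(">=2.4.1,<2.5.0")

# Installers effectively intersect the two sets.
combined = library & application

print("2.4.1" in combined)  # True  - satisfies both
print("2.3.1" in combined)  # False - library allows it, application does not
print("2.5.0" in combined)  # False - above the application's upper bound
```

The library alone still accepts 2.3.1 and 2.5.0; only the application-level pin narrows the install, which is the separation the reviewer is asking for.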