Hyperparameter Optimization without Auto class. #687
Unanswered
philip-ndikum asked this question in Q&A
Given the diversity of hardware and software setups, it may be prudent to give users the option to select the PyTorch Lightning and Ray settings themselves. Instead of working through wrapper classes, the end user could define the low-level hyperparameter optimization and specify the GPU and CPU resources for the PyTorch Lightning Trainer directly, rather than having this handled by the Auto class. Could an example of this be added to the core code base?
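To make the request concrete, below is a minimal sketch of the kind of "low-level" workflow meant here: Ray Tune's `Tuner` driving a plain PyTorch Lightning `Trainer`, with the per-trial CPU/GPU allocation chosen explicitly by the user instead of by an Auto class. The `ToyRegressor` module, `make_loaders` helper, and the search space are placeholders for illustration only, not part of any library's API, and the exact metric-reporting call varies across Ray versions.

```python
# Minimal sketch: hyperparameter optimization without an Auto wrapper class.
import lightning.pytorch as pl  # older installs: import pytorch_lightning as pl
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from ray import tune
from ray.tune.schedulers import ASHAScheduler


class ToyRegressor(pl.LightningModule):
    """Placeholder LightningModule; swap in the real forecasting model."""

    def __init__(self, hidden_size: int, learning_rate: float):
        super().__init__()
        self.learning_rate = learning_rate
        self.net = nn.Sequential(
            nn.Linear(8, hidden_size), nn.ReLU(), nn.Linear(hidden_size, 1)
        )

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.net(x), y)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        # "val_loss" is the metric Tune will optimize below.
        self.log("val_loss", nn.functional.mse_loss(self.net(x), y))

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.learning_rate)


def make_loaders(batch_size: int):
    """Random data stands in for the user's real dataset."""
    x, y = torch.randn(512, 8), torch.randn(512, 1)
    ds = TensorDataset(x, y)
    return DataLoader(ds, batch_size=batch_size), DataLoader(ds, batch_size=batch_size)


def train_trial(config):
    """One Ray Tune trial: a user-defined Lightning Trainer, no Auto class."""
    model = ToyRegressor(config["hidden_size"], config["learning_rate"])
    train_loader, val_loader = make_loaders(config["batch_size"])
    trainer = pl.Trainer(
        max_epochs=5,
        accelerator="gpu" if torch.cuda.is_available() else "cpu",
        devices=1,
        enable_progress_bar=False,
        logger=False,
    )
    trainer.fit(model, train_loader, val_loader)
    # The exact reporting call depends on the installed Ray version
    # (older releases used tune.report(val_loss=...) or ray.air.session.report).
    tune.report({"val_loss": trainer.callback_metrics["val_loss"].item()})


search_space = {
    "hidden_size": tune.choice([32, 64, 128]),
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "batch_size": tune.choice([32, 64]),
}

# The user, not an Auto class, decides how many CPUs/GPUs each trial gets.
tuner = tune.Tuner(
    tune.with_resources(train_trial, resources={"cpu": 2, "gpu": 0}),
    param_space=search_space,
    tune_config=tune.TuneConfig(
        metric="val_loss", mode="min", num_samples=8, scheduler=ASHAScheduler()
    ),
)
print(tuner.fit().get_best_result().config)
```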