Issues #15 and #4 are addressed here in #25.
Conversation
New Flux Update

## v0.14.0 (July 2023)

* Flux now requires Julia v1.9 or later.
* CUDA.jl is not a hard dependency anymore. Support is now provided through the extension mechanism, by loading `using Flux, CUDA`. The package cuDNN.jl also needs to be installed in the environment. (You will get instructions if this is missing.)
* After a deprecation cycle, the macro `@epochs` and the functions `Flux.stop`, `Flux.skip`, `Flux.zeros`, `Flux.ones` have been removed.
It will error out since we now need Julia 1.9 for this version of Flux.
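For illustration, a minimal sketch of the extension-based GPU setup described in the changelog above; it assumes CUDA.jl and cuDNN.jl are already added to the environment, and the model and data here are placeholders rather than FluxNLPModels code.

```julia
# Sketch of Flux >= 0.14 GPU usage via the package-extension mechanism.
# Assumes CUDA.jl and cuDNN.jl are installed in the active environment;
# Flux prints instructions if cuDNN.jl is missing.
using Flux
using CUDA   # loading CUDA alongside Flux activates Flux's CUDA extension

model = Chain(Dense(2 => 3, relu), Dense(3 => 1))
model_gpu = gpu(model)                 # no-op if no functional GPU is available
y = model_gpu(gpu(rand(Float32, 2)))   # forward pass on the device
```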
Hi @farhadrclass! Thank you for the PR.
I would strongly recommend opening focused PRs, in other words, more PRs, each with a single focus. This way it is easier to review and approve :).
A start would be to open a first PR to update to Julia 1.9. Can you also explain a bit further why we are dropping 1.6?
Co-authored-by: tmigot <[email protected]>
Hi @tmigot, thank you for the review. The reason for combining bugs #4 and #15 is that they are about the same thing: one asks for unit tests with different precisions, and the other mentions that Float16 is not supported. The reason we need Julia 1.9+ is that we need Flux 0.13 and/or 0.14, since they added support for changing the parameter type at run time, and they require Julia 1.9 (also, Float16 is hardware-supported :) ). Lastly, I had a question on how to fix the errors we are getting for CI and Documentation.
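For illustration, a hedged sketch of the run-time precision change mentioned above, assuming a Flux release that provides the `f16` converter alongside the long-standing `f32`/`f64`; the model is a placeholder, not FluxNLPModels code.

```julia
# Hedged sketch: converting a Flux model's parameters to Float16 at run time.
using Flux

model   = Chain(Dense(4 => 8, relu), Dense(8 => 1))   # parameters default to Float32
model16 = f16(model)                                   # same architecture, Float16 weights
x = rand(Float16, 4)
y = model16(x)                                         # forward pass entirely in Float16
```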
Thanks for the additional explanations.
The documentation fails because you drop support for Flux 0.13 while it is the version used for the docs: https://github.com/JuliaSmoothOptimizers/FluxNLPModels.jl/blob/main/docs/Project.toml
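One possible way to bring the docs environment in line, sketched with Pkg commands; the version bound shown is an assumption, and editing the `[compat]` entry in docs/Project.toml by hand works just as well.

```julia
# Hedged sketch: updating the documentation environment to the newer Flux.
using Pkg
Pkg.activate("docs")                          # the docs/ environment of the repository
Pkg.add(name = "Flux", version = "0.14")      # or relax the [compat] bound in docs/Project.toml
Pkg.instantiate()
```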
This reverts commit a6693be.
@tmigot I rebased it and am now waiting for the CI to run. Let me know if anything comes up.
Thanks @farhadrclass for the PR! I made a couple of comments.
Also, please use git squash to maintain a clean commit history in your branch.
5a51cbf to 20b3c7a
Update Project.toml
Update test/runtests.jl
Update src/utils.jl
Update src/FluxNLPModels_methods.jl
Update src/FluxNLPModels_methods.jl
Update example/MNIST_cnn.jl
updated based on Tangi's review
multiple dispatch added p1
Update runtests.jl
fixed an error in the code
Co-Authored-By: tmigot <[email protected]>
@tmigot I added some multiple dispatch, but I am not sure about all of it. What do you think?
Thanks @farhadrclass for the changes, I added some new comments.
Added U for the gradient, since the gradient's element type can differ from the type of w and nlp.w.
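To illustrate that point, a small hedged sketch of a gradient function whose buffer type U can differ from the weight type T; the function name and the placeholder objective are illustrative only, not FluxNLPModels' actual API.

```julia
# Hedged sketch: the gradient buffer g may use a different element type (U)
# than the weights w (T), e.g. Float32 weights with a Float16 gradient.
function my_grad!(w::AbstractVector{T}, g::AbstractVector{U}) where {T <: Real, U <: Real}
    g .= U.(2 .* w)    # placeholder gradient of ||w||^2; real code would backpropagate through the model
    return g
end

w = rand(Float32, 3)
g = zeros(Float16, 3)
my_grad!(w, g)         # dispatches with T = Float32, U = Float16
```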
@tmigot I added your suggestions, except the multiple dispatch, which I explained.
I made 2 comments, so I will approve once they are addressed. Can you also create an issue for the dispatch question?
@farhadrclass please use the "squash and merge" button so that we keep a clean commit history (here 18 commits have been merged while it is only a small change).
Do you want me to do it now or in a future PR?
I do not know how to do it after the merge, by the way, @tmigot.
This is done :). It is better to "squash and merge" to keep a clean history.
I had also updated Flux, and therefore some of the changes are due to that.