
New torch version breaks optimizer code #202

Open
chefexperte opened this issue Feb 23, 2024 · 3 comments

@chefexperte

Since the release of torch 2.2.0, `params_t` no longer exists in `torch.optim.optimizer`. Because the pyproject.toml requirement is torch = ">=2.1.2", installing the latest torch produces the following error:

/usr/local/lib/python3.10/dist-packages/sparse_autoencoder/optimizer/adam_with_reset.py in <module>
     10 from torch.nn.parameter import Parameter
     11 from torch.optim import Adam
---> 12 from torch.optim.optimizer import params_t
     13
     14 from sparse_autoencoder.optimizer.abstract_optimizer import AbstractOptimizerWithReset

ImportError: cannot import name 'params_t' from 'torch.optim.optimizer' (/usr/local/lib/python3.10/dist-packages/torch/optim/optimizer.py)
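For reference, torch 2.2 appears to have renamed this alias to ParamsT. A minimal compatibility shim for the import (a sketch, assuming the rename is the only relevant change) would be:

```python
# Minimal compatibility shim, assuming torch 2.2 only renamed
# params_t to ParamsT (anything beyond the rename would need a fuller fix).
try:
    # torch >= 2.2 exports the new name
    from torch.optim.optimizer import ParamsT as params_t
except ImportError:
    # torch < 2.2 still exports the old name
    from torch.optim.optimizer import params_t
```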
@Final-Industrialist

I'm sure a patch is coming for this soon, but if you just want to run demo.ipynb, you can add the line:

%pip install torch==2.1.2

to the installation cell, and everything should run smoothly (at least on Colab).

@ehzawad commented May 25, 2024

@Final-Industrialist thanks! %pip install torch==2.1.2 works.

Looks like the new PyTorch version has broken things. I'd suggest the author pin exact package versions rather than specifying an open-ended version range.
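For example, the torch constraint in pyproject.toml could cap the upper bound so a breaking release is never picked up automatically (a sketch; the exact dependency table depends on the project's build backend):

```toml
# Hypothetical dependency entry: caps torch below the release that
# removed params_t, instead of allowing any version >= 2.1.2.
torch = ">=2.1.2,<2.2"
```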

@EMZEDI commented Jun 27, 2024

Same issue here with the newer versions of torch.
A workaround is to stop importing params_t from torch.optim.optimizer: you can simply set it to None locally, and things work, at least temporarily.
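If None breaks type checking, a drop-in stand-in that mirrors what torch < 2.2 exported might be safer (a sketch, assuming the original alias was Union[Iterable[Tensor], Iterable[Dict[str, Any]]]):

```python
from typing import Any, Dict, Iterable, Union

from torch import Tensor

# Local stand-in for the removed torch.optim.optimizer.params_t;
# it mirrors the alias torch < 2.2 defined, so annotations such as
# `params: params_t` keep working without importing the removed name.
params_t = Union[Iterable[Tensor], Iterable[Dict[str, Any]]]
```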
