ImportError: cannot import name 'params_t' from 'torch.optim.optimizer' #210

Open

seansica opened this issue Jul 7, 2024 · 2 comments
seansica commented Jul 7, 2024

Ran this from the demo code:

import os

# Check if we're in Colab
try:
    import google.colab  # noqa: F401 # type: ignore

    in_colab = True
except ImportError:
    in_colab = False

# Install if in Colab
if in_colab:
    %pip install sparse_autoencoder transformer_lens transformers wandb

# Otherwise enable hot reloading in dev mode
if not in_colab:
    %load_ext autoreload
    %autoreload 2

from sparse_autoencoder import (
    ActivationResamplerHyperparameters,
    AutoencoderHyperparameters,
    Hyperparameters,
    LossHyperparameters,
    Method,
    OptimizerHyperparameters,
    Parameter,
    PipelineHyperparameters,
    SourceDataHyperparameters,
    SourceModelHyperparameters,
    SweepConfig,
    sweep,
)

Throws:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-3-ef2a52074e18> in <cell line: 3>()
      1 import os
      2 
----> 3 from sparse_autoencoder import (
      4     ActivationResamplerHyperparameters,
      5     AutoencoderHyperparameters,

1 frames
/usr/local/lib/python3.10/dist-packages/sparse_autoencoder/optimizer/adam_with_reset.py in <module>
     10 from torch.nn.parameter import Parameter
     11 from torch.optim import Adam
---> 12 from torch.optim.optimizer import params_t
     13 
     14 from sparse_autoencoder.optimizer.abstract_optimizer import AbstractOptimizerWithReset

ImportError: cannot import name 'params_t' from 'torch.optim.optimizer' (/usr/local/lib/python3.10/dist-packages/torch/optim/optimizer.py)


This occurs in Colab.

> python --version
Python 3.10.12

> pip list | grep -e torch -e sparse_autoencoder -e transformer_lens -e transformers -e wandb
sparse_autoencoder               1.10.0
torch                            2.3.0+cu121
torchaudio                       2.3.0+cu121
torchsummary                     1.5.1
torchtext                        0.18.0
torchvision                      0.18.0+cu121
transformers                     4.41.2
wandb                            0.17.4
seansica commented Jul 7, 2024

> import torch.optim.optimizer as optim
> optim.__dir__()

['__name__',
 '__doc__',
 '__package__',
 '__loader__',
 '__spec__',
 '__file__',
 '__cached__',
 '__builtins__',
 '__annotations__',
 'math',
 'functools',
 'warnings',
 'OrderedDict',
 'defaultdict',
 'deepcopy',
 'chain',
 'Any',
 'Callable',
 'DefaultDict',
 'Dict',
 'Hashable',
 'Iterable',
 'List',
 'Optional',
 'Set',
 'Tuple',
 'TypeVar',
 'Union',
 'cast',
 'overload',
 'ParamSpec',
 'Self',
 'TypeAlias',
 'torch',
 'hooks',
 'RemovableHandle',
 'Indices',
 'TensorListList',
 '_get_foreach_kernels_supported_devices',
 '_get_fused_kernels_supported_devices',
 'is_compiling',
 '_group_tensors_by_device_and_dtype',
 'Args',
 'Kwargs',
 'StateDict',
 'GlobalOptimizerPreHook',
 'GlobalOptimizerPostHook',
 '__all__',
 '_global_optimizer_pre_hooks',
 '_global_optimizer_post_hooks',
 '_foreach_supported_types',
 '_RequiredParameter',
 'required',
 '_use_grad_for_differentiable',
 '_get_value',
 '_stack_if_compiling',
 '_dispatch_sqrt',
 '_default_to_fused_or_foreach',
 '_view_as_real',
 '_get_scalar_dtype',
 '_foreach_doc',
 '_fused_doc',
 '_capturable_doc',
 '_differentiable_doc',
 '_maximize_doc',
 'register_optimizer_step_pre_hook',
 'register_optimizer_step_post_hook',
 'ParamsT',
 '_P',
 'R',
 'T',
 'Optimizer']

Could be that params_t needs to be changed to ParamsT; the __dir__() output above lists ParamsT but no params_t, so the name appears to have been renamed in this torch release.
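
If so, a guarded import should cover both names. A minimal sketch, assuming adam_with_reset.py only uses the alias for type annotations:

try:
    # Newer torch releases export the renamed alias
    from torch.optim.optimizer import ParamsT
except ImportError:
    # Older torch releases still export the original name
    from torch.optim.optimizer import params_t as ParamsT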

seansica commented Jul 7, 2024

Also just noted that the issue does not occur when building from source (poetry install). That suggests a discrepancy with the pip-published package, maybe an incorrectly specified dependency in [tool.poetry.dependencies] or something like that.
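
In the meantime, a hypothetical workaround for the published wheel is to restore the old name before importing the package, again assuming sparse_autoencoder only uses params_t as a type annotation:

import torch.optim.optimizer as _optimizer

# torch 2.3 only exposes the renamed ParamsT; alias it back so the
# `from torch.optim.optimizer import params_t` inside the package resolves
if not hasattr(_optimizer, "params_t"):
    _optimizer.params_t = _optimizer.ParamsT

from sparse_autoencoder import sweep  # now imports without the ImportError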
