Make Conjugate Gradient Descent work on GPU #222

Open
paquiteau opened this issue Dec 4, 2024 · 1 comment
Labels: good first issue, perf

@paquiteau (Member)
Currently, the cg method in extras/gradient.py is decorated with with_numpy, which means the data makes a round trip through the CPU. It should be able to run directly on the GPU.

Plan of action:

  • Use with_numpy_cupy instead.
  • Adapt the computation inside to use the correct array module (a sketch follows this list).
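For illustration, here is a minimal sketch of what a module-agnostic CG solver could look like. It relies on the dispatch pattern provided by CuPy (cupy.get_array_module); the with_numpy_cupy decorator mentioned above would wrap such a function in the actual codebase, and the solver signature used here (a callable A, a right-hand side b) is an assumption for the example, not the project's actual API.

```python
import numpy as np


def get_array_module(arr):
    """Return the array module (numpy or cupy) matching ``arr``."""
    try:
        import cupy as cp
        return cp.get_array_module(arr)  # returns numpy for NumPy arrays
    except ImportError:
        return np


def cg(A, b, x0=None, tol=1e-6, maxiter=50):
    """Solve A x = b with conjugate gradient, staying on the input's device.

    In the real code this would additionally be decorated with
    ``with_numpy_cupy`` so both NumPy and CuPy inputs pass through unchanged.
    ``A`` is assumed to be a callable applying the (Hermitian, positive
    definite) linear operator.
    """
    xp = get_array_module(b)  # numpy on CPU, cupy on GPU
    x = xp.zeros_like(b) if x0 is None else x0.copy()
    r = b - A(x)
    p = r.copy()
    rs_old = xp.vdot(r, r).real
    for _ in range(maxiter):
        Ap = A(p)
        alpha = rs_old / xp.vdot(p, Ap).real
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = xp.vdot(r, r).real
        if xp.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```

The key point is that all intermediate arrays are created with the same module as the input (via `xp`), so no implicit host transfer happens when `b` is a CuPy array.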
@mmlyj commented on Dec 7, 2024

Hi. The mirtorch package has something similar, and this link describes a good way to compute the gradient. These clues might help: https://github.com/guanhuaw/Bjork
