Feature description
It would be nice if Burn supported a way to apply weight reparametrization to layers, as found in PyTorch (e.g. weight_norm, spectral_norm).
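For context, both techniques recompute the effective weight from underlying parameters on each use: weight_norm stores a direction v and a scalar magnitude g and computes w = g * v / ||v||, while spectral_norm rescales the weight by its largest singular value, w = v / σ_max(v).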
Feature motivation
The PyTorch docs provide a decent explanation and relevant links, but in short, reparametrization can stabilize the training of certain networks.
(Optional) Suggest a Solution
In the current design, this kind of API probably belongs on Param, since dynamically replacing fields the way PyTorch does is not really feasible. Param would have to switch out implementations based on the particular reparametrization.
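A minimal sketch of how that could look, written in plain Rust with a stand-in tensor type (every name below is hypothetical for illustration and not existing burn API):

```rust
// Hypothetical sketch: none of these types exist in burn today.
// The idea is that Param resolves its value through an optional
// reparametrization instead of handing out the stored weight directly.

/// Stand-in for a real tensor; a flat Vec<f32> keeps the sketch self-contained.
type Tensor = Vec<f32>;

/// Which reparametrization, if any, to apply when the weight is read.
enum Reparam {
    None,
    /// weight_norm: w = g * v / ||v||, where `g` is the stored magnitude
    /// and the underlying value is the direction `v`.
    WeightNorm { g: f32 },
}

/// Simplified Param: the raw value plus the reparametrization strategy.
struct Param {
    value: Tensor,
    reparam: Reparam,
}

impl Param {
    /// Materialize the effective weight, switching on the strategy.
    fn val(&self) -> Tensor {
        match &self.reparam {
            Reparam::None => self.value.clone(),
            Reparam::WeightNorm { g } => {
                let norm = self.value.iter().map(|x| x * x).sum::<f32>().sqrt();
                self.value.iter().map(|x| g * x / norm).collect()
            }
        }
    }
}

fn main() {
    let weight = Param {
        value: vec![3.0, 4.0], // ||v|| = 5
        reparam: Reparam::WeightNorm { g: 2.0 },
    };
    // Effective weight is 2 * v / 5 = [1.2, 1.6].
    println!("{:?}", weight.val());
}
```

In burn itself, Param wraps the actual tensor type and the reparametrized weight would need to stay differentiable with respect to both g and v, but the overall shape would be the same: the strategy lives on Param, and modules keep reading their weights the usual way.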