Commit 3c8b8e7: Update readme
jloveric committed Jun 6, 2024 (1 parent: 461c064)

Showing 1 changed file (README.md) with 4 additions and 1 deletion.
@@ -120,7 +120,10 @@ All polynomials are Lagrange polynomials with Chebyshev interpolation points.
| fourier(1d,2d) | Fourier series convolution |

## Initialization of layers
For non-convolutional layers I've found that initializing the polynomials to a continuous line across all segments works better than a random wiggly polynomial. I don't have similar functions implemented for convolutional layers.
The default initialization sets each link to a random constant, i.e. all weights within a link share the same value. This seems to work pretty well. I also have a random linear (non-constant) initialization, but it is slower and I'm not sure it's actually better.
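As an illustration of the constant-per-link default, here is a minimal sketch; the tensor shape `(links, coefficients)` and the function name are assumptions for illustration, not the library's actual API:

```python
import torch

# Hypothetical sketch of constant-per-link initialization: each "link"
# is modeled as a row of the weight tensor, and every polynomial
# coefficient in that row is set to the same random constant.
def constant_init_(weight: torch.Tensor) -> torch.Tensor:
    with torch.no_grad():
        constants = torch.rand(weight.shape[0], 1)  # one constant per link
        weight.copy_(constants.expand_as(weight))   # broadcast across the row
    return weight

w = constant_init_(torch.empty(4, 6))
# every row of w now holds a single repeated value
```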

Here is a function that does this linear initialization for non-convolutional layers (it can be found in [networks.py](https://github.com/jloveric/high-order-layers-torch/blob/master/high_order_layers_torch/networks.py)):
```python
def initialize_network_polynomial_layers(
    network: nn.Module,
    ...
```
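The continuous-line idea can also be sketched in isolation. This is a hedged illustration, not the library's implementation: it assumes each segment stores values at interpolation points over [-1, 1], and uses uniform points for simplicity where the actual layers use Chebyshev points.

```python
import torch

# Hedged sketch: start every segment's interpolation values on one global
# straight line y = a*x + b over [-1, 1], so adjacent segments join
# continuously instead of beginning as independent wiggly polynomials.
def line_init(segments: int, points_per_segment: int,
              a: float = 1.0, b: float = 0.0) -> torch.Tensor:
    # uniform points for illustration; the real layers use Chebyshev points
    x = torch.linspace(-1.0, 1.0, segments * points_per_segment)
    return (a * x + b).reshape(segments, points_per_segment)

vals = line_init(segments=3, points_per_segment=4)
# vals increases monotonically across segment boundaries
```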
