Error calling other activation function arguments? #2845
Unanswered
jmenville11 asked this question in Q&A
Replies: 1 comment
-
Hi @jmenville11, the argument is |
-
Hello,
In playing around with the 3D UNet model, I see that sigmoid and softmax are passed as booleans directly in the code, but there is also an option to call other activation functions. Can you point me to the documentation that lists the arguments for these other activation functions?
I've been able to use "sigmoid=True", "softmax=True", and "other_act=torch.tanh", but not PReLU, Softmin, or LogSoftmax. For example:
Abbreviated code:

```python
import torch
import torch.nn as nn
# etc.
```
Error:

```
File "ICH_UNet_Model_Experiment_Parameters.py", line 216, in
    [ToTensor(), Activations(other_act=nn.LogSoftmax), AsDiscrete(threshold_values=True)]
TypeError: __init__() got an unexpected keyword argument 'other_act'
```
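Two things may be at play here. First, the keyword the transform accepts can differ between versions (the `TypeError` above says this version of `Activations` does not know `other_act` at all, so checking the installed version's signature is the first step). Second, note that `nn.LogSoftmax` is a module *class*, whereas `torch.tanh` (which worked) is a plain callable; the usual callable form is an instance such as `nn.LogSoftmax(dim=1)`. A minimal sketch of the pattern in plain Python (no MONAI or PyTorch; the class name and keywords here are hypothetical stand-ins, not the library's actual API):

```python
import math


class ActivationsSketch:
    """Hypothetical stand-in for a MONAI-style Activations transform.

    At most one of `sigmoid`, `softmax`, or `other` should be set.
    `other` must be a *callable* applied element-wise, mirroring how
    `torch.tanh` works as an argument while a bare module class does not.
    """

    def __init__(self, sigmoid=False, softmax=False, other=None):
        if other is not None and not callable(other):
            raise TypeError("`other` must be callable")
        self.sigmoid, self.softmax, self.other = sigmoid, softmax, other

    def __call__(self, values):
        if self.sigmoid:
            return [1.0 / (1.0 + math.exp(-v)) for v in values]
        if self.softmax:
            exps = [math.exp(v) for v in values]
            total = sum(exps)
            return [e / total for e in exps]
        if self.other is not None:
            return [self.other(v) for v in values]
        return values


# Passing a ready-to-call function works:
act = ActivationsSketch(other=math.tanh)
out = act([0.0, 1.0])
```

The same distinction applies in PyTorch itself: `other=nn.LogSoftmax` hands the transform a class (calling it would construct a module, not apply one), while `other=nn.LogSoftmax(dim=1)` or `other=torch.tanh` hands it something that maps a tensor to a tensor.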