Update parameter.py #184

Open · wants to merge 1 commit into dev
29 changes: 24 additions & 5 deletions whobpyt/datatypes/parameter.py
@@ -2,7 +2,6 @@
Authors: Andrew Clappison, John Griffiths, Zheng Wang, Davide Momi, Sorenza Bastiaens, Parsa Oveisi, Kevin Kadak, Taha Morshedzadeh, Shreyas Harita
"""


import torch
import numpy
import numpy as np
@@ -25,13 +24,16 @@ class par:
Prior mean of the data value
prior_precision : Tensor
Prior inverse of variance of the value

fit_par: Bool
Whether the parameter value should be set as a PyTorch Parameter
fit_hyper : Bool
Whether the parameter prior mean and prior variance should be set as a PyTorch Parameter
asLog: Bool
Whether the log of the parameter value will be stored instead of the parameter itself (will prevent parameter from being negative).
asRand : Bool
Whether the parameter is randomly sampled
lb : Tensor
Value of the parameter lower bound
device: torch.device
Whether to run on CPU or GPU
'''

def __init__(self, val, prior_mean = None, prior_std = None, fit_par = False, asLog = False, asRand = True, lb = 0, device = torch.device('cpu')):
@@ -47,8 +49,25 @@ def __init__(self, val, prior_mean = None, prior_std = None, fit_par = False, as
Prior std of the value
fit_par: Bool
Whether the parameter value should be set as a PyTorch Parameter
fit_hyper : Bool
Whether the parameter prior mean and prior variance should be set as a PyTorch Parameter
device: torch.device
Whether to run on CPU or GPU
asLog: Bool
Whether the log of the parameter value will be stored instead of the parameter itself (will prevent parameter from being negative).
asRand : Bool
Whether the parameter is randomly sampled
lb : Tensor
Value of the parameter lower bound.

val_ts : Tensor
The parameter value as a tensor
prior_mean_ts : Tensor
Prior mean as a tensor
prior_std_ts : Tensor
Prior std as a tensor
prior_precision : Tensor
Prior inverse of variance of the value
'''
self.fit_par = fit_par
self.device = device
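
For context, here is a minimal usage sketch of the updated class. The keyword names come from the `__init__` signature shown above; the values are hypothetical, and it assumes `par` converts scalar inputs to tensors internally, as the documented `val_ts` attribute suggests.

```python
import torch
from whobpyt.datatypes.parameter import par

# Hypothetical example: a coupling gain fitted in log-space so the
# recovered value stays positive (lb = 0 is the default shown above).
g = par(val=50.0, prior_mean=50.0, prior_std=10.0,
        fit_par=True, asLog=True, lb=0,
        device=torch.device('cpu'))
```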
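And a self-contained sketch of the log-storage idea the `asLog` flag describes, i.e. why storing the log keeps the parameter from going negative. This is the general technique, not necessarily whobpyt's exact implementation:

```python
import torch

# Store log(val - lb) as the trainable tensor; exponentiating the
# stored value and adding lb back guarantees the result stays above lb,
# no matter what value the optimizer pushes log_val to.
lb = 0.0
val = torch.tensor(50.0)
log_val = torch.nn.Parameter(torch.log(val - lb))

recovered = torch.exp(log_val) + lb  # always strictly greater than lb
```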