We can extend the automatic closed-form multivariate normal posterior optimization to the time-series case and compute conditional posteriors for terms in DLMs/state-space models.
To start, this would involve all the known exponential-family conjugates (e.g. Gaussian DLMs) and their accompanying sampling methods (e.g. forward-backward sampling).
Complete examples for implementations of both can be found in amimodels (written in PyMC2). There's also some high-level documentation here.
More specifically, the forward-backward steps are here, and the conjugate steps start here.
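For reference, the conjugate update behind a Gaussian DLM is just the standard Kalman filtering recursion, which is closed-form precisely because of normal-normal conjugacy. Here's a minimal NumPy sketch of one filtering step (the function name and argument layout are my own, not amimodels'):

```python
import numpy as np

def kalman_step(m, C, y, F, G, V, W):
    """One closed-form (conjugate) filtering step for a Gaussian DLM.

    State posterior at t-1:  theta_{t-1} | y_{1:t-1} ~ N(m, C)
    Evolution:               theta_t = G theta_{t-1} + w_t,  w_t ~ N(0, W)
    Observation:             y_t = F theta_t + v_t,          v_t ~ N(0, V)
    Returns (m_t, C_t), the moments of theta_t | y_{1:t}.
    """
    # Prior predictive for the state: a_t = G m, R_t = G C G' + W
    a = G @ m
    R = G @ C @ G.T + W
    # One-step-ahead forecast: f_t = F a_t, Q_t = F R_t F' + V
    f = F @ a
    Q = F @ R @ F.T + V
    # Kalman gain and the conjugate posterior update
    A = R @ F.T @ np.linalg.inv(Q)
    m_new = a + A @ (y - f)
    C_new = R - A @ Q @ A.T
    return m_new, C_new
```

For a scalar local-level model (all matrices 1x1, F = G = 1, V = W = 1) with prior N(0, 1) and observation y = 1, this gives posterior mean and variance both equal to 2/3, as the textbook recursion predicts.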
I have a manually constructed Theano FFBS sampler for DLMs in this article, along with a non-trivial scale-mixture extension to non-Gaussian observations. This is exactly the kind of thing we want to automate (well, the combination of things).
Also, this could easily be ported to TensorFlow; however, since TensorFlow's graph optimizations are still too basic and cumbersome to develop against, I don't plan on porting it any time soon.
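For anyone following along, the FFBS algorithm itself is short enough to sketch in plain NumPy: a Kalman forward filter that stores the filtered and predicted moments, followed by a backward pass that samples the states jointly. This is a generic textbook sketch, not the Theano implementation from the article:

```python
import numpy as np

def ffbs(y, F, G, V, W, m0, C0, rng):
    """Forward-filter backward-sample for a Gaussian DLM (sketch).

    Forward pass: Kalman filter, storing (m_t, C_t) and the state
    predictives (a_t, R_t). Backward pass: draw theta_T ~ N(m_T, C_T),
    then for t = T-1, ..., 0 draw theta_t from its conditional given
    theta_{t+1} and y_{1:t}.
    """
    T, d = len(y), len(m0)
    m = np.zeros((T, d)); C = np.zeros((T, d, d))
    a = np.zeros((T, d)); R = np.zeros((T, d, d))
    mp, Cp = m0, C0
    for t in range(T):
        # State predictive and one-step forecast
        a[t] = G @ mp
        R[t] = G @ Cp @ G.T + W
        f = F @ a[t]
        Q = F @ R[t] @ F.T + V
        # Conjugate (Kalman) update
        A = R[t] @ F.T @ np.linalg.inv(Q)
        m[t] = a[t] + A @ (y[t] - f)
        C[t] = R[t] - A @ Q @ A.T
        mp, Cp = m[t], C[t]
    # Backward sampling of the joint state trajectory
    theta = np.zeros((T, d))
    theta[T - 1] = rng.multivariate_normal(m[T - 1], C[T - 1])
    for t in range(T - 2, -1, -1):
        B = C[t] @ G.T @ np.linalg.inv(R[t + 1])
        h = m[t] + B @ (theta[t + 1] - a[t + 1])
        H = C[t] - B @ R[t + 1] @ B.T
        theta[t] = rng.multivariate_normal(h, H)
    return theta
```

Automating this would mean recognizing the Gaussian DLM structure in a model graph and emitting exactly these recursions as the sampling step, rather than writing them by hand.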