Hi thangbui,
Firstly, thanks a lot for sharing this code; it's great.
Does the repo contain code for a Deep GP latent variable model, or is this not fully implemented yet?
Unfortunately, there are still many models missing or not fully implemented. My original goal was to have all of the following models and inference schemes under one package (the aim is not to have another GPflow or GPy, but something more research-oriented and pedagogical):
various GP models: GP regression/classification, deep GPs for regression/classification, GP/deep GP latent variable models, GP state space model, ...
inference methods using inducing points for all of the models above: variational methods (Titsias style), (Power) EP, and approximate Power EP a.k.a. Black-box alpha (a minimal sketch of the collapsed variational bound is given after this list),
different posterior approximations for deep GPs: one with explicit representations for the intermediate hidden variables (as in the original deep GP paper by Damianou and Lawrence) and one with somewhat implicit hidden representations (as in the nested variational compression paper by Hensman and Lawrence, or in Bui et al. (2016) and Salimbeni and Deisenroth (2017)),
different propagation techniques for deep and state space architectures: probabilistic backpropagation (moment matching), simple Monte Carlo (see the sketch after this list), linearisation, etc., similar to the extended and unscented methods in the sys-id literature,
by using EP-style representations, inference in distributed/online settings should be easy to handle (sketched briefly after this list as well).
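To make the Titsias-style option above concrete, here is a minimal numpy sketch of the collapsed variational lower bound for sparse GP regression. The RBF kernel, jitter value and toy data are assumptions for illustration only, not what the package implements.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # squared-exponential kernel matrix between two sets of inputs
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def titsias_collapsed_bound(X, y, Z, noise_var=0.1):
    """Collapsed variational lower bound for sparse GP regression
    (Titsias, 2009): log N(y | 0, Qnn + s2*I) - tr(Knn - Qnn) / (2*s2),
    where Qnn = Knm Kmm^-1 Kmn."""
    N, M = X.shape[0], Z.shape[0]
    Kmm = rbf(Z, Z) + 1e-6 * np.eye(M)
    Kmn = rbf(Z, X)
    knn_diag = np.full(N, 1.0)              # kernel variance on the diagonal
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Kmn)             # L^-1 Kmn, so Qnn = A^T A
    Qnn = A.T @ A
    Lc = np.linalg.cholesky(Qnn + noise_var * np.eye(N))
    alpha = np.linalg.solve(Lc, y)
    log_marg = (-0.5 * (alpha @ alpha)
                - np.log(np.diag(Lc)).sum()
                - 0.5 * N * np.log(2 * np.pi))
    trace_term = (knn_diag - np.diag(Qnn)).sum() / (2 * noise_var)
    return log_marg - trace_term

# toy usage
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
Z = X[rng.choice(50, 10, replace=False)]
print(titsias_collapsed_bound(X, y, Z))
```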
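Similarly, a toy sketch of the simple Monte Carlo propagation idea: draw function samples layer by layer through a two-layer deep GP prior and average to estimate the output moments. Kernels, jitter and layer sizes are again illustrative assumptions (no inducing points, no learned posteriors).

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_prior_sample(X, rng, jitter=1e-5):
    # one draw of f(X) from a zero-mean GP prior with an RBF kernel
    K = rbf(X, X) + jitter * np.eye(X.shape[0])
    return np.linalg.cholesky(K) @ rng.standard_normal(X.shape[0])

def deep_gp_moments_mc(X, n_samples=200, seed=0):
    """Estimate per-input mean/variance of a 2-layer deep GP prior by
    simple Monte Carlo: sample layer 1 at X, evaluate layer 2 at those
    sampled hidden values, and average over draws."""
    rng = np.random.default_rng(seed)
    fs = []
    for _ in range(n_samples):
        h = gp_prior_sample(X, rng)[:, None]   # hidden layer values, shape (N, 1)
        fs.append(gp_prior_sample(h, rng))     # output layer evaluated at h
    fs = np.stack(fs)
    return fs.mean(axis=0), fs.var(axis=0)

mean, var = deep_gp_moments_mc(np.linspace(-3, 3, 20)[:, None])
print(mean[:3], var[:3])
```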
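Finally, the reason EP-style representations suit distributed/online settings is that the approximate posterior over the inducing outputs factorises into per-shard Gaussian site terms, so adding or removing a shard is just adding or subtracting natural parameters. A hand-wavy sketch with made-up site values:

```python
import numpy as np

def to_moments(eta1, eta2):
    # natural parameters (precision * mean, precision) -> (mean, covariance)
    cov = np.linalg.inv(eta2)
    return cov @ eta1, cov

M = 3                              # number of inducing points (toy size)
prior = (np.zeros(M), np.eye(M))   # N(0, I) prior on u, in natural parameters

# hypothetical Gaussian site factors fitted on two data shards
site_a = (np.array([0.5, -0.2, 0.1]), 0.8 * np.eye(M))
site_b = (np.array([-0.1, 0.3, 0.0]), 0.5 * np.eye(M))

# global approximation = prior * site_a * site_b: add natural parameters
eta1 = prior[0] + site_a[0] + site_b[0]
eta2 = prior[1] + site_a[1] + site_b[1]
post_mean, post_cov = to_moments(eta1, eta2)

# removing shard b (cavity distribution / online deletion) is a subtraction
cav_mean, cav_cov = to_moments(eta1 - site_b[0], eta2 - site_b[1])
print(post_mean, cav_mean)
```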
I've been busy with other projects, but I hope to get back to this very soon.
Related to #7: we still need to figure out a good initialisation scheme.
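One common heuristic, noted here purely as an assumption rather than a decided scheme for this repo: initialise the inducing inputs from a random subset (or k-means centres) of the data, and initialise the hidden-layer means with a PCA projection of the inputs so the first layer starts near a linear map.

```python
import numpy as np

def init_deep_gp(X, n_inducing=20, hidden_dim=2, seed=0):
    """Heuristic initialisation for a (sparse) deep GP:
    - inducing inputs Z: a random subset of the training inputs
      (k-means centres are a common alternative),
    - hidden-layer means H: top principal components of X, so the
      first layer starts close to a linear projection."""
    rng = np.random.default_rng(seed)
    Z = X[rng.choice(X.shape[0], n_inducing, replace=False)]
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    H = Xc @ Vt[:hidden_dim].T
    return Z, H

X = np.random.default_rng(1).standard_normal((100, 5))
Z, H = init_deep_gp(X)
print(Z.shape, H.shape)   # (20, 5), (100, 2)
```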