Sparse Gaussian Process Regression (SGPR) with optimization through Power Expectation Propagation (PEP).
#include <FgPEPSparseGPR.h>
Public Member Functions
SGPR (const af::array &Y, const af::array &X, int numInducing=20, LogLikType lType=LogLikType::Gaussian)
Constructor.
void | Inference (Scalar alpha=1.0, int numIter=10, bool parallelUpdate=false, Scalar decay=0.5)
Inference.
Public Member Functions inherited from SparseGPBaseModel< Scalar >
SparseGPBaseModel (const af::array &Y, const af::array &X, int numInducing=200, LogLikType lType=LogLikType::Gaussian)
Constructor.
SparseGPBaseModel ()
Default constructor.
virtual | ~SparseGPBaseModel ()
Destructor.
virtual void | PredictF (const af::array &testInputs, af::array &mf, af::array &vf) override
Predict noise-free function values \(\mathbf{F}_*\).
virtual void | SampleY (const af::array inputs, int numSamples, af::array &outFunctions) override
Generate function samples from the posterior.
af::array | GetTrainingInputs ()
Gets training inputs X.
void | SetTrainingInputs (af::array &inputs)
Sets training inputs X.
af::array | GetPseudoInputs ()
Gets pseudo inputs.
virtual bool | Init () override
Initializes the model.
virtual int | GetNumParameters () override
Gets the number of parameters.
virtual void | SetParameters (const af::array &param) override
Sets the parameters for each optimization iteration.
virtual af::array | GetParameters () override
Gets the parameters for each optimization iteration.
virtual void | UpdateParameters () override
Updates the parameters.
virtual void | FixKernelParameters (bool isfixed)
Sets whether the kernel hyperparameters are fixed during optimization.
virtual void | FixInducing (bool isfixed)
Sets whether the inducing inputs are fixed during optimization.
SparseGPBaseLayer< Scalar > * | GetGPLayer ()
Gets the GP layer.
Public Member Functions inherited from GPBaseModel< Scalar >
GPBaseModel (const af::array &Y, LogLikType lType=LogLikType::Gaussian, ModelType mtype=ModelType::GPR)
Constructor.
GPBaseModel ()
Default constructor.
virtual | ~GPBaseModel ()
Destructor.
virtual void | Optimise (OptimizerType method=L_BFGS, Scalar tol=0.0, bool reinit_hypers=true, int maxiter=1000, int mb_size=0, LineSearchType lsType=MoreThuente, bool disp=true, int *cycle=nullptr)
Optimizes the model parameters for the best fit.
virtual bool | Init ()
Initializes the model.
virtual void | PredictF (const af::array &testInputs, af::array &mf, af::array &vf)
Predict noise-free function values \(\mathbf{F}_*\).
virtual void | PredictY (const af::array &testInputs, af::array &my, af::array &vy)
Prediction of test outputs \(\mathbf{Y}_*\).
virtual void | SampleY (const af::array inputs, int numSamples, af::array &outFunctions)
Generate function samples from the posterior.
virtual void | AddData (const af::array Ytrain)
Adds training data to the model.
af::array | GetTrainingData ()
Gets the training data set Y.
void | SetTrainingData (af::array &data)
Sets training data Y.
virtual int | GetNumParameters ()
Gets the number of parameters.
virtual void | SetParameters (const af::array &param)
Sets the parameters for each optimization iteration.
virtual af::array | GetParameters ()
Gets the parameters for each optimization iteration.
virtual void | UpdateParameters ()
Updates the parameters.
virtual void | FixLikelihoodParameters (bool isfixed)
Sets whether the likelihood parameters are fixed during optimization.
void | SetSegments (af::array segments)
Sets the start index array for the sequences.
af::array | GetSegments ()
Gets the start index array for the sequences.
Public Member Functions inherited from IModel
virtual Scalar | Function (const af::array &x, af::array &outGradient)
Cost function for the given x inputs.
virtual int | GetNumParameters ()=0
Gets the number of parameters to be optimized.
virtual void | SetParameters (const af::array &param)=0
Sets the parameters for each optimization iteration.
virtual af::array | GetParameters ()=0
Gets the parameters for each optimization iteration.
virtual void | UpdateParameters ()=0
Updates the parameters.
int | GetDataLenght ()
Gets the data length.
int | GetDataDimensionality ()
Gets the data dimensionality.
ModelType | GetModelType ()
Gets the model type.
virtual void | SetBatchSize (int size)
Sets the batch size.
int | GetBatchSize ()
Gets the batch size.
void | SetIndexes (af::array &indexes)
Sets the batch indexes.
Private Member Functions
template<class Archive >
void | serialize (Archive &ar, unsigned int version)
Friends
class | boost::serialization::access
Additional Inherited Members
Protected Member Functions inherited from IModel
IModel (int numData, int numDimension, ModelType type)
Constructor.
Protected Attributes inherited from SparseGPBaseModel< Scalar >
int | ik
Number of inducing inputs.
int | iq
Latent dimension.
af::array | afX
Training inputs.
SparseGPBaseLayer< Scalar > * | gpLayer
GP layer.
Protected Attributes inherited from GPBaseModel< Scalar >
bool | bInit
Check if the model is initialized.
af::array | afY
Training dataset, mean subtracted.
af::array | afBias
The bias.
af::array | afSegments
Index of starting positions for all trials.
LikelihoodBaseLayer< Scalar > * | likLayer
Likelihood layer.
Protected Attributes inherited from IModel
ModelType | mType
int | iN
Dataset length.
int | iD
Dataset dimension.
int | iBatchSize
Size of the batch.
af::array | afIndexes
Indexes of \(\mathbf{X}\) for batch learning.
af::dtype | m_dType
Floating-point precision flag for af::array.
Sparse Gaussian Process Regression (SGPR) with optimization through Power Expectation Propagation (PEP).
PEP, as an extension of EP, minimizes an α-divergence. This is equivalent to minimizing the KL divergence to the exact distribution raised to a power. PEP can therefore be seen as a hybrid between regular EP (α = 1) and variational inference via the Variational Free Energy (VFE) (α → 0).
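In Minka's convention, the α-divergence in question is
\[ D_\alpha\!\left(p \,\|\, q\right) = \frac{1}{\alpha(1-\alpha)} \int \left( \alpha\, p(x) + (1-\alpha)\, q(x) - p(x)^{\alpha} q(x)^{1-\alpha} \right) dx, \]
which recovers \(\mathrm{KL}(p\,\|\,q)\) in the limit \(\alpha \to 1\) (EP) and \(\mathrm{KL}(q\,\|\,p)\) in the limit \(\alpha \to 0\) (VFE).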
For more information see https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-2004-149.pdf and https://arxiv.org/pdf/1605.07066.pdf
, 28.02.2018.
Definition at line 44 of file FgPEPSparseGPR.h.
NeuralEngine::MachineLearning::GPModels::PowerEP::SGPR< Scalar >::SGPR (const af::array &Y, const af::array &X, int numInducing = 20, LogLikType lType = LogLikType::Gaussian)
Constructor.
, 12.06.2018.
Y | The training data.
X | The training inputs.
numInducing | (Optional) number of inducing inputs.
lType | (Optional) likelihood type.
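A minimal construction sketch. The namespace, class name, and LogLikType are taken from this page; the Scalar type (float), the N x D data layout, and the toy data are assumptions:

```cpp
#include <arrayfire.h>
#include <FgPEPSparseGPR.h>

using namespace NeuralEngine::MachineLearning::GPModels;

// Toy 1-D regression data (layout assumed to be N x D).
af::array X = af::randu(500, 1) * 10.0f;
af::array Y = af::sin(X) + 0.1f * af::randn(500, 1);

// Sparse GP regression with 20 inducing inputs and a Gaussian likelihood.
PowerEP::SGPR<float> model(Y, X, 20, LogLikType::Gaussian);
```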
void NeuralEngine::MachineLearning::GPModels::PowerEP::SGPR< Scalar >::Inference (Scalar alpha = 1.0, int numIter = 10, bool parallelUpdate = false, Scalar decay = 0.5)
Inference.
Runs the Power EP algorithm to optimize the posterior.
, 12.06.2018.
alpha | (Optional) the α-divergence parameter; α = 1 corresponds to EP, α → 0 to VFE.
numIter | (Optional) number of iterations.
parallelUpdate | (Optional) true to update the approximating factors in parallel.
decay | (Optional) the decay (damping) factor applied to the updates.
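A minimal end-to-end sketch of how Inference might fit into the workflow (construct, Init, Inference, predict). Only the signatures are taken from this page; the call ordering, Scalar type (float), data layout, and toy data are assumptions:

```cpp
#include <arrayfire.h>
#include <FgPEPSparseGPR.h>

using namespace NeuralEngine::MachineLearning::GPModels;

int main()
{
    // Toy 1-D regression data (layout assumed to be N x D).
    af::array X = af::randu(500, 1) * 10.0f;
    af::array Y = af::sin(X) + 0.1f * af::randn(500, 1);

    PowerEP::SGPR<float> model(Y, X, /*numInducing=*/20);
    model.Init();   // initialize the model

    // alpha = 1 behaves like EP, alpha -> 0 approaches VFE;
    // decay damps the factor updates between iterations.
    model.Inference(/*alpha=*/0.5, /*numIter=*/20, /*parallelUpdate=*/false, /*decay=*/0.5);

    // Posterior over the noise-free function F* and the noisy outputs Y* at test inputs.
    af::array Xtest = af::iota(af::dim4(100)) / 10.0f;
    af::array mf, vf, my, vy;
    model.PredictF(Xtest, mf, vf);
    model.PredictY(Xtest, my, vy);

    return 0;
}
```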
template<class Archive >
void NeuralEngine::MachineLearning::GPModels::PowerEP::SGPR< Scalar >::serialize (Archive &ar, unsigned int version)

inline, private
Definition at line 83 of file FgPEPSparseGPR.h.
friend class boost::serialization::access
Definition at line 80 of file FgPEPSparseGPR.h.