NeuralEngine
A Game Engine with embedded Machine Learning algorithms based on Gaussian Processes.
NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType > Class Template Reference

Limited-memory BFGS (L-BFGS or LM-BFGS). More...

#include <FgLBFGSBsolver.h>

Inheritance diagram for NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >:
Collaboration diagram for NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >:

Public Member Functions

 LBFGSBSolver (int numberOfVariables)
 Creates a new instance of the L-BFGS optimization algorithm. More...
 
 LBFGSBSolver (int numberOfVariables, std::function< Scalar(const af::array &, af::array &)> function)
 Creates a new instance of the L-BFGS optimization algorithm. More...
 
 LBFGSBSolver (NonlinearObjectiveFunction< Scalar > *function)
 Creates a new instance of the L-BFGS optimization algorithm. More...
 
void SetHistorySize (const int hs)
 
- Public Member Functions inherited from NeuralEngine::MachineLearning::BaseGradientOptimizationMethod< Scalar, MoreThuente >
Scalar GetTolerance ()
 Gets the relative difference threshold to be used as stopping criteria between two iterations. Default is 0 (iterate until convergence). More...
 
void SetTolerance (Scalar tolerance)
 Sets the relative difference threshold to be used as stopping criteria between two iterations. Default is 0 (iterate until convergence). More...
 
int GetMaxIterations ()
 Gets the maximum number of iterations to be performed during optimization. Default is 0 (iterate until convergence). More...
 
void SetMaxIterations (int iter)
 Sets the maximum number of iterations to be performed during optimization. Default is 0 (iterate until convergence). More...
 
int GetIterations ()
 Gets the number of iterations performed in the last call to IOptimizationMethod.Minimize(). More...
 
- Public Member Functions inherited from NeuralEngine::MachineLearning::BaseOptimizationMethod< Scalar >
virtual int GetNumberOfVariables ()
 Gets the number of variables (free parameters) in the optimization problem. More...
 
virtual af::array GetSolution ()
 Gets the current solution found, i.e. the parameter values that optimize the function. More...
 
virtual void SetSolution (af::array &x)
 Sets the current solution, i.e. the parameter values that optimize the function. More...
 
virtual Scalar GetValue ()
 Gets the output of the function at the current Solution. More...
 
virtual bool Maximize (af::array &values, int *cycle=nullptr)
 Finds the maximum value of a function. The solution vector will be made available at the Solution property. More...
 
virtual bool Minimize (af::array &values, int *cycle=nullptr)
 Finds the minimum value of a function. The solution vector will be made available at the Solution property. More...
 
virtual bool Maximize (int *cycle=nullptr)
 Finds the maximum value of a function. The solution vector will be made available at the Solution property. More...
 
virtual bool Minimize (int *cycle=nullptr)
 Finds the minimum value of a function. The solution vector will be made available at the Solution property. More...
 
void Display (bool display)
 Sets whether optimization progress information is displayed. More...
 
virtual int GetNumberOfVariables ()=0
 Gets the number of variables (free parameters) in the optimization problem. More...
 
virtual af::array GetSolution ()=0
 Gets the current solution found, i.e. the parameter values that optimize the function. More...
 
virtual void SetSolution (af::array &x)=0
 Sets the current solution, i.e. the parameter values that optimize the function. More...
 
virtual Scalar GetValue ()=0
 Gets the output of the function at the current Solution. More...
 
virtual bool Minimize (int *cycle=nullptr)=0
 Finds the minimum value of a function. The solution vector will be made available at the Solution property. More...
 
virtual bool Maximize (int *cycle=nullptr)=0
 Finds the maximum value of a function. The solution vector will be made available at the Solution property. More...
 

Protected Member Functions

virtual bool Optimize (int *cycle=nullptr) override
 Implements the actual optimization algorithm. This method should try to minimize the objective function. More...
 
std::vector< int > SortIndexes (const std::vector< std::pair< int, Scalar > > &v)
 Sorts pairs (k, v) by ascending v. More...
 
void GetGeneralizedCauchyPoint (const af::array &x, const af::array &g, af::array &x_cauchy, af::array &c)
 Computation of the generalized Cauchy point. More...
 
Scalar FindAlpha (af::array &x_cp, af::array &du, std::vector< int > &FreeVariables)
 Finds alpha* = max{a : a <= 1 and l_i-xc_i <= a*d_i <= u_i-xc_i}. More...
 
void SubspaceMinimization (af::array &x_cauchy, af::array &x, af::array &c, af::array &g, af::array &SubspaceMin)
 Solves the subspace minimization problem without the bound constraints. More...
 
Scalar GetOptimality (const af::array &x, const af::array &g)
 Computes the optimality measure for the current iterate. More...
 
- Protected Member Functions inherited from NeuralEngine::MachineLearning::BaseGradientOptimizationMethod< Scalar, MoreThuente >
 BaseGradientOptimizationMethod (int numberOfVariables)
 Initializes a new instance of the BaseGradientOptimizationMethod class. More...
 
 BaseGradientOptimizationMethod (int numberOfVariables, std::function< Scalar(const af::array &, af::array &)> function)
 Initializes a new instance of the BaseGradientOptimizationMethod class. More...
 
 BaseGradientOptimizationMethod (NonlinearObjectiveFunction< Scalar > *function)
 Initializes a new instance of the BaseGradientOptimizationMethod class. More...
 
void InitLinesearch ()
 Initializes the line search. More...
 
- Protected Member Functions inherited from NeuralEngine::MachineLearning::BaseOptimizationMethod< Scalar >
void SetValue (Scalar v)
 Sets the output of the function at the current Solution. More...
 
void SetNumberOfVariables (int n)
 Sets the number of variables (free parameters) in the optimization problem. More...
 
 BaseOptimizationMethod (int numberOfVariables)
 Initializes a new instance of the BaseOptimizationMethod class. More...
 
 BaseOptimizationMethod (int numberOfVariables, std::function< Scalar(const af::array &, af::array &)> function)
 Initializes a new instance of the BaseOptimizationMethod class. More...
 
 BaseOptimizationMethod (NonlinearObjectiveFunction< Scalar > *function)
 Initializes a new instance of the BaseOptimizationMethod class. More...
 
virtual bool Optimize (int *cycle=nullptr)=0
 Implements the actual optimization algorithm. This method should try to minimize the objective function. More...
 

Private Attributes

af::array W
 
af::array M
 
Scalar theta
 
int m_historySize
 

Additional Inherited Members

- Protected Attributes inherited from NeuralEngine::MachineLearning::BaseGradientOptimizationMethod< Scalar, MoreThuente >
int maxIterations
 
Scalar _tolerance
 
int iterations
 
ILineSearch< Scalar > * linesearch
 
- Protected Attributes inherited from NeuralEngine::MachineLearning::BaseOptimizationMethod< Scalar >
NonlinearObjectiveFunction< Scalar > * _function
 
af::array _x
 
bool _display
 
af::dtype m_dtype
 

Detailed Description

template<typename Scalar, LineSearchType LSType = MoreThuente>
class NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >

Limited-memory BFGS (L-BFGS or LM-BFGS).


Limited-memory BFGS (L-BFGS or LM-BFGS) is an optimization algorithm in the family of quasi-Newton methods that approximates the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm using a limited amount of computer memory. It is a popular algorithm for parameter estimation in machine learning. The algorithm's target problem is to minimize $f(\mathbf{x})$ over unconstrained values of the real vector $\mathbf{x}$, where $f$ is a differentiable scalar function.

Like the original BFGS, L-BFGS uses an estimate of the inverse Hessian matrix to steer its search through variable space, but where BFGS stores a dense $n\times n$ approximation to the inverse Hessian ($n$ being the number of variables in the problem), L-BFGS stores only a few vectors that represent the approximation implicitly. Due to its resulting linear memory requirement, the L-BFGS method is particularly well suited for optimization problems with a large number of variables. Instead of the inverse Hessian $\mathbf{H}_k$, L-BFGS maintains a history of the past $m$ updates of the position $\mathbf{x}$ and gradient $\nabla f(\mathbf{x})$, where generally the history size $m$ can be small (often $m<10$). These updates are used to implicitly perform operations requiring the $\mathbf{H}_k$-vector product.

This class implements the bound-constrained variant L-BFGS-B, which handles simple box constraints $l \le \mathbf{x} \le u$ by computing a generalized Cauchy point followed by a subspace minimization in each iteration (see the protected members below).

HmetalT, 02.05.2019.

Definition at line 76 of file FgLBFGSBsolver.h.
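
A minimal usage sketch under the signatures documented on this page; the include path and namespace are taken from this page, and the assumption that the af::array passed to Minimize acts as the starting point is not verified against the engine:

#include <arrayfire.h>
#include <cstdio>
#include "FgLBFGSBsolver.h"

using NeuralEngine::MachineLearning::LBFGSBSolver;

int main()
{
    const int n = 10;

    // Objective f(x) = 0.5 * ||x - 1||^2; the value is returned and the
    // gradient (x - 1) is written into the second argument, following the
    // std::function<Scalar(const af::array&, af::array&)> signature above.
    auto objective = [](const af::array& x, af::array& grad) -> double
    {
        af::array r = x - 1.0;
        grad = r;
        return 0.5 * af::sum<double>(r * r);
    };

    LBFGSBSolver<double> solver(n, objective);   // default MoreThuente line search
    solver.SetMaxIterations(100);
    solver.SetTolerance(1e-6);
    solver.SetHistorySize(8);
    solver.Display(true);

    af::array x0 = af::randu(n, f64);            // assumed to act as the initial guess
    bool ok = solver.Minimize(x0);

    af::array xOpt = solver.GetSolution();
    std::printf("converged: %d, f* = %g\n", ok ? 1 : 0, solver.GetValue());
    af_print(xOpt);
    return ok ? 0 : 1;
}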

Constructor & Destructor Documentation

◆ LBFGSBSolver() [1/3]

template<typename Scalar , LineSearchType LSType = MoreThuente>
NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::LBFGSBSolver ( int  numberOfVariables)

Creates a new instance of the L-BFGS optimization algorithm.

Admin, 3/27/2017.

Parameters
numberOfVariables  The number of free parameters in the optimization problem.

◆ LBFGSBSolver() [2/3]

template<typename Scalar , LineSearchType LSType = MoreThuente>
NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::LBFGSBSolver ( int  numberOfVariables,
std::function< Scalar(const af::array &, af::array &)>  function 
)

Creates a new instance of the L-BFGS optimization algorithm.

Admin, 3/27/2017.

Parameters
numberOfVariables  The number of free parameters in the function to be optimized.
function  [in,out] The function to be optimized; it returns the objective value and writes the gradient into its second af::array argument.
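
A hedged sketch of a callback matching this signature, using the two-dimensional Rosenbrock function as an illustrative objective; the convention that the value is returned and the gradient is written into the second argument is inferred from the std::function type:

#include <arrayfire.h>

// f(x) = (1 - x0)^2 + 100 (x1 - x0^2)^2 with its analytic gradient.
auto rosenbrock = [](const af::array& x, af::array& grad) -> double
{
    double v[2];
    x.as(f64).host(v);                       // copy the two entries to the host
    const double x0 = v[0], x1 = v[1];

    const double g[2] = {
        -2.0 * (1.0 - x0) - 400.0 * x0 * (x1 - x0 * x0),   // df/dx0
        200.0 * (x1 - x0 * x0)                             // df/dx1
    };
    grad = af::array(2, g);                  // upload the gradient to the device

    return (1.0 - x0) * (1.0 - x0) + 100.0 * (x1 - x0 * x0) * (x1 - x0 * x0);
};

// Usage with the two-argument constructor documented above:
// LBFGSBSolver<double> solver(2, rosenbrock);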

◆ LBFGSBSolver() [3/3]

template<typename Scalar , LineSearchType LSType = MoreThuente>
NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::LBFGSBSolver ( NonlinearObjectiveFunction< Scalar > *  function)

Creates a new instance of the L-BFGS optimization algorithm.

Admin, 3/27/2017.

Parameters
function  The objective function and its gradient whose optimum should be found.

Member Function Documentation

◆ Optimize()

template<typename Scalar , LineSearchType LSType = MoreThuente>
virtual bool NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::Optimize ( int *  cycle = nullptr )
override protected virtual

Implements the actual optimization algorithm. This method should try to minimize the objective function.

Hmetal T, 11.04.2017.

Returns
true if it succeeds, false if it fails.

Implements NeuralEngine::MachineLearning::BaseOptimizationMethod< Scalar >.

◆ SortIndexes()

template<typename Scalar , LineSearchType LSType = MoreThuente>
std::vector< int > NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::SortIndexes ( const std::vector< std::pair< int, Scalar > > &  v)
protected

Sorts pairs (k, v) by ascending v.

Hmetal T, 12/06/2019.

Parameters
v  The std::vector<std::pair<int, Scalar>> of (index, value) pairs to sort by value.
Returns
The sorted indexes.
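
For comparison, a standard-library sketch of the same operation, assuming the intended result is the list of first elements ordered by ascending second element (SortIndexesSketch is a hypothetical stand-in, not the class member):

#include <algorithm>
#include <utility>
#include <vector>

// Hypothetical stand-in for SortIndexes: return the .first members of v,
// ordered by ascending .second.
template <typename Scalar>
std::vector<int> SortIndexesSketch(std::vector<std::pair<int, Scalar>> v)
{
    std::sort(v.begin(), v.end(),
              [](const std::pair<int, Scalar>& a, const std::pair<int, Scalar>& b)
              { return a.second < b.second; });

    std::vector<int> idx;
    idx.reserve(v.size());
    for (const auto& p : v)
        idx.push_back(p.first);
    return idx;
}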

◆ GetGeneralizedCauchyPoint()

template<typename Scalar , LineSearchType LSType = MoreThuente>
void NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::GetGeneralizedCauchyPoint ( const af::array &  x,
const af::array &  g,
af::array &  x_cauchy,
af::array &  c 
)
protected

Computation of the generalized Cauchy point.

Hmetal T, 12/06/2019.

Parameters
x  The current iterate.
g  The gradient at x.
x_cauchy  [in,out] The generalized Cauchy point.
c  [in,out] Auxiliary output used by the subsequent subspace minimization.
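
As background (standard L-BFGS-B theory, not taken from this header): the generalized Cauchy point is the first local minimizer of the quadratic model along the piecewise-linear projected steepest-descent path $x(t) = P(x - t\,g,\, l,\, u)$, whose breakpoints for bounds $l \le x \le u$ are

$t_i = (x_i - u_i)/g_i$ if $g_i < 0$, $\qquad t_i = (x_i - l_i)/g_i$ if $g_i > 0$, $\qquad t_i = \infty$ otherwise.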

◆ FindAlpha()

template<typename Scalar , LineSearchType LSType = MoreThuente>
Scalar NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::FindAlpha ( af::array &  x_cp,
af::array &  du,
std::vector< int > &  FreeVariables 
)
protected

Finds alpha* = max{a : a <= 1 and l_i-xc_i <= a*d_i <= u_i-xc_i}.

Hmetal T, 12/06/2019.

Parameters
x_cp  [in,out] The generalized Cauchy point.
du  [in,out] The step direction computed in the subspace of free variables.
FreeVariables  [in,out] The indices of the free variables.
Returns
The found alpha.
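
Restated in display form (the restriction to the free variables is implied by the FreeVariables parameter):

$\alpha^{*} = \max\{\, \alpha : \alpha \le 1 \ \text{and}\ l_i - x^{c}_i \le \alpha\, d_i \le u_i - x^{c}_i \ \text{for all free } i \,\}$,

where $x^{c}$ is the generalized Cauchy point (x_cp), $d$ the subspace direction (du), and $l$, $u$ the lower and upper bounds.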

◆ SubspaceMinimization()

template<typename Scalar , LineSearchType LSType = MoreThuente>
void NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::SubspaceMinimization ( af::array &  x_cauchy,
af::array &  x,
af::array &  c,
af::array &  g,
af::array &  SubspaceMin 
)
protected

Solves the subspace minimization problem without the bound constraints.

Hmetal T, 12/06/2019.

Parameters
x_cauchy  [in,out] The generalized Cauchy point.
x  [in,out] The current iterate.
c  [in,out] Auxiliary vector from the Cauchy point computation.
g  [in,out] The gradient at x.
SubspaceMin  [in,out] The subspace minimizer.
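
As background (standard L-BFGS-B, not taken from this header): the subspace step minimizes the quadratic model of the objective over the variables that are free at the generalized Cauchy point $x^{c}$, ignoring the bounds; FindAlpha then backtracks the resulting direction into the feasible box:

$\min_{x}\ m(x) = f(x_k) + g^\top (x - x_k) + \tfrac{1}{2} (x - x_k)^\top B_k (x - x_k)$, subject to $x_i = x^{c}_i$ for every variable $i$ that is at a bound at $x^{c}$.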

◆ GetOptimality()

template<typename Scalar , LineSearchType LSType = MoreThuente>
Scalar NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::GetOptimality ( const af::array &  x,
const af::array &  g 
)
protected

Computes the optimality measure for the current iterate.

Hmetal T, 12/06/2019.

Parameters
x  The current iterate.
g  The gradient at x.
Returns
The optimality.
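
The header does not state which measure is used; a common choice for box-constrained problems, given here only as background, is the infinity norm of the projected gradient,

$\| P(x - g,\, l,\, u) - x \|_\infty$,

where $P(\cdot,\, l,\, u)$ clamps its argument to the box $[l, u]$.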

Member Data Documentation

◆ W

template<typename Scalar , LineSearchType LSType = MoreThuente>
af::array NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::W
private

Definition at line 196 of file FgLBFGSBsolver.h.

◆ M

template<typename Scalar , LineSearchType LSType = MoreThuente>
af::array NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::M
private

Definition at line 196 of file FgLBFGSBsolver.h.

◆ theta

template<typename Scalar , LineSearchType LSType = MoreThuente>
Scalar NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::theta
private

Definition at line 197 of file FgLBFGSBsolver.h.

◆ m_historySize

template<typename Scalar , LineSearchType LSType = MoreThuente>
int NeuralEngine::MachineLearning::LBFGSBSolver< Scalar, LSType >::m_historySize
private

Definition at line 198 of file FgLBFGSBsolver.h.


The documentation for this class was generated from the following file:
FgLBFGSBsolver.h