mlpack 3.4.2
To represent the various types of loss functions encountered in machine learning problems, mlpack provides the FunctionType template parameter in the optimizer interface. The optimizers available in the core library rely on this policy to obtain the information required by the optimization algorithm.
The FunctionType template parameter required by the Optimizer class can have additional requirements imposed on it, depending on the type of optimizer used.
The most basic requirement for the FunctionType parameter is the implementation of two public member functions, with the following interface and semantics:
Optimizers such as SGD and RMSProp require a DecomposableFunctionType, which must satisfy the following requirements:
The ParallelSGD optimizer requires a SparseFunctionType interface. SparseFunctionType requires the gradient to be returned in a sparse matrix (arma::sp_mat), because ParallelSGD, implemented with the HOGWILD! scheme of unsynchronised updates, is expected to be useful only in situations where the individual gradients are sparse. The interface therefore requires member functions with the following signatures:
The SCD optimizer requires a ResolvableFunctionType interface in order to calculate partial gradients with respect to individual features. The optimizer requires the decision variable to be arranged in a particular fashion to allow for disjoint updates: the features should be arranged columnwise in the decision variable. For example, in SoftmaxRegressionFunction the decision variable has size numClasses x featureSize (+ 1 if an intercept also needs to be fit). Similarly, for LogisticRegression, the decision variable is a row vector, with the number of columns determined by the dimensionality of the dataset.
The interface expects the following member functions from the function class: