The Kullback–Leibler divergence is often used for continuous distributions (direct regression).
Public Member Functions

- `KLDivergence(const bool takeMean = false)`
  Create the Kullback–Leibler divergence object with the specified parameters.
- `template<typename InputType, typename TargetType, typename OutputType>`
  `void Backward(const InputType& input, const TargetType& target, OutputType& output)`
  Ordinary feed-backward pass of a neural network.
- `template<typename InputType, typename TargetType>`
  `InputType::elem_type Forward(const InputType& input, const TargetType& target)`
  Computes the Kullback–Leibler divergence error function.
- `OutputDataType& OutputParameter() const`
  Get the output parameter.
- `OutputDataType& OutputParameter()`
  Modify the output parameter.
- `template<typename Archive>`
  `void serialize(Archive& ar, const unsigned int)`
  Serialize the loss function.
- `bool TakeMean() const`
  Get the value of takeMean.
- `bool& TakeMean()`
  Modify the value of takeMean.
template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::KLDivergence< InputDataType, OutputDataType >
The Kullback–Leibler divergence is often used for continuous distributions (direct regression).
For more information, see the following paper.
@article{Kullback1951,
  title   = {On Information and Sufficiency},
  author  = {S. Kullback and R. A. Leibler},
  journal = {The Annals of Mathematical Statistics},
  year    = {1951}
}
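For reference, the quantity this loss is built on is the standard Kullback–Leibler divergence between a predicted distribution \(P\) and a target distribution \(Q\) (for discrete distributions; the continuous case replaces the sum with an integral):

```latex
D_{\mathrm{KL}}(P \parallel Q) = \sum_i p_i \log \frac{p_i}{q_i}
```

The `takeMean` flag in the constructor presumably selects a mean reduction over the elements instead of the plain sum; consult the mlpack source for the exact reduction used.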
Template Parameters

| InputDataType | Type of the input data (arma::colvec, arma::mat, arma::sp_mat, or arma::cube). |
| OutputDataType | Type of the output data (arma::colvec, arma::mat, arma::sp_mat, or arma::cube). |
Definition at line 45 of file kl_divergence.hpp.