Classes

| class | Adadelta | An adaptive learning rate Trainer (Zeiler). |
| class | Backpropagation | A classic SGDTrainer. |
| class | Layer | A layer in a NeuralNet. |
| class | NeuralNet | A neural network. |
| class | Neuron | A neuron in a Layer of a NeuralNet. |
| class | Pruner | Removes unnecessary neurons from a NeuralNet. |
| class | SGDTrainer | A classic backpropagation SGD Trainer. |
| class | Trainer | Trains neural networks. |
Typedefs

| typedef | double(* ActivationFunction)(double) | The type of an activation function. |
Functions

| double | sigmoid (double initialOutput) | A sigmoid function. |
| double | binary (double initialOutput) | A binary function. |
| double | integer (double initialOutput) | A flooring function. |
| double | simpleLinear (double initialOutput) | A linear function. |
| double | tanSigmoid (double initialOutput) | A tangential sigmoid or a hyperbolic tangent function. |
| double | sigmoidDerivative (double neuronOuput) | Derivative of the sigmoid activation function. |
| double | simpleLinearDerivative (double neuronOutput) | Derivative of the linear activation function. |
| double | tanSigmoidDerivative (double neuronOutput) | Derivative of the tangential sigmoid activation function. |
Typedef Documentation

typedef double(* net::ActivationFunction)(double)

The type of an activation function. Each activation function must take in a double and output a double.
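As a quick illustration of this signature, the sketch below defines a free function that matches the typedef and calls it through an ActivationFunction-style pointer. The ReLU body, the local typedef, and main() are illustrative assumptions, not part of the library.

    #include <algorithm>
    #include <iostream>

    // Mirrors the documented typedef: a pointer to a double -> double function.
    typedef double (*ActivationFunction)(double);

    // Any free function with this shape can be used as an activation function.
    double relu(double initialOutput) {
        return std::max(0.0, initialOutput);  // example body only, not a library function
    }

    int main() {
        ActivationFunction activation = relu;  // assign like any function pointer
        std::cout << activation(-2.5) << " " << activation(3.0) << "\n";  // prints "0 3"
    }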
Function Documentation

double net::binary (double initialOutput)  [inline]

A binary function.
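A minimal sketch of a typical binary (step) activation. The exact threshold and return values of net::binary are not shown on this page, so the 0/1 split at zero below is an assumption.

    // Assumed convention: non-positive activations map to 0, positive activations to 1.
    double binaryStep(double initialOutput) {
        return initialOutput > 0.0 ? 1.0 : 0.0;
    }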
double net::integer (double initialOutput)  [inline]

A flooring function.
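Read literally, a flooring activation rounds its input down to the nearest integer; the sketch below assumes that reading.

    #include <cmath>

    // Assumed behaviour: round the activation down to the nearest whole number.
    double flooringActivation(double initialOutput) {
        return std::floor(initialOutput);
    }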
double net::sigmoid (double initialOutput)  [inline]

A sigmoid function: an "s-shaped" curve centred on an activation value of 0 that produces a smooth, graded output rather than a hard step. Similar to the hyperbolic tangent function.
double net::sigmoidDerivative (double neuronOuput)  [inline]

Derivative of the sigmoid activation function.
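Because the parameter is the neuron's output rather than its raw input, the derivative of the logistic sigmoid can be written purely in terms of that output; the sketch below assumes that convention.

    // If y = sigmoid(x), then dy/dx = y * (1 - y), so only the neuron's output is needed.
    double logisticSigmoidDerivative(double neuronOutput) {
        return neuronOutput * (1.0 - neuronOutput);
    }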
double net::simpleLinear (double initialOutput)  [inline]

A linear function.
double net::simpleLinearDerivative (double neuronOutput)  [inline]

Derivative of the linear activation function.
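For an identity-style linear activation the derivative is a constant. The sketch below assumes the identity form for both the linear activation and its derivative; the library's exact slope is not documented on this page.

    // Assumed identity form; a scaled linear activation would return slope * initialOutput.
    double identityLinear(double initialOutput) {
        return initialOutput;
    }

    // The derivative of the identity function is 1 for every neuron output.
    double identityLinearDerivative(double /*neuronOutput*/) {
        return 1.0;
    }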
double net::tanSigmoid (double initialOutput)  [inline]

A tangential sigmoid, i.e. a hyperbolic tangent function: an "s-shaped" curve centred on an activation value of 0 that produces a smooth, graded output rather than a hard step. Similar to the sigmoid function.
double net::tanSigmoidDerivative (double neuronOutput)  [inline]

Derivative of the tangential sigmoid activation function.
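As with the sigmoid derivative, the derivative of tanh can be expressed purely in terms of the neuron's output; the sketch below assumes that convention.

    // If y = tanh(x), then dy/dx = 1 - y * y.
    double tanhDerivative(double neuronOutput) {
        return 1.0 - neuronOutput * neuronOutput;
    }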