Struct nn::Trainer
pub struct Trainer<'a, 'b> { /* some fields omitted */ }
Used to specify options that dictate how a network will be trained.
Methods
impl<'a, 'b> Trainer<'a, 'b>
Trainer is used to chain together options that specify how to train a network. All of the options are optional because the Trainer struct has built-in default values for each one. The go() method must be called, however, or the network will not be trained.
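For illustration, here is a minimal sketch of the builder chain, assuming the Trainer is obtained from a train method on the crate's NN network type and that NN and HaltCondition are exported at the crate root; the layer sizes and XOR data below are made up for the example:

```rust
use nn::{NN, HaltCondition};

fn main() {
    // XOR training data as (input, expected output) pairs.
    let examples = [
        (vec![0.0, 0.0], vec![0.0]),
        (vec![0.0, 1.0], vec![1.0]),
        (vec![1.0, 0.0], vec![1.0]),
        (vec![1.0, 1.0], vec![0.0]),
    ];

    // A network with 2 inputs, one hidden layer of 3 neurons, and 1 output.
    let mut net = NN::new(&[2, 3, 1]);

    // Chain any options you need; anything left out falls back to its default.
    // Nothing happens until go() is called.
    net.train(&examples)
        .rate(0.3)
        .momentum(0.1)
        .log_interval(Some(100))
        .halt_condition(HaltCondition::Epochs(10_000))
        .go();
}
```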
fn rate(&mut self, rate: f64) -> &mut Trainer<'a, 'b>
Specifies the learning rate to be used when training (default is 0.3).
This is the step size that is used in the backpropagation algorithm.
fn momentum(&mut self, momentum: f64) -> &mut Trainer<'a, 'b>
Specifies the momentum to be used when training (default is 0.0).
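As a rough, illustrative sketch (not the crate's internal code), the learning rate and momentum enter a backpropagation weight update along these lines:

```rust
// Illustrative only. `gradient` is the error gradient for one weight and
// `prev_delta` is the change applied to that weight on the previous update.
// The momentum term re-applies a fraction of the previous change, which can
// smooth and speed up convergence; momentum = 0.0 disables it.
fn update_weight(weight: f64, gradient: f64, prev_delta: f64,
                 rate: f64, momentum: f64) -> (f64, f64) {
    let delta = rate * gradient + momentum * prev_delta;
    (weight + delta, delta) // new weight, and the delta to remember next time
}
```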
fn log_interval(&mut self, log_interval: Option<u32>) -> &mut Trainer<'a, 'b>
Specifies how often (measured in batches) to log the current error rate (mean squared error) during training. Some(x) means log after every x batches and None means never log.
fn halt_condition(&mut self, halt_condition: HaltCondition) -> &mut Trainer<'a, 'b>
Specifies when to stop training. Epochs(x) will stop the training after x epochs (one epoch is one loop through all of the training examples), while MSE(e) will stop the training when the error rate is at or below e. Timer(d) will halt after the duration d has elapsed.
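A minimal sketch of halting on error rate instead of epoch count (assuming the same NN type and training-data shape as above; Timer(d) would be passed the same way, with d a duration value):

```rust
use nn::{NN, HaltCondition};

fn main() {
    let examples = [
        (vec![0.0, 0.0], vec![0.0]),
        (vec![1.0, 1.0], vec![0.0]),
    ];
    let mut net = NN::new(&[2, 2, 1]);

    // Train until the mean squared error is at or below 0.01.
    net.train(&examples)
        .halt_condition(HaltCondition::MSE(0.01))
        .go();
}
```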
fn learning_mode(&mut self, learning_mode: LearningMode) -> &mut Trainer<'a, 'b>
Specifies what mode to train the network in.
Incremental means update the weights in the network after every example.
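A sketch of selecting the mode explicitly (assuming LearningMode is exported at the crate root alongside NN; the tiny network and data are placeholders):

```rust
use nn::{NN, LearningMode};

fn main() {
    let examples = [(vec![0.0], vec![0.0])];
    let mut net = NN::new(&[1, 1]);

    // Update the weights after every individual training example.
    net.train(&examples)
        .learning_mode(LearningMode::Incremental)
        .go();
}
```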
fn go(&mut self) -> f64
When go is called, the network will begin training based on the options specified. If go does not get called, the network will not be trained!