
Optimizer

We could support several optimizers, such as Adagrad, Adadelta, RMSProp, and Adam. All of them should be resumable and, possibly, each should provide its own command-line argument parser.
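A minimal sketch of what such a resumable optimizer interface could look like; the class name and the `get_state`/`set_state`/`add_arguments` hooks are assumptions made for illustration, not an existing API.

```python
import argparse
from collections import OrderedDict


class Optimizer(object):
    """Base class for optimizers such as Adagrad, Adadelta, RMSProp, Adam."""

    def __init__(self, lr=0.01):
        self.lr = lr

    def get_updates(self, gradients):
        """Map each parameter to its updated value, given its gradient."""
        raise NotImplementedError

    # Resumability: expose internal state (e.g. Adagrad's accumulated
    # squared gradients) so training can be checkpointed and restarted.
    def get_state(self):
        return OrderedDict()

    def set_state(self, state):
        pass

    # Optional per-optimizer command-line arguments.
    @classmethod
    def add_arguments(cls, parser):
        parser.add_argument('--lr', type=float, default=0.01,
                            help='learning rate')
```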

Task

StoppingCriterion

Trainer

  • add_task(task, epoch_freq=1, iteration_freq=None)
  • add_stopping_criterion(stopping_criterion)
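A minimal usage sketch built on the two methods above; the `Trainer` internals and the `PrintLoss`/`MaxEpochs` stub classes are hypothetical, invented for illustration.

```python
class Trainer(object):
    def __init__(self):
        self.epoch = 0
        self.tasks = []               # (task, epoch_freq, iteration_freq)
        self.stopping_criteria = []

    def add_task(self, task, epoch_freq=1, iteration_freq=None):
        # Run `task` every `epoch_freq` epochs, or every
        # `iteration_freq` iterations if that is given instead.
        self.tasks.append((task, epoch_freq, iteration_freq))

    def add_stopping_criterion(self, stopping_criterion):
        # Training stops as soon as any criterion says so.
        self.stopping_criteria.append(stopping_criterion)


class PrintLoss(object):              # example Task
    def execute(self, trainer):
        print("epoch %d: loss ..." % trainer.epoch)


class MaxEpochs(object):              # example StoppingCriterion
    def __init__(self, nb_epochs):
        self.nb_epochs = nb_epochs

    def should_stop(self, trainer):
        return trainer.epoch >= self.nb_epochs


trainer = Trainer()
trainer.add_task(PrintLoss(), epoch_freq=1)
trainer.add_stopping_criterion(MaxEpochs(100))
```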

Model's API

parameters

Property of a model instance, giving access to the model's parameters.

get_gradients()

Returns an ordered dictionary of either Theano variables or NumPy arrays. Open question: what should we use as dictionary keys, Theano variables or strings?

update(new_params)

Updates the model's parameters using the ordered dictionary received as argument.
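A minimal sketch of this API on a toy linear model, assuming Theano shared variables as parameters and Theano variables as dictionary keys (one of the two options raised above); the `LinearModel` class and the `loss` argument to `get_gradients` are assumptions for illustration.

```python
from collections import OrderedDict

import numpy as np
import theano
import theano.tensor as T


class LinearModel(object):
    def __init__(self, input_size):
        w = np.zeros(input_size, dtype=theano.config.floatX)
        self.W = theano.shared(w, name='W')

    @property
    def parameters(self):
        return [self.W]

    def get_gradients(self, loss):
        # Ordered dict keyed by the Theano variables themselves
        # (strings being the alternative raised above).
        grads = T.grad(loss, self.parameters)
        return OrderedDict(zip(self.parameters, grads))

    def update(self, new_params):
        # new_params: ordered dict mapping shared variables to NumPy arrays.
        for param, value in new_params.items():
            param.set_value(value)
```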
