
# Keras BatchSizeAnnealing

This repository contains a wrapper class that adjusts the `batch_size` after each epoch, as described in the paper *Don't Decay the Learning Rate, Increase the Batch Size* by Samuel L. Smith, Pieter-Jan Kindermans, Chris Ying, and Quoc V. Le.

## Train Example

A minimal working example:

```python
from BatchSizeAnnealing import BatchSizeAnnealing
from keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

def callback(epoch):
    # Grow the batch size linearly with the epoch; return an int,
    # since Keras expects an integer batch size.
    return 32 * epoch // 10 + 32

model = createModel()  # createModel() builds and compiles your Keras model
trainer = BatchSizeAnnealing(model, callback)
history = trainer.train(x_train, y_train,
                        validation_data=(x_test, y_test),
                        epochs=EPOCH, verbose=1)  # EPOCH: total number of epochs
```

The constructor takes as arguments the model and the callback that returns the batch size for a given epoch.
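The paper's recipe is to increase the batch size at the points where one would otherwise decay the learning rate. A callback implementing such a step schedule might look like the sketch below (the factory name and its default values are illustrative, not part of this library):

```python
def make_batch_size_schedule(base_batch_size=32, factor=5, step_epochs=10):
    """Return a callback that multiplies the batch size by `factor`
    every `step_epochs` epochs, mirroring a step learning-rate decay."""
    def callback(epoch):
        return base_batch_size * factor ** (epoch // step_epochs)
    return callback
```

The returned function can be passed directly as the `callback` argument of `BatchSizeAnnealing`.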

## Wrapper Example

This class can also be used as a wrapper for `keras.Model`: any method not defined on the wrapper is redirected to the model passed as a parameter, e.g.:

```python
from BatchSizeAnnealing import BatchSizeAnnealing
from keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

def callback(epoch):
    # Grow the batch size linearly with the epoch; return an int.
    return 32 * epoch // 10 + 32

model = createModel()  # createModel() builds and compiles your Keras model
trainer = BatchSizeAnnealing(model, callback)
trainer.summary()  # redirected to model.summary()
```

This last call is equivalent to `model.summary()`.
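This kind of redirection is typically achieved with Python's `__getattr__` hook. A minimal sketch of the pattern (an illustration, not this library's actual source):

```python
class DelegatingWrapper:
    """Forward any attribute not defined on the wrapper to the wrapped model."""

    def __init__(self, model):
        self._model = model

    def __getattr__(self, name):
        # __getattr__ is only invoked when normal attribute lookup fails,
        # so methods defined on the wrapper itself (e.g. train) take
        # precedence over the wrapped model's methods.
        return getattr(self._model, name)
```

With this pattern, calls such as `wrapper.summary()` or `wrapper.predict(...)` transparently reach the underlying model.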

## Constructor parameters

```python
class BatchSizeAnnealing(object):
    def __init__(self, model, callback, show_hist=False, keep_verbosity=False):
        ...
```

- `model`: the model to train (can be built with the functional API).
- `callback`: callback that returns the batch size for a given epoch:

  ```python
  def callback(epoch):
      return ...
  ```

- `show_hist`: show a progress bar while training.
- `keep_verbosity`: keep the `verbose` parameter passed to `train`.

## TODO

- [x] Add verbose output for training every batch.
- [ ] Fix per-epoch verbosity from `keras.Model`.
- [ ] Add parameter and return types.