This is an analysis of different data augmentation techniques from Torchvision, evaluated on CIFAR10. You can find the accompanying article on my Medium page.
- Plain - only the `Normalize()` operation.
- Baseline - `RandomHorizontalFlip()`, `RandomCrop()`, `RandomErasing()`.
- AutoAugment - the `AutoAugment` policy for CIFAR10, applied on top of the Baseline configuration (see the sketch after this list).
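
For reference, a minimal sketch of how these three pipelines could be composed with `torchvision.transforms`; the operation order, padding, and normalization statistics here are assumptions, not necessarily what the repo's `GetAugment()` builds:

```python
import torchvision.transforms as T

# Commonly used CIFAR10 channel statistics (assumed values; the repo may differ)
MEAN, STD = (0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)

# Plain: normalization only
plain = T.Compose([T.ToTensor(), T.Normalize(MEAN, STD)])

# Baseline: flip + padded crop + random erasing
baseline = T.Compose([
    T.RandomHorizontalFlip(),        # flips with p=0.5 by default
    T.RandomCrop(32, padding=4),     # pad by 4 pixels, crop back to 32x32
    T.ToTensor(),
    T.Normalize(MEAN, STD),
    T.RandomErasing(),               # operates on tensors, hence placed last
])

# AutoAugment: the learned CIFAR10 policy on top of the Baseline operations
autoaugment = T.Compose([
    T.RandomHorizontalFlip(),
    T.RandomCrop(32, padding=4),
    T.AutoAugment(T.AutoAugmentPolicy.CIFAR10),
    T.ToTensor(),
    T.Normalize(MEAN, STD),
    T.RandomErasing(),
])
```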
augmentations.py - returns the three augmentation configurations:

```python
from augmentations import GetAugment

plain, baseline, autoaugment = GetAugment()
```
cifar10.py - loads CIFAR10 and returns the train/validation/test loaders:

```python
from cifar10 import LoadDataset

# batch size, normalization statistics, and the augmentation pipeline to apply
trainloader, valloader, testloader = LoadDataset(batch, normalization, augmentations)
```
resnet.py - implements the ResNet architecture:

```python
from resnet import ResNet

# CIFAR-style ResNet depth is 6n + 2, so n = 3 gives ResNet-20
n = 3
resnet20 = ResNet(n)
```
training_functions.py - wraps a model with the training utilities:

```python
from training_functions import Network

network = Network(model=ResNet(3), learning_rate=0.01, device="cuda")
network.train_step(trainloader)  # one training pass over trainloader
```
plots.py - plots the training curves:

```python
from plots import plot

plot([model1_train_loss, model1_val_loss, model2_train_loss, model2_val_loss], "Loss")
```
train.py - combines all of the files above and trains the three configurations end to end; a rough sketch of what this amounts to is shown below.
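
A minimal sketch of that script, assuming the interfaces shown above; the batch size, normalization statistics, epoch count, the `val_step` method, and the losses returned by the step functions are all assumptions rather than confirmed details of this repo:

```python
from augmentations import GetAugment
from cifar10 import LoadDataset
from resnet import ResNet
from training_functions import Network
from plots import plot

batch = 128                                   # assumed batch size
normalization = ((0.4914, 0.4822, 0.4465),
                 (0.2470, 0.2435, 0.2616))    # assumed CIFAR10 statistics

curves = []
for augmentation in GetAugment():             # plain, baseline, autoaugment
    trainloader, valloader, testloader = LoadDataset(batch, normalization, augmentation)
    network = Network(model=ResNet(3), learning_rate=0.01, device="cuda")
    train_loss, val_loss = [], []
    for epoch in range(50):                   # illustrative epoch count
        train_loss.append(network.train_step(trainloader))  # assumes a loss is returned
        val_loss.append(network.val_step(valloader))        # hypothetical method
    curves += [train_loss, val_loss]

plot(curves, "Loss")
```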