
Train a ResNet on CIFAR 10


from typing import List, Optional

from torch import nn

from labml import experiment
from labml.configs import option
from labml_nn.experiments.cifar10 import CIFAR10Configs
from labml_nn.resnet import ResNetBase


Configurations

We use CIFAR10Configs, which defines all the dataset-related configurations, the optimizer, and a training loop.

class Configs(CIFAR10Configs):


Number of blocks for each feature map size

n_blocks: List[int] = [3, 3, 3]
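The per-stage block counts set the overall depth. With basic two-convolution residual blocks, CIFAR-style ResNets follow the 6n+2 pattern from the original paper. A quick sanity check, counting only weighted layers (an illustrative helper, not part of labml_nn):

```python
def resnet_depth(n_blocks):
    # initial convolution + two convolutions per basic block + final linear layer
    return 1 + 2 * sum(n_blocks) + 1

print(resnet_depth([3, 3, 3]))  # 20 -> ResNet-20
print(resnet_depth([6, 6, 6]))  # 38 -> ResNet-38
```

With bottleneck blocks (three convolutions each), the same counts give deeper networks, e.g. 1 + 3 × 18 + 1 = 56 for [6, 6, 6].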


Number of channels for each feature map size

n_channels: List[int] = [16, 32, 64]


Bottleneck sizes

bottlenecks: Optional[List[int]] = None
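When bottlenecks is set, each residual block uses a 1×1 → 3×3 → 1×1 convolution sequence that shrinks the channel count in the middle, cutting parameters. A rough count (biases and batch norm ignored; the channel sizes here are illustrative, not taken from this experiment):

```python
def basic_block_params(c):
    # two 3x3 convolutions, c -> c channels each
    return 2 * (3 * 3 * c * c)

def bottleneck_block_params(c, b):
    # 1x1 reduce (c -> b), 3x3 convolution (b -> b), 1x1 expand (b -> c)
    return c * b + 3 * 3 * b * b + b * c

print(basic_block_params(64))           # 73728
print(bottleneck_block_params(64, 16))  # 4352
```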


Kernel size of the initial convolution layer

first_kernel_size: int = 3


Create model

@option(Configs.model)
def _resnet(c: Configs):


The ResNet base network

base = ResNetBase(c.n_blocks, c.n_channels, c.bottlenecks,
                  img_channels=3, first_kernel_size=c.first_kernel_size)


Linear layer for classification

classification = nn.Linear(c.n_channels[-1], 10)


Stack them

model = nn.Sequential(base, classification)


Move the model to the device

return model.to(c.device)
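The linear layer takes c.n_channels[-1] features because the base network ends by averaging each final feature map to a single number (global average pooling). A sketch of the spatial shapes, assuming the standard CIFAR layout with stride-2 downsampling between stages (an assumption for illustration, not read from the labml_nn source):

```python
size = 32  # CIFAR-10 images are 32x32
n_channels = [16, 32, 64]
shapes = []
for i, c in enumerate(n_channels):
    if i > 0:
        size //= 2  # assumed stride-2 convolution at each stage transition
    shapes.append((c, size, size))
print(shapes)  # [(16, 32, 32), (32, 16, 16), (64, 8, 8)]
# global average pooling: (64, 8, 8) -> 64 features for nn.Linear(64, 10)
```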


def main():


Create experiment

experiment.create(name='resnet', comment='cifar10')


Create configurations

conf = Configs()


Load configurations

experiment.configs(conf, {
    'bottlenecks': [8, 16, 16],
    'n_blocks': [6, 6, 6],

    'optimizer.optimizer': 'Adam',
    'optimizer.learning_rate': 2.5e-4,

    'epochs': 500,
    'train_batch_size': 256,

    'train_dataset': 'cifar10_train_augmented',
    'valid_dataset': 'cifar10_valid_no_augment',
})
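These settings imply a fairly long run: with a batch size of 256 over CIFAR-10's 50,000 training images, each epoch takes about 196 optimizer steps, so 500 epochs is roughly 98,000 steps. A quick check:

```python
import math

train_images = 50_000  # CIFAR-10 training set size
batch_size = 256
epochs = 500

steps_per_epoch = math.ceil(train_images / batch_size)
print(steps_per_epoch)           # 196
print(steps_per_epoch * epochs)  # 98000
```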


Set model for saving/loading

experiment.add_pytorch_models({'model': conf.model})


Start the experiment and run the training loop

with experiment.start():
    conf.run()


if __name__ == '__main__':
    main()
