docs/normalization/instance_norm/experiment.html
This demonstrates using an instance normalization layer in a convolutional neural network for classification. Note that instance normalization was designed for style transfer, and this is only a demo.
import torch.nn as nn

from labml import experiment
from labml.configs import option
from labml_nn.experiments.cifar10 import CIFAR10Configs, CIFAR10VGGModel
from labml_nn.normalization.instance_norm import InstanceNorm
This derives from the generic VGG style architecture.
class Model(CIFAR10VGGModel):
    def conv_block(self, in_channels, out_channels) -> nn.Module:
        return nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            InstanceNorm(out_channels),
            nn.ReLU(inplace=True),
        )

    def __init__(self):
        super().__init__([[64, 64], [128, 128], [256, 256, 256], [512, 512, 512], [512, 512, 512]])
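To illustrate what the `InstanceNorm` layer inside each convolution block computes, here is a minimal NumPy sketch of instance normalization (a standalone illustration, not code from this experiment): each (sample, channel) slice of a `[N, C, H, W]` tensor is normalized with its own spatial mean and variance.

```python
import numpy as np

# Instance normalization: for input of shape [N, C, H, W], normalize each
# (sample, channel) slice independently, using the mean and variance
# computed over the spatial dimensions (H, W) only.
def instance_norm(x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 3, 8, 8))
y = instance_norm(x)
# Each (sample, channel) slice now has ~zero mean and ~unit variance
print(abs(y[0, 0].mean()) < 1e-6, abs(y[0, 0].std() - 1.0) < 1e-3)
```

Unlike batch normalization, the statistics here never mix information across samples, which is why instance normalization behaves identically at training and inference time.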
@option(CIFAR10Configs.model)
def _model(c: CIFAR10Configs):
    return Model().to(c.device)
def main():
Create experiment
    experiment.create(name='cifar10', comment='instance norm')
Create configurations
    conf = CIFAR10Configs()
Load configurations
    experiment.configs(conf, {
        'optimizer.optimizer': 'Adam',
        'optimizer.learning_rate': 2.5e-4,
    })
Start the experiment and run the training loop
    with experiment.start():
        conf.run()
if __name__ == '__main__':
    main()