
Experiment

labml_nn/transformers/switch/experiment.ipynb


Switch Transformer

This is an experiment that trains a small Switch Transformer on the Tiny Shakespeare dataset.

Install the labml-nn package

!pip install labml-nn

Imports

from labml import experiment
from labml_nn.transformers.switch.experiment import Configs

Create an experiment

experiment.create(name="switch_transformer")

Initialize configurations

conf = Configs()

Set the experiment configurations, passing a dictionary of values that override the defaults

experiment.configs(conf,
                   # A dictionary of configurations to override
                   {'tokenizer': 'character',
                    'text': 'tiny_shakespeare',
                    'optimizer.learning_rate': 1.,
                    'optimizer.optimizer': 'Noam',
                    'prompt': 'It is',
                    'prompt_separator': '',

                    'transformer': 'switch_transformer',
                    'is_scale_prob': False,
                    'n_experts': 4,

                    'drop_tokens': True,
                    'capacity_factor': 1.2,

                    'train_loader': 'shuffled_train_loader',
                    'valid_loader': 'shuffled_valid_loader',

                    'seq_len': 64,
                    'epochs': 128,
                    'batch_size': 32,
                    'inner_iterations': 25,
                    })
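Several of the options above (`n_experts`, `capacity_factor`, `drop_tokens`, `is_scale_prob`) control the switch routing layer. As a rough guide to what they do, here is a minimal, simplified sketch of top-1 (switch) routing; the function and parameter names are illustrative, not the labml-nn API, and the expert feed-forward networks are replaced by an identity for brevity:

```python
import torch
import torch.nn.functional as F

def switch_route(x, gate_w, n_experts=4, capacity_factor=1.2,
                 drop_tokens=True, is_scale_prob=False):
    """Simplified top-1 (switch) routing.

    x:      (n_tokens, d_model) flattened token representations
    gate_w: (d_model, n_experts) router weights (hypothetical)
    """
    # Router probabilities and the single best expert per token
    probs = F.softmax(x @ gate_w, dim=-1)      # (n_tokens, n_experts)
    max_prob, expert_idx = probs.max(dim=-1)   # top-1 choice per token

    # Each expert processes at most `capacity` tokens
    n_tokens = x.shape[0]
    capacity = int(capacity_factor * n_tokens / n_experts)

    routed = torch.zeros_like(x)
    for e in range(n_experts):
        token_ids = torch.nonzero(expert_idx == e).squeeze(-1)
        if drop_tokens and len(token_ids) > capacity:
            # Tokens beyond the expert's capacity are dropped
            token_ids = token_ids[:capacity]
        # A real model applies expert e's FFN here; identity stands in for it
        expert_out = x[token_ids]
        if is_scale_prob:
            # Optionally scale outputs by the router probability
            expert_out = expert_out * max_prob[token_ids].unsqueeze(-1)
        routed[token_ids] = expert_out
    return routed
```

With `capacity_factor: 1.2` each of the 4 experts accepts up to 20% more than an even share of the tokens, and `drop_tokens: True` discards the overflow.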

Register the PyTorch model so the experiment can save and load checkpoints

experiment.add_pytorch_models({'model': conf.model})

Start the experiment and run the training loop.

# Start the experiment
with experiment.start():
    conf.run()
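Note that `'optimizer.learning_rate': 1.` is not the actual step size: with the Noam optimizer it acts as a multiplier on the warmup/decay schedule from "Attention Is All You Need". A minimal sketch of that schedule (the `d_model` and `warmup` values here are illustrative defaults, not this experiment's settings):

```python
def noam_lr(step, d_model=512, warmup=4000, factor=1.0):
    """Noam learning-rate schedule: linear warmup for `warmup` steps,
    then inverse square-root decay. `factor` is the configured learning
    rate (1.0 above), so the schedule alone sets the effective rate."""
    step = max(step, 1)  # avoid division by zero at step 0
    return factor * d_model ** -0.5 * min(step ** -0.5,
                                          step * warmup ** -1.5)
```

The rate peaks at `step == warmup` and decays afterwards, which is why a nominal learning rate of 1.0 is safe here.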