
Utilities for Transformer


import torch


Subsequent mask to mask out data from future (subsequent) time steps

def subsequent_mask(seq_len):


    mask = torch.tril(torch.ones(seq_len, seq_len)).to(torch.bool).unsqueeze(-1)
    return mask
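A minimal sketch of how such a mask is typically used in decoder self-attention (the exact attention module is not part of this file): positions where the mask is `False` are filled with `-inf` before the softmax, so each query can only attend to itself and earlier positions. The `[query, key, batch]` score layout below is an assumption for illustration; the trailing unit dimension of the mask broadcasts over the batch axis.

```python
import torch


def subsequent_mask(seq_len):
    # Lower-triangular boolean mask; True means the position may be attended to.
    # The trailing unit dimension lets the mask broadcast over a batch axis.
    return torch.tril(torch.ones(seq_len, seq_len)).to(torch.bool).unsqueeze(-1)


# Toy attention scores with assumed shape [query, key, batch], batch of 1
scores = torch.randn(4, 4, 1)
mask = subsequent_mask(4)  # shape [4, 4, 1]

# Block attention to future (subsequent) positions, then normalize over keys
scores = scores.masked_fill(~mask, float('-inf'))
attn = scores.softmax(dim=1)

print(attn[:, :, 0])  # each row sums to 1; entries above the diagonal are 0
```

The first query row attends only to the first key, the second to the first two keys, and so on, which is exactly the causal structure a decoder needs at training time.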


def _subsequent_mask():
    from labml.logger import inspect
    inspect(subsequent_mask(10)[:, :, 0])


if __name__ == '__main__':
    _subsequent_mask()
