Utilities
[View code on Github](https://github.com/labmlai/annotated_deep_learning_paper_implementations/tree/master/labml_nn/utils/__init__.py)
```python
import copy

from torch import nn
from torch.utils.data import Dataset, IterableDataset
```
Make an `nn.ModuleList` with `n` clones of a given module
```python
def clone_module_list(module: nn.Module, n: int) -> nn.ModuleList:
    return nn.ModuleList([copy.deepcopy(module) for _ in range(n)])
```
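For illustration, a minimal sketch of how this helper might be used, e.g. to build the stacked layers of a transformer; the layer sizes here are arbitrary, and the function is restated so the snippet is self-contained (assumes PyTorch is installed):

```python
import copy
from torch import nn

def clone_module_list(module: nn.Module, n: int) -> nn.ModuleList:
    # Deep-copy the module `n` times so each clone has independent parameters
    return nn.ModuleList([copy.deepcopy(module) for _ in range(n)])

# Three independent copies of the same linear layer
layers = clone_module_list(nn.Linear(4, 4), 3)
print(len(layers))  # 3
```

Because `copy.deepcopy` is used rather than reusing the same object, updating one clone's weights does not affect the others.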
An infinite generator that restarts the data loader after each epoch
```python
def cycle_dataloader(data_loader):
    while True:
        for batch in data_loader:
            yield batch
```
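Since the function only iterates over whatever it is given, a plain list can stand in for a `DataLoader` in a quick sketch of its behavior:

```python
def cycle_dataloader(data_loader):
    # Re-iterate the loader forever; each pass over it is one epoch
    while True:
        for batch in data_loader:
            yield batch

it = cycle_dataloader([1, 2, 3])  # any iterable stands in for a DataLoader here
print([next(it) for _ in range(7)])  # [1, 2, 3, 1, 2, 3, 1]
```

This is useful for training loops that count steps rather than epochs: the caller simply keeps calling `next` without worrying about epoch boundaries.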
This wraps an `IterableDataset` as a map-style dataset so that the data can be shuffled.
It only works when the dataset is small enough to be held in memory.
```python
class MapStyleDataset(Dataset):
```
```python
    def __init__(self, dataset: IterableDataset):
```
Load the data into memory
```python
        self.data = [d for d in dataset]
```
Get a sample by index
```python
    def __getitem__(self, idx: int):
        return self.data[idx]
```
Create an iterator
```python
    def __iter__(self):
        return iter(self.data)
```
Size of the dataset
```python
    def __len__(self):
        return len(self.data)
```