In order to augment the dataset, we apply various transformation techniques, including cropping, resizing, rotation, translation, flipping, and so on. In PyTorch, these operations are defined in the ‘torchvision.transforms’ package, and we can choose whichever transformations we need.
The transforms defined in figure 1 (Resize, RandomHorizontalFlip, and Normalize) are applied to the original dataset at every batch generation. This means that the original dataset in the local directory remains unchanged, but we get freshly transformed data for the batch in every training step.
In my experience, the ImageFolder class supports a powerful feature for composing the batch dataset. In most cases, when we build a batch dataset, arranging each input and its corresponding label in pairs is done manually. With ImageFolder, however, this can be done much more easily when the dataset is composed of images.
For example, if the directory is organized as shown in figure 2, we only need to initialize an instance of the ImageFolder class with the root directory. There is then no need to do the above-mentioned dataset-building (input-label pairing) work yourself: the labels are inferred from the subdirectory names, and each sample is returned, transformed, together with its corresponding label.
The DataLoader class of the ‘torch.utils.data’ package is what actually returns the batches, given the transformations and data directory that we set up with the Transform and ImageFolder classes above.
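A minimal sketch of this batching step; a TensorDataset of random tensors stands in for the ImageFolder dataset, and the batch size of 4 is an illustrative choice:

```python
# DataLoader batching sketch; the dataset is a stand-in for ImageFolder.
import torch
from torch.utils.data import DataLoader, TensorDataset

# 10 fake 3x64x64 "images" with integer labels 0..9.
dataset = TensorDataset(torch.randn(10, 3, 64, 64), torch.arange(10))
loader = DataLoader(dataset, batch_size=4, shuffle=True)

for images, labels in loader:
    print(images.shape)  # batches of up to 4 samples each
```

With 10 samples and a batch size of 4, the loader yields three batches per epoch (two of size 4 and a final one of size 2), re-shuffling the order every epoch.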
- num_workers: PyTorch provides a straightforward way to perform multi-process data loading: simply set the ‘num_workers’ argument to a positive integer. In my case, however, I also had to add the line ‘torch.multiprocessing.freeze_support()’ at the very beginning of my main function; otherwise, I only got a runtime error.
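The entry-point pattern described above can be sketched as follows; calling freeze_support() is only strictly needed on platforms that spawn worker processes (e.g. Windows), but it is harmless elsewhere, and the num_workers value here is illustrative:

```python
# Entry-point guard for multi-process data loading.
import torch
import torch.multiprocessing
from torch.utils.data import DataLoader, TensorDataset

def main():
    # Must run first when worker processes are spawned (e.g. on Windows).
    torch.multiprocessing.freeze_support()
    dataset = TensorDataset(torch.randn(8, 3), torch.arange(8))
    # num_workers > 0 loads batches in parallel worker processes.
    loader = DataLoader(dataset, batch_size=4, num_workers=2)
    return sum(x.shape[0] for x, y in loader)

if __name__ == "__main__":
    print(main())  # 8 samples consumed across the batches
```

The ‘if __name__ == "__main__"’ guard matters for the same reason: without it, each spawned worker would re-execute the module's top-level code and try to create its own DataLoader.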
- The way to get one batch from the DataLoader
Since the DataLoader is an iterable, looping over it yields every batch in the epoch, not a single one. In order to get only one batch per call, we can use the following code.
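A minimal sketch of this one-batch pattern, again with a stand-in dataset:

```python
# Pull exactly one batch by wrapping the DataLoader in an iterator.
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(12, 3), torch.arange(12))
loader = DataLoader(dataset, batch_size=4, shuffle=True)

images, labels = next(iter(loader))  # exactly one batch
print(images.shape)  # torch.Size([4, 3])
```

Note that every call to iter(loader) restarts from the beginning of a freshly shuffled epoch; to step through an epoch one batch at a time, keep a single persistent iterator and call next() on it repeatedly.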
Any corrections, suggestions, and comments are welcome.
Contents of this article are based on Bishop and Goodfellow.