`item_tfms` is applied to each individual image before it is copied to the GPU. It ensures three things: all images are resized to the same size; on the training set, the crop area is chosen randomly; and on the validation set, the center square of the image is chosen.
`batch_tfms` is applied to a whole batch at once on the GPU.
`DataBlock` is a generic container for quickly building `Datasets` and `DataLoaders`. To build one, you tell it how to get the items (e.g. with `get_image_files`), how to split them into training and validation sets, how to get the labels, and any transformations to be applied.
Cross-Entropy loss applies the negative log to probabilities. In simple terms, Cross-Entropy loss is a combination of the softmax function and negative log likelihood: take the softmax of the model's activations to get probabilities, then take the negative log of the probability assigned to the correct class.
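The two steps can be written out in plain Python. This is a sketch for intuition (the activation values and class index are made up); in practice fastai/PyTorch compute this with `nn.CrossEntropyLoss` in a numerically safer way.

```python
import math

def softmax(activations):
    # exponentiate each activation, then normalize so the results sum to 1
    exps = [math.exp(a) for a in activations]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(activations, target_idx):
    # negative log of the probability the model assigned to the correct class
    probs = softmax(activations)
    return -math.log(probs[target_idx])

acts = [0.5, 2.0, -1.0]                    # hypothetical raw outputs for 3 classes
loss = cross_entropy(acts, target_idx=1)   # suppose class 1 is the correct label
print(loss)
```

Note that the loss is small when the correct class gets high probability and grows rapidly as that probability approaches zero.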
Exponential function (exp) is defined as e**x, where e is a special number approximately equal to 2.718. It is the inverse of the natural logarithm function. Note that exp is always positive, and it increases very rapidly!
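These properties are easy to check with Python's standard `math` module:

```python
import math

print(math.exp(1))              # e itself, ≈ 2.718
print(math.log(math.exp(5.0)))  # 5.0: the natural log undoes exp
print(math.exp(-50))            # tiny, but still strictly positive
print(math.exp(10))             # ≈ 22026: exp grows very rapidly
```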