June 16, 2023
Where to Place Dropout, Batch Normalization, and Activation Layers
Pragati

Dropout and Batch Normalization often lead to worse performance when combined in many modern neural networks, but they sometimes cooperate well, as in Wide ResNet (WRN).
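As a rough illustration (a minimal PyTorch sketch under my own assumptions, not code from the original post), a Wide ResNet basic block uses the pre-activation ordering BN → ReLU → Conv and places Dropout between the two convolutions:

```python
import torch
import torch.nn as nn

class WideBasicBlock(nn.Module):
    """Sketch of a WRN-style block: BN -> ReLU -> Conv -> Dropout -> BN -> ReLU -> Conv."""

    def __init__(self, in_channels, out_channels, dropout_rate=0.3, stride=1):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_channels)
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.dropout = nn.Dropout(p=dropout_rate)
        self.bn2 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3,
                               stride=1, padding=1, bias=False)
        # Projection shortcut when the spatial size or channel count changes
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Conv2d(in_channels, out_channels, kernel_size=1,
                                      stride=stride, bias=False)

    def forward(self, x):
        out = self.conv1(torch.relu(self.bn1(x)))
        out = self.dropout(out)  # Dropout sits between the two convolutions
        out = self.conv2(torch.relu(self.bn2(out)))
        return out + self.shortcut(x)
```

The key point of the placement is that Dropout acts on activations that have already been normalized and passed through the nonlinearity, which is one configuration where Dropout and Batch Normalization are reported to work well together.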