January 21, 2023

## Activation function for Output Layer in Regression, Binary, Multi-Class, and Multi-Label Classification

The ReLU activation function is a default choice for the hidden layers. For the output layer, in general, you will want the logistic (sigmoid) activation function for binary classification, the softmax activation function for multiclass classification, and no activation function for regression. For multi-label classification, where each example can belong to several classes at once, apply the sigmoid independently to each output unit.
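As a minimal sketch of these four cases, the snippet below (using NumPy; the functions and logit values are illustrative, not from any particular framework) applies each output activation to a vector of raw logits:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: squashes each logit into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Shift by the max for numerical stability, then normalize
    # so the outputs form a probability distribution summing to 1.
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, -1.0, 0.5])  # example raw outputs of the last layer

binary_prob = sigmoid(logits[0])   # binary classification: one sigmoid output
class_probs = softmax(logits)      # multiclass: probabilities sum to 1
label_probs = sigmoid(logits)      # multi-label: independent sigmoid per label
regression_out = logits            # regression: identity (no activation)

print(binary_prob, class_probs, label_probs, regression_out)
```

Note that the multiclass probabilities are mutually exclusive (they sum to 1), while the multi-label probabilities are independent and need not sum to 1.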