Sometimes probabilities are useful to communicate directly to your user, and sometimes you can use them to build automated decision-making systems. You could also try to optimize the decision directly, but it is much easier to reason about a decision-making framework when you know the probabilities of the outcomes.

Having the actual probabilities can also be an informative input for downstream use. For example, when communicating a diagnosis to a patient, saying "the model predicted you don't have cancer" is very different from saying "the model predicted you are 34% likely to have cancer." With a default threshold of 50% both cases map to the same class label, but the two statements carry very different information.

When you call `model.predict` you get an array of class probabilities. If the last layer uses softmax, the classes are mutually exclusive. If every neuron in the last layer uses a sigmoid, a sample may carry several labels at once, e.g. both a dog and a cat may be present in an image.
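The difference is easy to see with a small sketch (the logits below are made-up values): softmax normalizes the scores into a single distribution, while sigmoids score each label independently.

```
import numpy as np

logits = np.array([2.0, 1.0, 0.5])  # hypothetical raw outputs of a 3-unit layer

# Softmax: mutually exclusive classes -- the probabilities sum to 1.
softmax = np.exp(logits) / np.exp(logits).sum()

# Sigmoid: independent labels -- each probability stands on its own,
# so several labels can be "on" at the same time.
sigmoid = 1 / (1 + np.exp(-logits))

print(softmax.sum())   # 1.0 (up to floating-point error)
print(sigmoid.sum())   # not constrained to sum to 1
```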

**The `model.predict_classes` method is deprecated** and was removed after 2021-01-01. So if you want class labels (like a dog or a cat) instead of probabilities, how can you get them?

### Predict Class Label from Binary Classification

Suppose we have built a convolutional neural network that classifies an image as either a dog or a cat, trained with labels 0 or 1. When you predict on an image you get the following result.

```
y_pred = model.predict(np.expand_dims(img, axis=0))
# [[0.893292]]
```

These are predicted class probabilities. Since you are doing binary classification, the model ends in a dense layer with a single unit and a sigmoid activation. The sigmoid function outputs a value in the range [0, 1], which corresponds to the probability that the given sample belongs to the positive class (i.e. class one).

To convert these probabilities to class labels you can apply a threshold: everything below 0.5 is labeled 0 (the negative class) and everything above 0.5 is labeled 1 (the positive class). So to find the predicted class you can do the following.

`np.where(y_pred > 0.5, 1, 0)`
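As a small self-contained sketch, using the prediction value from above:

```
import numpy as np

y_pred = np.array([[0.893292]])  # sigmoid output, shape (1, 1)
threshold = 0.5
y_label = np.where(y_pred > threshold, 1, 0)
print(y_label)  # [[1]] -> the positive class (e.g. dog)
```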

### Predict Class from Multi-Class Classification

In multi-class classification the last layer uses a "**softmax**" activation, which returns an array of probability scores (summing to 1), one per class. With 10 classes, each score is the probability that the image belongs to the corresponding class.

```
predictions = model.predict(test_images)
# predictions[0]:
# array([0.08544677, 0.08544677, 0.0854468 , 0.08544677, 0.08544677,
#        0.08546585, 0.08544712, 0.08618792, 0.08544684, 0.23021841],
#       dtype=float32)
```

For multi-class classification, where you want to assign exactly one class out of several possibilities, you can use `argmax`.

```
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
class_names[np.argmax(predictions[0])]
```

To convert class probabilities to class labels, pass them through `argmax`, which returns the index of the highest probability.
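For a whole batch of predictions you would typically take the argmax per row; a minimal sketch with made-up probabilities:

```
import numpy as np

# Two hypothetical softmax outputs over three classes.
predictions = np.array([[0.1, 0.7, 0.2],
                        [0.8, 0.1, 0.1]])

# axis=-1 takes the argmax per sample instead of across the whole batch.
class_indices = np.argmax(predictions, axis=-1)
print(class_indices)  # [1 0]
```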

### Predict Class from Multi-Label Classification

In multi-label classification an example can have multiple output classes at once, so you can use thresholding again.

```
labels = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]
y_pred = [0.566, 0.342, 0.673, 0.122, 0.784, 0.108]
class_labels = [labels[i] for i, prob in enumerate(y_pred) if prob > 0.5]
```

The model's `predict` method outputs a list of six floats, representing the probabilities of the six classes.
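If the predictions come back as a NumPy array, the same thresholding can also be written in a vectorized form, which additionally gives you a multi-hot encoding:

```
import numpy as np

labels = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]
y_pred = np.array([0.566, 0.342, 0.673, 0.122, 0.784, 0.108])

mask = y_pred > 0.5                  # boolean indicator per label
binary = mask.astype(int)            # multi-hot encoding: [1 0 1 0 1 0]
class_labels = [label for label, keep in zip(labels, mask) if keep]
print(class_labels)  # ['toxic', 'obscene', 'insult']
```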