The normal distribution is important in statistics and is often used to model real-valued random variables whose underlying distributions are not known. It is described by a probability density function (pdf).

The normal distribution is sometimes informally called a **bell curve** because most of the values cluster around the average, or mean, value. Values toward the upper or lower extremes are rare in any given random sample.

The normal distribution, or Gaussian distribution, is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is:

f(x) = (1 / (σ√(2π))) · e^(−(x − μ)² / (2σ²))

The parameter μ is the mean of the distribution (and also its median and mode), while the parameter σ is its standard deviation. The variance of the distribution is σ².
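As a quick sanity check (not part of the original code), the pdf above can be written out by hand and compared against PyTorch's built-in `torch.distributions.Normal`; the two should agree:

```python
import math
import torch

mu, sigma = 175.0, 6.0  # example parameters used later in this post

def normal_pdf(x, mu, sigma):
    # f(x) = 1 / (sigma * sqrt(2*pi)) * exp(-(x - mu)^2 / (2*sigma^2))
    coeff = 1.0 / (sigma * math.sqrt(2 * math.pi))
    return coeff * torch.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

x = torch.linspace(155.0, 195.0, 9)
manual = normal_pdf(x, mu, sigma)
builtin = torch.distributions.Normal(mu, sigma).log_prob(x).exp()
print(torch.allclose(manual, builtin, atol=1e-6))  # True
```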

The normal distribution describes the behavior of many common random variables, such as the height and weight of people, the marks scored by students on an exam, and the performance ratings of employees. Let's plot a normal distribution for the height of men using PyTorch:

```
import torch
import matplotlib.pyplot as plt
import seaborn as sns

mean, std = 175.0, 6.0

# sample 10,000 points from a normal distribution
data = torch.normal(mean=mean, std=std, size=(10000,))

sns.histplot(data.numpy(), bins=30, kde=True, stat='probability')
plt.xlabel('Height (Men)')
plt.ylabel('Probability')
plt.title("Distribution of Men's Height")
plt.xticks(range(155, 200, 5))
plt.show()
```

We have plotted a normal distribution for the height of men, with data generated using a mean height of 175 cm and a standard deviation of 6 cm. The probability of the random variable falling within one standard deviation of the mean is about 68%.
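The 68% figure is easy to verify empirically on the generated data (a quick sketch using the same parameters as the plot above):

```python
import torch

torch.manual_seed(0)  # fixed seed for reproducibility
mean, std = 175.0, 6.0
data = torch.normal(mean=mean, std=std, size=(100000,))

# fraction of samples within one standard deviation of the mean
within_one_sd = ((data > mean - std) & (data < mean + std)).float().mean()
print(within_one_sd.item())  # roughly 0.68
```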

The average of many samples of a random variable with finite mean and variance is itself a random variable, and its distribution converges to a normal distribution as the number of samples increases (the central limit theorem).
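This convergence can be seen even when the underlying variable is not normal at all. A small sketch: averaging uniform draws produces a variable whose mean and standard deviation match what the central limit theorem predicts:

```python
import torch

torch.manual_seed(0)
# 10,000 experiments, each averaging 50 draws from Uniform[0, 1)
samples = torch.rand(10000, 50)
sample_means = samples.mean(dim=1)

# Uniform[0, 1) has mean 0.5 and variance 1/12, so the sample means
# should be approximately normal with mean 0.5 and std sqrt(1/12 / 50)
print(sample_means.mean().item())  # close to 0.5
print(sample_means.std().item())   # close to (1/12 / 50) ** 0.5 ≈ 0.0408
```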

Mean and standard deviation are the defining parameters of a normal distribution. An increase in the **mean** shifts the curve to the right, and a decrease shifts it to the left.

```
mean, std = 150.0, 6.0

# same plot with a lower mean; the curve shifts to the left
data = torch.normal(mean=mean, std=std, size=(10000,))

sns.histplot(data.numpy(), bins=30, kde=True, stat='probability')
plt.xlabel('Height (Men)')
plt.ylabel('Probability')
plt.title("Distribution of Men's Height")
plt.xticks(range(130, 175, 5))
plt.show()
```

The standard deviation is a measure of the width of the curve. An increase in **standard deviation** expands the curve along the x-axis, making it flatter, while a decrease compresses the curve, making it steeper.

```
mean, std = 175.0, 2.0

# same plot with a smaller standard deviation; the curve is narrower and steeper
data = torch.normal(mean=mean, std=std, size=(10000,))

sns.histplot(data.numpy(), bins=30, kde=True, stat='probability')
plt.xlabel('Height (Men)')
plt.ylabel('Probability')
plt.title("Distribution of Men's Height")
plt.xticks(range(155, 200, 5))
plt.show()
```
