Advantages of ReLU vs Tanh vs Sigmoid activation function in deep neural networks.
Sigmoid and tanh saturate: when the input is very positive or very negative, the neuron's local gradient is close to zero, which can kill the gradient signal during backpropagation. The sigmoid's outputs are also not zero-centered, which leads to inefficient, zig-zagging gradient updates. Finally, both involve an exponential, which is comparatively expensive to compute. ReLU avoids these problems: it does not saturate in the positive regime and costs only a simple max(0, x).
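A minimal NumPy sketch of the saturation point above (function names here are my own): at a large positive input, the sigmoid and tanh gradients nearly vanish, while the ReLU gradient stays at 1.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

def relu_grad(x):
    # ReLU gradient is 1 for positive inputs, 0 otherwise
    return (np.asarray(x) > 0).astype(float)

x = 10.0  # a "very positive" input
print(sigmoid_grad(x))  # ~4.5e-05 — nearly zero, gradient is killed
print(tanh_grad(x))     # ~8.2e-09 — even closer to zero
print(relu_grad(x))     # 1.0 — gradient passes through unchanged
```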
Find Correlation between features and target using the correlation matrix.
You can evaluate the relationship between each feature and the target by computing their correlation and selecting the features with the strongest absolute correlation with the target variable. Correlation-based methods can also be used to remove redundant features that are highly correlated with each other.
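A short pandas sketch of this selection step, using a toy generated dataset (the column names and the 0.4 threshold are illustrative assumptions, not from the original):

```python
import numpy as np
import pandas as pd

# Toy dataset: two informative features and one pure-noise feature.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
df = pd.DataFrame({
    "x1": x1,
    "x2": x2,
    "noise": rng.normal(size=n),
    "target": 3 * x1 - 2 * x2 + 0.1 * rng.normal(size=n),
})

# Correlation of every feature with the target, from the correlation matrix.
corr_with_target = df.corr()["target"].drop("target")

# Keep features whose absolute correlation exceeds a chosen threshold.
selected = corr_with_target[corr_with_target.abs() > 0.4].index.tolist()
print(selected)
```

The threshold is a tuning knob: too high and weakly informative features are dropped, too low and noise slips through.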
Plot two overlay Histograms on single chart with Pandas and Matplotlib.
We’ll work through a real-world example with data loaded from a CSV file and learn how to draw overlapping histograms using Pandas and Matplotlib. Before setting up the chart, let’s look at what the end result will be.
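A minimal sketch of the technique: plot two histograms onto the same Axes and lower `alpha` so the overlap stays visible. The column names and generated data are placeholders for the CSV data described above.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Placeholder for the article's CSV load, e.g. df = pd.read_csv("data.csv")
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "group_a": rng.normal(loc=0.0, scale=1.0, size=1000),
    "group_b": rng.normal(loc=1.0, scale=1.5, size=1000),
})

fig, ax = plt.subplots()
# Plot both histograms on the same Axes; alpha < 1 reveals the overlap.
df["group_a"].plot.hist(ax=ax, bins=30, alpha=0.5, label="group_a")
df["group_b"].plot.hist(ax=ax, bins=30, alpha=0.5, label="group_b")
ax.legend()
ax.set_xlabel("value")
fig.savefig("overlaid_histograms.png")
```

Passing the same `ax` to both `plot.hist` calls is what overlays them; without it, pandas would create a separate figure per call.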
How to copy PyTorch Tensor using clone, detach, and deepcopy?
When we want a copy of a tensor on which operations still propagate gradients back to the original tensor, we must use clone(). We should use detach() when we do not want the copy to be part of the resulting computational graph, and copy.deepcopy() when we need a fully independent copy that shares neither storage nor gradient history with the original.
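The three behaviors above can be sketched as follows (a minimal example, assuming a small leaf tensor with `requires_grad=True`):

```python
import copy
import torch

x = torch.ones(3, requires_grad=True)

# clone(): the copy stays in the autograd graph, so gradients from
# operations on the clone flow back to x.
y = x.clone()
y.sum().backward()
print(x.grad)  # tensor([1., 1., 1.])

# detach(): shares storage with x but is cut out of the graph.
d = x.detach()
print(d.requires_grad)  # False

# copy.deepcopy(): an independent tensor with its own storage;
# modifying it leaves x untouched.
z = copy.deepcopy(x)
with torch.no_grad():
    z[0] = 99.0
print(x[0].item())  # still 1.0
```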