Category Archives: PyTorch
Difference Between Contiguous and Non-Contiguous Array
With contiguous memory allocation, all of an array's elements sit in one unbroken block of memory, stored in the same order as its row-major indexing. With non-contiguous allocation, the same data may be spread across different memory sections at multiple positions, or traversed out of storage order via strides, so elements that are logically adjacent are not physically adjacent in memory.
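A minimal sketch of how this distinction shows up in PyTorch: a freshly created tensor is contiguous, a transpose shares the same storage but changes only the strides (making it non-contiguous), and `.contiguous()` copies the data back into row-major order.

```python
import torch

x = torch.arange(6).reshape(2, 3)   # freshly created tensor: contiguous
print(x.is_contiguous())            # True

y = x.t()                           # transpose shares storage, only strides change
print(y.is_contiguous())            # False

z = y.contiguous()                  # copies the data into row-major order
print(z.is_contiguous())            # True
print(y.stride(), z.stride())       # (1, 3) vs (2, 1): same values, new layout
```

Note that `y` and `z` hold the same values; only the underlying memory layout differs, which is why some ops (e.g. `.view()`) require calling `.contiguous()` first.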
one of the variables needed for gradient computation has been modified by an inplace operation
If you call .backward() after such a modification, you will get this error. The bug is harder to spot because autograd does not catch the in-place operation at the moment it happens: modifying a tensor that is part of the forward computation graph raises no error during the forward pass. The problem is only detected during the backward pass, so the line that fails is not the line that caused it.
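A small sketch that reproduces the error: `exp()` saves its output for the backward pass, so modifying that output in place passes the forward without complaint but fails at `.backward()`.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.exp()              # exp() saves its output y for the backward pass
y.add_(1)                # in-place op on y: the forward pass raises no error

err = None
try:
    y.sum().backward()   # autograd's version check fails only here
except RuntimeError as e:
    err = e
print(err)               # "... has been modified by an inplace operation ..."
```

Autograd implements this check with a version counter on each tensor: every in-place op bumps the counter, and backward compares the current version against the version recorded when the tensor was saved.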