Hello, I just finished going through the first notebook, 01-pytorch-basics, and I was trying some things when I encountered this warning. I am a beginner in PyTorch.
What I did up to this point:
- The next thing I did was to follow the GitHub link in the warning, where I found this suggestion: use `.retain_grad()` if you want the gradient for a non-leaf Tensor, or make sure you access the leaf Tensor if you have a non-leaf Tensor by mistake.
- Then I searched for information about leaf and non-leaf tensors but could not find enough, so it would be great if someone could help me with this as well.
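From what I could piece together so far (please correct me if I'm wrong), a leaf tensor is one you create directly, while anything produced by an operation on a tensor that requires gradients is non-leaf. Here is a small sketch I wrote to check this with the `is_leaf` attribute:

```python
import torch

# A tensor created directly by the user is a leaf tensor.
x = torch.tensor([1.0, 2.0], requires_grad=True)
print(x.is_leaf)  # True

# A tensor produced by an operation on x is a non-leaf tensor.
y = x * 2
print(y.is_leaf)  # False
```

So it seems autograd only populates `.grad` on the leaf tensors by default.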
Steps to reproduce:
/srv/conda/envs/notebook/lib/python3.7/site-packages/torch/tensor.py:746: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won’t be populated during autograd.backward(). If you indeed want the gradient for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations.
warnings.warn("The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad "
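This is not my exact code from the notebook, but here is a minimal sketch that reproduces the same warning for me, plus the `.retain_grad()` fix the warning suggests (the variable names are just my own):

```python
import torch

# Accessing .grad on a non-leaf tensor triggers the UserWarning:
x = torch.tensor([1.0, 2.0], requires_grad=True)
y = x * 2          # non-leaf tensor (result of an operation)
y.sum().backward()
print(x.grad)      # populated: tensor([2., 2.])
print(y.grad)      # None, and accessing it raises the UserWarning

# Fix: call .retain_grad() on the non-leaf tensor before backward()
x2 = torch.tensor([1.0, 2.0], requires_grad=True)
y2 = x2 * 2
y2.retain_grad()   # ask autograd to keep the gradient for this non-leaf tensor
y2.sum().backward()
print(y2.grad)     # tensor([1., 1.])
```

With `.retain_grad()` the warning goes away and `y2.grad` is filled in after `backward()`.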