Why would you ever not have requires_grad=True?

Someone please explain.
Thank you!

1 Like

To save resources. Backpropagation takes a lot of extra computation and memory (autograd has to store intermediate results for the backward pass), and requires_grad is what turns that tracking on. If you only need the forward pass, leaving it off is cheaper.
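A minimal sketch of the difference (assumes PyTorch is installed; the sizes are just for illustration):

```python
import torch

x = torch.randn(1000, 1000, requires_grad=True)
w = torch.randn(1000, 1000, requires_grad=True)

# With requires_grad=True, every op records what it needs for backprop.
y = (x @ w).sum()
print(y.grad_fn)   # e.g. <SumBackward0 ...> -> a graph is being tracked

# Wrapping the same work in torch.no_grad() skips that bookkeeping,
# saving memory and compute when you only need the forward pass.
with torch.no_grad():
    y = (x @ w).sum()
print(y.grad_fn)   # None -> nothing is tracked
```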

1 Like

You set requires_grad=False whenever you don't want to track a tensor's history. Setting it to True makes autograd record every operation applied to the tensor (and which operation produced each result), so backpropagation can walk that history and compute gradients, which an optimizer then uses to update the tensor and make your model learn. That's a bit of an abstract view, but it's a short summary of how backpropagation works. So you won't use it on hyperparameters or plain Python scalars; in PyTorch it's mainly used on the tensors you actually want to train.
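A small sketch of that history-tracking in practice (the tensor values are arbitrary, just for illustration):

```python
import torch

# A leaf tensor we want to learn: requires_grad=True tells autograd
# to record every operation performed on it.
w = torch.tensor([1.0, 2.0], requires_grad=True)
x = torch.tensor([3.0, 4.0])          # plain data: no history needed

loss = ((w * x) ** 2).sum()
print(loss.grad_fn)                   # shows which op produced `loss`

loss.backward()                       # walk the recorded history
print(w.grad)                         # d(loss)/dw, ready for an optimizer
print(x.grad)                         # None: x was never tracked

# An optimizer then uses those gradients to update w.
opt = torch.optim.SGD([w], lr=0.1)
opt.step()
```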

1 Like

Brother, one more thing: how do you plot the graph he used to explain the gradients and all that?
If I could somehow do that, my learning rate would drastically increase :wink:

To perform effective transfer learning: you set requires_grad=False on the pretrained layers to freeze them, so gradients are only computed for the newly added layers and only those get updated. See: Transfer Learning.
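A hedged sketch of that freezing pattern (assumes torchvision is available; ResNet-18 and the 10-class head are just illustrative choices, and the exact `weights` argument depends on your torchvision version):

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a pretrained backbone.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained layers: no gradients are computed or stored
# for them, so only the new head below will be trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer for the new task (e.g. 10 classes).
# Newly created parameters have requires_grad=True by default.
model.fc = nn.Linear(model.fc.in_features, 10)

# Pass only the trainable parameters to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)
```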