Do we need to detach loss and acc?

    def validation_step(self, batch):
        images, labels = batch
        out = self(images)                    # Generate predictions
        loss = F.cross_entropy(out, labels)   # Calculate loss
        acc = accuracy(out, labels)           # Calculate accuracy
        return {'val_loss': loss.detach(), 'val_acc': acc.detach()}

Hi @mohdaqidkhat98, I may be wrong on this, but I think we only need to detach the loss. We call .detach() to tell PyTorch that we no longer need gradients for a tensor, so it should stop tracking the operations that produced it. Even though we are in the validation step and never call backward(), the loss returned by F.cross_entropy is still attached to the computation graph, so detaching it before storing it lets that graph be freed instead of kept alive for every batch. The accuracy, on the other hand, is computed with non-differentiable ops (argmax and a comparison), so no graph is built for it and there is nothing to detach. For further information I'll share this thread from the PyTorch forum: https://discuss.pytorch.org/t/how-does-detach-work/2308
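
Here is a minimal sketch of what I mean — note I'm assuming the accuracy helper is defined the way it is in the course notebook, so treat that part as an assumption on my side:

    import torch
    import torch.nn.functional as F

    def accuracy(out, labels):
        # Predicted class = index of the highest logit
        _, preds = torch.max(out, dim=1)
        # == and torch.sum are non-differentiable, and .item() returns a
        # plain Python number, so the result carries no computation graph
        return torch.tensor(torch.sum(preds == labels).item() / len(preds))

    # Fake batch: 4 samples, 3 classes (stand-in for real model output)
    out = torch.randn(4, 3, requires_grad=True)
    labels = torch.randint(0, 3, (4,))

    loss = F.cross_entropy(out, labels)
    acc = accuracy(out, labels)

    print(loss.requires_grad)            # True  -> still attached to the graph
    print(loss.detach().requires_grad)   # False -> safe to store across batches
    print(acc.requires_grad)             # False -> no graph was ever built

Another option is to run the whole validation pass under `with torch.no_grad():`, which prevents the graph from being built in the first place, so nothing needs detaching at all.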
I hope I answered your question. :call_me_hand:
