Optimizer and loss function when toggling requires_grad in transfer learning

If I change requires_grad to True or False on some layers midway through training (after a few epochs), do I need to reinitialize the optimizer or the loss function?
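
For concreteness, here is a minimal sketch of the situation I mean (the model, layer split, and learning rates are just placeholders, not my actual setup):

```python
import torch
import torch.nn as nn

# Placeholder model: layer 0 stands in for a pretrained backbone,
# layer 2 for a newly added classification head
model = nn.Sequential(
    nn.Linear(10, 32),  # "backbone"
    nn.ReLU(),
    nn.Linear(32, 2),   # "head"
)

# Freeze the backbone at the start of fine-tuning
for p in model[0].parameters():
    p.requires_grad = False

criterion = nn.CrossEntropyLoss()
# Optimizer built only over the parameters that are trainable right now
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-2
)

# ... train for a few epochs ...

# Midway through training: unfreeze the backbone
for p in model[0].parameters():
    p.requires_grad = True

# This is the crux of my question: since `optimizer` was constructed
# before the flip, it never saw the backbone parameters. Is reusing it
# as-is correct, or should I rebuild it, e.g.:
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
```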