Is there a way in PyTorch to stop a training run early, across epochs, similar to what TensorFlow provides through Callbacks (e.g. `EarlyStopping`)?
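For context, plain PyTorch has no built-in callback system; early stopping is usually written by hand in the training loop (higher-level libraries such as PyTorch Lightning do ship an `EarlyStopping` callback). Below is a minimal sketch of the manual approach — the `EarlyStopping` class name and its parameters are illustrative, not a PyTorch API:

```python
class EarlyStopping:
    """Stop training when a monitored validation loss stops improving."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best_loss = float("inf")
        self.counter = 0
        self.should_stop = False

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.should_stop = True
        return self.should_stop


# Usage inside a training loop (losses here are dummy values):
stopper = EarlyStopping(patience=2)
for epoch, val_loss in enumerate([1.0, 0.8, 0.9, 0.85, 0.95]):
    # ... train one epoch, compute val_loss ...
    if stopper.step(val_loss):
        print(f"stopping early at epoch {epoch}")
        break
```

The same pattern generalizes to other TensorFlow-style callbacks (checkpointing, LR scheduling): compute the metric each epoch and act on it directly in the loop.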