Backpropagation
According to many experts, backpropagation is one of the main reasons AI is so successful today. Even better than backpropagation by hand is automated backpropagation, or autograd. Here is a minimal example showing how it works in PyTorch:
import torch

# Create a random tensor and enable gradient tracking on it in place
a = torch.randn(2, 3, 6, 6)
a.requires_grad_()

# loss = sum of a_i**2, so d(loss)/d(a_i) = 2 * a_i
loss = (a**2).sum()
loss.backward()
assert (a.grad == 2 * a).all()
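The same mechanism handles composite functions: autograd applies the chain rule through every operation automatically. A small sketch of this (the function and shapes here are my own illustrative choices, not from the original):

```python
import torch

# Composite function: loss = sum(sin(x)**2).
# The chain rule gives d(loss)/dx = 2 * sin(x) * cos(x),
# which autograd computes for us through both operations.
x = torch.randn(4, requires_grad=True)
loss = torch.sin(x).pow(2).sum()
loss.backward()

expected = 2 * torch.sin(x) * torch.cos(x)
assert torch.allclose(x.grad, expected)
```

Because every operation records itself in the computation graph, no manual derivative of the composite expression is ever written down.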