
Commit 7e70fc2

Fix backpropagation order in optimization tutorial
Reorder optimizer.zero_grad(), loss.backward(), and optimizer.step() so the code matches the ordering recommended in the tutorial's prose. Fixes #3507
1 parent f99e9e8

File tree

1 file changed: +1, -1 lines


beginner_source/basics/optimization_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -158,9 +158,9 @@ def train_loop(dataloader, model, loss_fn, optimizer):
         loss = loss_fn(pred, y)
 
         # Backpropagation
+        optimizer.zero_grad()
         loss.backward()
         optimizer.step()
-        optimizer.zero_grad()
 
         if batch % 100 == 0:
             loss, current = loss.item(), batch * batch_size + len(X)
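For reference, here is a minimal sketch of the corrected training loop as it reads after this change. It assumes dataloader, model, loss_fn, optimizer, and batch_size are set up as elsewhere in the tutorial, and the print format is illustrative rather than exact:

def train_loop(dataloader, model, loss_fn, optimizer, batch_size=64):
    # Put the model in training mode (affects layers like dropout/batchnorm).
    model.train()
    for batch, (X, y) in enumerate(dataloader):
        # Forward pass: compute predictions and the loss for this batch.
        pred = model(X)
        loss = loss_fn(pred, y)

        # Backpropagation, in the recommended order:
        # 1) zero_grad() clears stale gradients (PyTorch accumulates them
        #    across backward() calls by default),
        # 2) backward() populates fresh gradients for this batch,
        # 3) step() applies the parameter update.
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch % 100 == 0:
            loss_val, current = loss.item(), batch * batch_size + len(X)
            print(f"loss: {loss_val:>7f}  [{current:>5d}/{len(dataloader.dataset):>5d}]")

Note that per iteration the two orderings are numerically equivalent: zeroing gradients at the end of one iteration or at the start of the next clears the same values. Zeroing first is still the preferable convention, since it stays correct if an iteration exits early or if code that touches gradients is later inserted between iterations, and it is the pattern the tutorial text describes.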
