From 891f5bc4721bc83ef48c09ff6df4060558dcdb1e Mon Sep 17 00:00:00 2001
From: Eduardo Patrocinio
Date: Tue, 25 Nov 2025 18:58:53 -0500
Subject: [PATCH] Fix polynomial_autograd documentation to match implementation
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

The tutorial code uses exp(x), but the documentation incorrectly stated
it was learning sin(x). This fix updates the documentation to correctly
describe what the code does:

- Changed the documented target function from sin(x) on [-π, π] to
  exp(x) on [-1, 1]
- Added an explanation of the Taylor expansion of the exponential
  function

The code remains unchanged and continues to demonstrate fitting a
polynomial to the exponential function using PyTorch autograd.
---
 beginner_source/examples_autograd/polynomial_autograd.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/beginner_source/examples_autograd/polynomial_autograd.py b/beginner_source/examples_autograd/polynomial_autograd.py
index d33ca8bcb90..abbda9a420e 100755
--- a/beginner_source/examples_autograd/polynomial_autograd.py
+++ b/beginner_source/examples_autograd/polynomial_autograd.py
@@ -2,8 +2,9 @@
 PyTorch: Tensors and autograd
 -------------------------------
-A third order polynomial, trained to predict :math:`y=\sin(x)` from :math:`-\pi`
-to :math:`\pi` by minimizing squared Euclidean distance.
+A third order polynomial, trained to predict :math:`y=e^x` from :math:`-1`
+to :math:`1` by minimizing squared Euclidean distance. The exponential function
+can be approximated by its Taylor expansion: :math:`e^x \approx 1 + x + \frac{x^2}{2} + \frac{x^3}{6} + \ldots`
 
 This implementation computes the forward pass using operations on PyTorch
 Tensors, and uses PyTorch autograd to compute gradients.
 
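For context, the technique the tutorial demonstrates (fitting a third-order polynomial to exp(x) on [-1, 1] with plain PyTorch autograd) can be sketched as below. This is a minimal illustration, not the tutorial file itself; the learning rate, iteration count, and sample count are assumed values, and the loop uses a mean rather than summed loss for a stable step size:

```python
import torch

torch.manual_seed(0)

# Training data: sample exp(x) on [-1, 1].
x = torch.linspace(-1.0, 1.0, 2000)
y = torch.exp(x)

# Polynomial coefficients y_pred = a + b*x + c*x^2 + d*x^3,
# tracked by autograd. Random initialization is illustrative.
a = torch.randn((), requires_grad=True)
b = torch.randn((), requires_grad=True)
c = torch.randn((), requires_grad=True)
d = torch.randn((), requires_grad=True)

learning_rate = 0.1  # assumed value, chosen for the mean-squared loss below
for step in range(5000):
    y_pred = a + b * x + c * x ** 2 + d * x ** 3
    loss = (y_pred - y).pow(2).mean()  # squared distance, averaged
    loss.backward()                    # autograd fills each .grad

    with torch.no_grad():              # update outside the autograd graph
        for p in (a, b, c, d):
            p -= learning_rate * p.grad
            p.grad = None              # reset gradient for the next step

# The fitted coefficients end up close to the Taylor coefficients
# (1, 1, 1/2, 1/6), though a least-squares fit differs slightly.
print(a.item(), b.item(), c.item(), d.item(), loss.item())
```

This also shows why the Taylor-expansion sentence was added to the docstring: the expansion explains why a cubic can approximate exp(x) well on a small interval around zero.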