# Apache MXNet - Crash Course Part 3

Here we show how automatic differentiation can be set up using MXNet. It is a very convenient way to set up backpropagation. Follow along and have fun!

### Basic Usage:

from mxnet import nd, autograd

• Differentiate $f(x) = 2x^2$
x = nd.array([[1,2], [3, 4]])
x

[[1. 2.]
[3. 4.]]
<NDArray 2x2 @cpu(0)>

• In MXNet we can tell an NDArray that we plan to store a gradient by invoking its attach_grad() method.
x.attach_grad()

• Define the function $y = f(x)$ and let MXNet record $y$, so that we can compute gradients later.
• Put the definition inside an autograd.record() scope.
with autograd.record():
    y = 2 * x * x


• Invoke backpropagation by calling y.backward(). When y has more than one entry, y.backward() is equivalent to y.sum().backward().
y.backward()

• If $y = 2x^2$ then $\frac{dy}{dx} = 4x$
x.grad

[[ 4.  8.]
[12. 16.]]
<NDArray 2x2 @cpu(0)>

4 * x

[[ 4.  8.]
[12. 16.]]
<NDArray 2x2 @cpu(0)>
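
As a sanity check independent of MXNet, a central finite difference recovers the same derivative $\frac{dy}{dx} = 4x$ (NumPy is used here only for the check; it is not part of the MXNet workflow above).

```python
import numpy as np

# Numerically approximate d/dx of f(x) = 2x^2 with a central
# finite difference and compare against the analytic gradient 4x.
x = np.array([[1.0, 2.0], [3.0, 4.0]])
eps = 1e-5

f = lambda t: 2 * t * t
numerical_grad = (f(x + eps) - f(x - eps)) / (2 * eps)
analytic_grad = 4 * x

print(np.allclose(numerical_grad, analytic_grad))  # → True
```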


## Using Python control flows