
Automatic Differentiation with autograd

Here we show how Automatic Differentiation can be set up using MXNet. This is a super convenient way to set up backpropagation. Follow along and have fun!

Basic Usage:

from mxnet import nd
from mxnet import autograd
  • Differentiate $f(x) = 2x^2$
x = nd.array([[1, 2], [3, 4]])
x
[[1. 2.]
 [3. 4.]]
<NDArray 2x2 @cpu(0)>
  • In MXNet we can tell an NDArray that we plan to store a gradient by invoking its attach_grad() method.
  • Define the function $y = f(x)$ to let MXNet store $y$, so that we can compute gradients later.
  • Put the definition inside an autograd.record() scope.
x.attach_grad()          # allocate space for the gradient of x
with autograd.record():  # record the computation graph
    y = 2 * x * x

  • Invoke backpropagation by calling y.backward(). When $y$ has more than one entry, y.backward() is equivalent to y.sum().backward() (a short sketch comparing the two follows the output below).
  • If $y = 2x^2$ then $\frac{dy}{dx} = 4x$
y.backward()
x.grad
[[ 4.  8.]
 [12. 16.]]
<NDArray 2x2 @cpu(0)>
4 * x
[[ 4.  8.]
 [12. 16.]]
<NDArray 2x2 @cpu(0)>
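
To see the equivalence mentioned above in action, here is a minimal sketch (the names grad_from_y and grad_from_sum are mine, not from the original post) comparing the gradient produced by calling backward() on the 2x2 output with the one produced by summing to a scalar first:

from mxnet import nd, autograd

x = nd.array([[1, 2], [3, 4]])
x.attach_grad()

# backward() on a non-scalar y implicitly sums it first
with autograd.record():
    y = 2 * x * x
y.backward()
grad_from_y = x.grad.copy()

# explicitly summing to a scalar before backward()
with autograd.record():
    z = (2 * x * x).sum()
z.backward()
grad_from_sum = x.grad.copy()

print((grad_from_y == grad_from_sum).asnumpy())  # all ones: the gradients match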

Using Python control flows
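
One of autograd's strengths is that it records computations that flow through native Python control flow (loops and conditionals), since the graph is built dynamically from whatever operations actually run. The function f below is an illustrative sketch, not code from the original post; because it is piecewise linear in its input, the gradient it produces can be checked against c / a:

from mxnet import nd, autograd

def f(a):
    # the recorded graph depends on the values seen at runtime
    b = a * 2
    while b.norm().asscalar() < 1000:
        b = b * 2
    if b.sum().asscalar() > 0:
        c = b
    else:
        c = 100 * b
    return c

a = nd.random.normal(shape=3)
a.attach_grad()
with autograd.record():
    c = f(a)
c.backward()

# f is piecewise linear in a, so the gradient should equal c / a elementwise
print((a.grad == c / a).asnumpy())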

Bring Your Own Cause

If you think any info here has remotely helped you, consider dropping a penny for this cause, just click me. Unfortunately, there are plenty of sad things happening all over the world; if you have a different cause or charity you'd rather support, please do. And if you did make a donation, please drop a note to me (annotated) or leave a comment here (anonymous is OK!) and I will use that as motivation to write more useful content here.

If you like topics such as this, please consider subscribing to my podcast, where I talk to some of the stalwarts in tech and ask them what their favorite productivity hacks are:

Available on iTunes Podcast

Visit Void Star Podcast’s page on the iTunes Podcast Portal. Please click ‘Subscribe’ and leave a comment.
