autograd
A collection of automatic differentiation tools for operations on ndarrays.
Usage
First, install the library using npm:
npm install autograd
Then you can import the library by doing:
const autograd = require('autograd');
Then you can use the functions as in the following example:
const autograd = require('autograd');
const ndarray = require('ndarray');
const ops = require('ndarray-ops');

// re-create multiplication as a differentiable function.
// NOTE: the call expressions in this copy of the README were lost;
// the method names used below (`op`, `variable`) are reconstructions
// and may differ from the package's actual API.
const mul = autograd.op(ops.mul);

// call the op with constants
const x1 = ndarray(new Float64Array(10).fill(2));
const x2 = ndarray(new Float64Array(10).fill(4));

let y = mul(x1, x2);
// y = [8 8 8 ...]

// ...

// call the op with variables!
// create differentiable variables to track gradients.
const varx1 = autograd.variable(x1);
const varx2 = autograd.variable(x2);

y = mul(varx1, varx2);
// y = [8 8 8 ...]

// with variables, access to gradients via `<variable>.grad` is available.
// call backward to compute partial derivatives of `y` w.r.t. each of its inputs.
autograd.backward(y);

// backward also accepts input gradient information from external sources
// (here, `gradient` stands for an ndarray of upstream gradients).
// this by default makes `y.grad` === the gradient input.
// Without a gradient input, `y.grad` === dy/dy === [1 1 1 ...].
autograd.backward(y, gradient);

// ** pulls out Calculus book **
// At a glance, `autograd.backward(y)` does the following:
//
// y = x1 * x2
// dy/dx1 = d(y) / dx1
// dy/dx1 = d(x1 * x2) / dx1  (x2 treated as constant)
// dy/dx1 = (dx1 * x2) / dx1  (take the derivative w.r.t x1)
// dy/dx1 = x2                (dx1/dx1 = 1)
//
// asserting the identity of our partial derivatives,
// (dy/dy) * (dy/dx1) = (dy/dx1)
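The derivation above can be checked with a tiny, self-contained reverse-mode sketch in plain JavaScript (no dependencies). The `variable`, `mul`, and `backward` helpers here are illustrative stand-ins written for this note, not the package's API; they operate on scalars rather than ndarrays:

```javascript
// Minimal reverse-mode autodiff on scalars, enough to check dy/dx1 = x2.
function variable(value) {
  return { value, grad: 0, backward: () => {} };
}

function mul(a, b) {
  const out = variable(a.value * b.value);
  out.backward = (seed) => {
    out.grad = seed;          // dy/dy, or an externally supplied gradient input
    a.grad += seed * b.value; // dy/da = b, scaled by the incoming gradient
    b.grad += seed * a.value; // dy/db = a
  };
  return out;
}

const x1 = variable(2);
const x2 = variable(4);
const y = mul(x1, x2);

y.backward(1); // seed with dy/dy = 1
console.log(y.value); // 8
console.log(x1.grad); // 4  (= x2, as derived above)
console.log(x2.grad); // 2  (= x1)
```

Passing a value other than 1 to `backward` plays the role of the external gradient input described above: every downstream partial derivative is scaled by it.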