An F# library mainly for Automatic Differentiation (AD). You can define arbitrary mathematical functions and compute derivatives of scalar functions and gradients of vector functions.
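Since the library's own API isn't shown here, the core idea of computing an exact derivative via AD (rather than numeric approximation) can be sketched language-agnostically; this illustrative Python version uses forward-mode AD with dual numbers, where all names (`Dual`, `derivative`) are hypothetical:

```python
class Dual:
    """Dual number val + der*eps (eps^2 = 0); der carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Exact derivative of f at x, obtained by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).der

# f(x) = x^2 + 3x  =>  f'(x) = 2x + 3, so f'(2) = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Unlike finite differences, the result is exact up to floating-point rounding, because the derivative is propagated through each arithmetic operation by the chain rule.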

You can also compose cascaded functions and calculate derivatives with respect to both inputs and intermediate variables using reverse-mode AD, which is the generalized form of backpropagation.

With this method you can compute the partial derivatives of a neural network's output with respect to all of its weights, i.e. the gradients of the weight matrices.
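How reverse-mode AD yields gradients with respect to both weights and intermediate variables can be sketched as follows; this is an illustrative, minimal Python tape (names `Var`, `backward` are hypothetical, not the library's API), applied to a single linear neuron:

```python
class Var:
    """Node in the computation graph; grad is filled in by backward()."""
    def __init__(self, val, parents=()):
        self.val = val
        self.grad = 0.0
        self.parents = parents  # pairs of (parent node, local derivative)

    def __add__(self, other):
        return Var(self.val + other.val, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.val * other.val,
                   ((self, other.val), (other, self.val)))

    def backward(self):
        """Reverse sweep: topological order, then chain rule accumulation."""
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for parent, _ in v.parents:
                    visit(parent)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, local in v.parents:
                parent.grad += v.grad * local

# One linear neuron: y = w1*x1 + w2*x2, with intermediate u = w1*x1
w1, w2 = Var(0.5), Var(-1.5)
x1, x2 = Var(2.0), Var(3.0)
u = w1 * x1          # intermediate variable
y = u + w2 * x2
y.backward()
print(w1.grad, w2.grad, u.grad)  # 2.0 -> dy/dw1=x1, 3.0 -> dy/dw2=x2, 1.0 -> dy/du
```

One backward sweep delivers the derivative of `y` with respect to every weight and every intermediate node at once; applied layer by layer to a full network, this is exactly backpropagation.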