An F# library focused on Automatic Differentiation (AD). You can define any kind of mathematical function and compute derivatives of scalar functions and gradients of vector functions.
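To make the idea concrete, here is a minimal forward-mode AD sketch in F# using dual numbers. This is an illustration of the technique, not the library's actual API: each value carries its derivative alongside it, and arithmetic propagates both.

```fsharp
// A dual number: a value paired with its derivative.
// Arithmetic on duals applies the usual differentiation rules.
type Dual =
    { Value: float; Deriv: float }
    static member (+) (a: Dual, b: Dual) =
        { Value = a.Value + b.Value; Deriv = a.Deriv + b.Deriv }
    static member (*) (a: Dual, b: Dual) =
        // Product rule: (uv)' = u'v + uv'
        { Value = a.Value * b.Value
          Deriv = a.Deriv * b.Value + a.Value * b.Deriv }

// Chain rule for an elementary function, e.g. sin
let dSin (a: Dual) = { Value = sin a.Value; Deriv = cos a.Value * a.Deriv }

// Derivative of f at x: seed the input's derivative with 1.0
let diff f x = (f { Value = x; Deriv = 1.0 }).Deriv

// f(x) = sin (x * x), so f'(x) = 2x * cos (x^2)
let d = diff (fun x -> dSin (x * x)) 2.0
printfn "f'(2) = %f" d
```

Forward mode like this is cheap for scalar inputs; for functions with many inputs (such as neural network weights), reverse mode is the efficient choice, as the next paragraph describes.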

You can also compose functions into cascades and compute derivatives with respect to both inputs and intermediate variables using reverse-mode AD. This is the generalized form of backpropagation.
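A hedged sketch of how reverse-mode AD works (again, not the library's API): the forward pass records a tape of operations, and the backward pass replays it in reverse, accumulating adjoints. Note that intermediate variables get their adjoints too, which is exactly the point made above.

```fsharp
open System.Collections.Generic

// A node holds a value and its adjoint (dOutput/dThisNode).
type Node = { Value: float; mutable Adj: float }

// The tape: backward actions recorded during the forward pass.
let tape = List<unit -> unit> ()

let leaf v = { Value = v; Adj = 0.0 }

let mul (a: Node) (b: Node) =
    let n = leaf (a.Value * b.Value)
    tape.Add (fun () ->
        a.Adj <- a.Adj + n.Adj * b.Value
        b.Adj <- b.Adj + n.Adj * a.Value)
    n

let add (a: Node) (b: Node) =
    let n = leaf (a.Value + b.Value)
    tape.Add (fun () ->
        a.Adj <- a.Adj + n.Adj
        b.Adj <- b.Adj + n.Adj)
    n

let sinN (a: Node) =
    let n = leaf (sin a.Value)
    tape.Add (fun () -> a.Adj <- a.Adj + n.Adj * cos a.Value)
    n

// Reverse pass: seed the output's adjoint, replay the tape backwards.
let backward (out: Node) =
    out.Adj <- 1.0
    for i in tape.Count - 1 .. -1 .. 0 do tape.[i] ()

// y = sin (x1 * x2) + x1; the intermediate u = x1 * x2 also receives dy/du.
let x1, x2 = leaf 2.0, leaf 3.0
let u = mul x1 x2
let y = add (sinN u) x1
backward y
printfn "dy/dx1 = %f  dy/dx2 = %f  dy/du = %f" x1.Adj x2.Adj u.Adj
```

One reverse pass yields the derivatives with respect to every input and intermediate at once, which is why this scales to the many-parameter case below.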

You can calculate the partial derivatives of all the weights in a neural network (in other words, the gradients of the weight matrices) using this method.
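For intuition, here is the hand-derived gradient that reverse AD automates for a single-layer network. This is a standalone sketch with made-up toy values: for the loss L = ½‖σ(Wx) − y‖², the chain rule gives ∂L/∂W = δ xᵀ, where δ = (a − y) ⊙ a ⊙ (1 − a) and a = σ(Wx).

```fsharp
// Hand-coded gradient of a one-layer sigmoid network,
// i.e. what reverse-mode AD would produce automatically.
let sigmoid z = 1.0 / (1.0 + exp (-z))

let x = [| 1.0; 2.0 |]               // toy input
let y = [| 0.0 |]                    // toy target
let W = array2D [ [ 0.5; -0.3 ] ]    // 1x2 weight matrix

// Forward pass: z = W x, a = sigmoid z
let z =
    Array.init (Array2D.length1 W) (fun i ->
        Array.init (Array2D.length2 W) (fun j -> W.[i, j] * x.[j])
        |> Array.sum)
let a = Array.map sigmoid z

// Backward pass: delta = (a - y) .* a .* (1 - a)
let delta = Array.mapi (fun i ai -> (ai - y.[i]) * ai * (1.0 - ai)) a

// Gradient of the weight matrix: dL/dW.[i, j] = delta.[i] * x.[j]
let gradW =
    Array2D.init (Array2D.length1 W) (Array2D.length2 W)
        (fun i j -> delta.[i] * x.[j])
```

With reverse AD you never write `delta` by hand; the tape produces the same gradient for arbitrarily deep compositions.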

I just checked it out. This is a really comprehensive library. The same author is working on a new neural network library called Hype; I am looking forward to it.

This made me seriously consider F# for machine learning. I have been using R, but now I think F# has numerous advantages over both R and Haskell.