ForwardDiff

ForwardDiff implements methods to take derivatives, gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object, really) using forward mode automatic differentiation (AD).
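For a first taste, here is a minimal sketch of how these calls can look. The functions f and g below are made up purely for illustration; the ForwardDiff calls shown (gradient, hessian, derivative, jacobian) are the package's documented entry points:

julia> using ForwardDiff

julia> f(x) = sum(sin, x) + prod(x);   # example scalar-valued function of a vector

julia> x = rand(3);

julia> ForwardDiff.gradient(f, x);     # length-3 gradient vector of f at x

julia> ForwardDiff.hessian(f, x);      # 3×3 Hessian matrix of f at x

julia> ForwardDiff.derivative(sin, 1.0);   # scalar derivative, ≈ cos(1.0)

julia> g(x) = [x[1] * x[2], prod(x), sum(x)];   # example vector-valued function

julia> ForwardDiff.jacobian(g, x);     # 3×3 Jacobian matrix of g at x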

While performance can vary depending on the functions you evaluate, the algorithms implemented by ForwardDiff generally outperform non-AD algorithms in both speed and accuracy.

Wikipedia's automatic differentiation entry is a useful resource for learning about the advantages of AD techniques over other common differentiation methods (such as finite differencing).

ForwardDiff is a registered Julia package, so it can be installed by running:

julia> Pkg.add("ForwardDiff")

If you find ForwardDiff useful in your work, we kindly request that you cite our paper. The relevant BibLaTeX entry is available in ForwardDiff's README (not included here because BibLaTeX doesn't play nice with Documenter/Jekyll).