Differentiation tools in Julia

Computing derivatives

Derivatives are required by many numerical algorithms, even for functions f given as chunks of computer code rather than simple mathematical expressions. There are various ways to compute the gradient, Jacobian or Hessian of such functions without tedious manual labor:

  • Symbolic differentiation, which uses computer algebra to work out explicit formulas

  • Numerical differentiation, which relies on variants of the finite difference approximation f'(x) ≈ (f(x+ε) − f(x)) / ε

  • Automatic differentiation (AD), which reinterprets the code of f either in "forward" or "reverse" mode
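To make the distinction concrete, here is a minimal, self-contained sketch of forward-mode AD via operator overloading on dual numbers, compared against a finite difference. The `Dual` type and the function `f` are illustrative only, not taken from any package:

```julia
# A toy dual number: carries a value and a derivative together.
struct Dual
    val::Float64   # function value
    der::Float64   # derivative value
end

# Overloaded operations propagate derivatives alongside values.
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

# Differentiate f(x) = sin(x) * x at x = 2 by seeding the derivative with 1.
f(x) = sin(x) * x
ad = f(Dual(2.0, 1.0)).der

# Numerical differentiation via the forward finite-difference approximation.
ε = 1e-8
fd = (f(2.0 + ε) - f(2.0)) / ε

# Exact derivative, computed by hand for comparison: f'(x) = sin(x) + x*cos(x).
exact = sin(2.0) + 2.0 * cos(2.0)
```

Note how the dual-number result matches the exact derivative to machine precision, while the finite-difference result is limited by the choice of ε.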

In machine learning, automatic differentiation is probably the most widely used paradigm, especially in reverse mode. However, each method has its own strengths and tradeoffs, which are detailed in the following papers, along with implementation techniques like "operator overloading" and "source transformation":

Automatic differentiation in machine learning: a survey, Baydin et al. (2018)

A review of automatic differentiation and its efficient implementation, Margossian (2019)

What is JuliaDiff?

JuliaDiff is an informal GitHub organization which aims to unify and document packages written in Julia for evaluating derivatives. The technical features of Julia[1] make implementing and using differentiation techniques easier than ever before (in our biased opinion).

Discussions on JuliaDiff and its uses may be directed to the Julia Discourse forum. The ChainRules project maintains a list of recommended reading for those seeking more information. The autodiff.org site serves as a portal for the academic community, though it is often out of date.

The Big List

What follows is a big list of Julia differentiation packages and related tooling, last updated in January 2024. If you notice something inaccurate or outdated, please open an issue to report it. The packages marked as inactive are those which have had no release in 2023.

The list aims to be comprehensive in coverage. By necessity, this means it is not comprehensive in detail. It is worth investigating each package yourself to really understand its ins and outs, and the pros and cons of its competitors.

Reverse mode automatic differentiation

Forward mode automatic differentiation

Symbolic differentiation

Numerical differentiation

Higher order


Interfaces

  • gdalle/DifferentiationInterface.jl: Generic interface for first- and second-order differentiation with any AD backend on 1-argument functions (f(x) = y or f!(y, x)).

  • JuliaDiff/AbstractDifferentiation.jl: Generic interface for first- and second-order differentiation with a subset of AD backends on functions with more than one argument (will soon wrap DifferentiationInterface.jl).
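As a rough illustration, a first-order computation through such a generic interface might look as follows. This assumes DifferentiationInterface.jl and ForwardDiff.jl are installed, and the exact API may have evolved since this list was last updated:

```julia
# Hedged sketch of DifferentiationInterface.jl usage with a ForwardDiff backend.
using DifferentiationInterface
import ForwardDiff  # any supported AD backend can be swapped in here

backend = AutoForwardDiff()   # backend selector (from ADTypes.jl, re-exported)
f(x) = sum(abs2, x)           # f : vector -> scalar
x = [1.0, 2.0, 3.0]

g = gradient(f, backend, x)   # gradient of f at x, here 2x
```

Switching to another backend (say, reverse mode) should only require changing the `backend` object, not the call site.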


These packages define derivatives for basic functions, and enable users to do the same:
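For instance, with ChainRulesCore.jl a user can attach a custom reverse-mode rule to their own function; `mysquare` below is a made-up example, not part of any library:

```julia
# Hedged sketch: defining a custom reverse-mode rule with ChainRulesCore.jl.
using ChainRulesCore

mysquare(x) = x^2

# rrule returns the primal value and a pullback mapping output cotangents
# to input cotangents; NoTangent() is the cotangent for the function itself.
function ChainRulesCore.rrule(::typeof(mysquare), x)
    y = mysquare(x)
    mysquare_pullback(ȳ) = (NoTangent(), 2x * ȳ)
    return y, mysquare_pullback
end

# AD packages that consume ChainRules (e.g. Zygote.jl) will now use this rule
# instead of differentiating through the body of mysquare.
y, pb = ChainRulesCore.rrule(mysquare, 3.0)
```
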


Differentiating through more stuff

Some complex algorithms are not natively differentiable, which is why custom derivatives for them are provided by the following packages:

Inactive packages

[1] namely, multiple dispatch, source code via reflection, just-in-time compilation, and first-class access to expression parsing