JuliaDiff
Differentiation tools in Julia
Computing derivatives
Derivatives are required by many numerical algorithms, even for functions given as chunks of computer code rather than simple mathematical expressions. There are various ways to compute the gradient, Jacobian or Hessian of such functions without tedious manual labor:
Symbolic differentiation, which uses computer algebra to work out explicit formulas
Numerical differentiation, which relies on variants of the finite difference approximation
Automatic differentiation (AD), which reinterprets the code of the function itself, in either "forward" or "reverse" mode
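For concreteness, here is a minimal sketch of the three approaches applied to the same scalar function. It assumes that ForwardDiff.jl, FiniteDifferences.jl and Symbolics.jl (all introduced in the list further down) are installed; exact function names may vary across package versions.

```julia
using ForwardDiff, FiniteDifferences, Symbolics

f(x) = sin(x) + x^2  # derivative should be cos(x) + 2x

# Automatic differentiation (forward mode, operator overloading)
ForwardDiff.derivative(f, 1.0)

# Numerical differentiation (5-point central finite difference)
central_fdm(5, 1)(f, 1.0)

# Symbolic differentiation (explicit formula, evaluated afterwards)
@variables x
df = Symbolics.derivative(f(x), x)
substitute(df, Dict(x => 1.0))
```

All three calls should agree with the exact value cos(1) + 2 up to numerical error.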
In machine learning, automatic differentiation is probably the most widely used paradigm, especially in reverse mode. However, each method has its own upsides and tradeoffs, which are detailed in the following papers, along with implementation techniques like "operator overloading" and "source transformation":
Automatic differentiation in machine learning: a survey, Baydin et al. (2018)
A review of automatic differentiation and its efficient implementation, Margossian (2019)
What is JuliaDiff?
JuliaDiff is an informal GitHub organization which aims to unify and document packages written in Julia for evaluating derivatives. The technical features of Julia[1] make implementing and using differentiation techniques easier than ever before (in our biased opinion).
Discussions on JuliaDiff and its uses may be directed to the Julia Discourse forum. The ChainRules project maintains a list of recommended reading for those after more information. The autodiff.org site serves as a portal for the academic community, though it is often out of date.
The Big List
What follows is a big list of Julia differentiation packages and related tooling, last updated in January 2024. If you notice something inaccurate or outdated, please open an issue to signal it. The packages marked as inactive are those which have had no release in 2023.
The list aims to be comprehensive in coverage. By necessity, this means it is not comprehensive in detail. It is worth investigating each package yourself to really understand its ins and outs, and the pros and cons of its competitors.
Reverse mode automatic differentiation
JuliaDiff/ReverseDiff.jl: Operator overloading AD backend
FluxML/Zygote.jl: Source transformation AD backend
EnzymeAD/Enzyme.jl: LLVM-level source transformation AD backend
FluxML/Tracker.jl: Operator overloading AD backend
compintell/Tapir.jl: Source transformation AD backend (experimental)
dfdx/Yota.jl: Source transformation AD backend
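As an illustration (a sketch only, assuming Zygote.jl is installed), reverse mode computes the gradient of a scalar-valued function in a single backward sweep, regardless of the number of inputs:

```julia
using Zygote

loss(w) = sum(abs2, w) / length(w)  # scalar output, many inputs

w = randn(1000)
g, = Zygote.gradient(loss, w)       # full 1000-element gradient in one reverse pass
```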
Forward mode automatic differentiation
JuliaDiff/ForwardDiff.jl: Operator overloading AD backend
JuliaDiff/PolyesterForwardDiff.jl: Multithreaded version of ForwardDiff.jl
EnzymeAD/Enzyme.jl: LLVM-level source transformation AD backend
JuliaDiff/Diffractor.jl: Source transformation AD backend (experimental)
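A minimal sketch with ForwardDiff.jl (assuming it is installed): forward mode propagates dual numbers through the code, and is typically the method of choice when the number of inputs is small.

```julia
using ForwardDiff

f(x) = [x[1] * x[2], sin(x[1]) + exp(x[2])]

x = [1.0, 2.0]
ForwardDiff.jacobian(f, x)                  # 2×2 Jacobian matrix
ForwardDiff.gradient(v -> sum(abs2, v), x)  # gradient of a scalar function
```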
Symbolic differentiation
JuliaSymbolics/Symbolics.jl: Pure Julia computer algebra system with support for fast and sparse analytical derivatives
brianguenter/FastDifferentiation.jl: Generate efficient executables for symbolic derivatives
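A sketch of what symbolic differentiation looks like with Symbolics.jl (assuming the package is installed; API details may differ between versions):

```julia
using Symbolics

@variables x y
expr = x^2 * sin(y) + y

# Explicit partial derivative as a symbolic expression
Dx = Differential(x)
expand_derivatives(Dx(expr))                 # 2x*sin(y)

# Symbolic Jacobian of a vector of expressions
Symbolics.jacobian([expr, x + y^2], [x, y])
```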
Numeric differentiation
JuliaDiff/FiniteDifferences.jl: Finite differences with support for arbitrary types and higher order schemes
JuliaDiff/FiniteDiff.jl: Finite differences with support for caching and sparsity
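A brief sketch of both packages computing the same gradient (assuming they are installed; signatures may vary across versions):

```julia
using FiniteDifferences, FiniteDiff

f(x) = sum(abs2, x)
x = rand(3)

# FiniteDifferences.jl: 5-point central scheme for the first derivative
FiniteDifferences.grad(central_fdm(5, 1), f, x)

# FiniteDiff.jl: in-place gradient with preallocated output
g = similar(x)
FiniteDiff.finite_difference_gradient!(g, f, x)
```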
Higher order
JuliaDiff/TaylorSeries.jl: Taylor polynomial expansions in one or more variables
JuliaDiff/TaylorDiff.jl: Higher order directional derivatives (experimental)
JuliaDiff/Diffractor.jl: Source transformation AD backend (experimental)
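For instance, TaylorSeries.jl overloads elementary functions on truncated Taylor polynomials, so higher-order derivatives fall out of the expansion coefficients (a sketch, assuming the package is installed):

```julia
using TaylorSeries

t = Taylor1(Float64, 4)   # independent variable, truncated at order 4
p = exp(sin(t))           # Taylor expansion of exp(sin(t)) around t = 0
getcoeff(p, 2)            # coefficient of t^2, i.e. f''(0) / 2!
```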
Interfaces
gdalle/DifferentiationInterface.jl: Generic interface for first- and second-order differentiation with any AD backend on 1-argument functions (f(x) = y or f!(y, x)).
JuliaDiff/AbstractDifferentiation.jl: Generic interface for first- and second-order differentiation with a subset of AD backends on functions with more than one argument (will soon wrap DifferentiationInterface.jl).
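A sketch of the DifferentiationInterface.jl API (assuming ForwardDiff.jl and Zygote.jl are installed as backends; the exact entry points may evolve). Backends are selected with ADTypes.jl structs such as AutoForwardDiff(), which DifferentiationInterface re-exports:

```julia
using DifferentiationInterface
import ForwardDiff, Zygote

f(x) = sum(abs2, x)
x = float.(1:3)

# The same generic call, dispatched to different backends
value_and_gradient(f, AutoForwardDiff(), x)
value_and_gradient(f, AutoZygote(), x)
```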
Rulesets
These packages define derivatives for basic functions, and enable users to do the same:
JuliaDiff/ChainRules: Ecosystem for backend-agnostic forward and reverse rules.
JuliaDiff/ChainRulesCore.jl: Core API for users to add rules to their package.
JuliaDiff/ChainRules.jl: Rules for Julia Base and standard libraries.
JuliaDiff/ChainRulesTestUtils.jl: Tools for testing rules defined with ChainRulesCore.jl.
ThummeTo/ForwardDiffChainRules.jl: Translate rules from ChainRulesCore.jl to make them compatible with ForwardDiff.jl
JuliaDiff/DiffRules.jl: Scalar rules used by e.g. ForwardDiff.jl, ReverseDiff.jl, Tracker.jl, and Symbolics.jl
EnzymeAD/EnzymeRules.jl: Rule definition API for Enzyme.jl
FluxML/ZygoteRules.jl: Some rules used by Zygote.jl (mostly deprecated in favor of ChainRules.jl).
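For example, a custom reverse rule defined with ChainRulesCore.jl looks roughly like this (a sketch; myexp is just a stand-in function):

```julia
using ChainRulesCore

myexp(x) = exp(x)  # stand-in for a function whose AD rule we want to define

function ChainRulesCore.rrule(::typeof(myexp), x)
    y = exp(x)
    myexp_pullback(ȳ) = (NoTangent(), ȳ * y)  # cotangents w.r.t. (function, x)
    return y, myexp_pullback
end
```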
Sparsity
JuliaDiff/SparseDiffTools.jl: Exploit sparsity to speed up FiniteDiff.jl and ForwardDiff.jl, as well as other algorithms.
adrhill/SparseConnectivityTracer.jl: Sparsity pattern detection for Jacobians and Hessians.
gdalle/SparseMatrixColorings.jl: Efficient coloring and decompression algorithms for sparse Jacobians and Hessians.
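A rough sketch of sparsity pattern detection with SparseConnectivityTracer.jl (assuming the package is installed; it implements the ADTypes.jl detector interface):

```julia
using SparseConnectivityTracer

f(x) = [x[1]^2, x[2] * x[3], x[3] + 1]

# Boolean sparsity pattern of the Jacobian, computed without numerical values
jacobian_sparsity(f, rand(3), TracerSparsityDetector())
```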
Differentiating through more stuff
Some complex algorithms are not natively differentiable, which is why derivatives have been implemented in the following packages:
SciML: For a lot of different domains of scientific machine learning: differential equations, linear and nonlinear systems, optimization problems, etc.
gdalle/ImplicitDifferentiation.jl: For generic algorithms specified by output conditions, thanks to the implicit function theorem
jump-dev/DiffOpt.jl: For convex optimization problems
axelparmentier/InferOpt.jl: For combinatorial optimization problems
gaurav-arya/StochasticAD.jl: Differentiation of functions with stochastic behavior (experimental)
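As an example of the implicit-function approach, here is a sketch based on ImplicitDifferentiation.jl (argument conventions may differ slightly between versions):

```julia
using ImplicitDifferentiation

# y(x) is defined implicitly by the condition y .^ 2 .- x == 0;
# the "solver" below is treated as a black box by AD backends.
forward(x) = sqrt.(x)
conditions(x, y) = y .^ 2 .- x

implicit = ImplicitFunction(forward, conditions)
implicit([4.0, 9.0])   # derivatives w.r.t. x come from the implicit function theorem
```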
Inactive packages
[1] Namely: multiple dispatch, source code via reflection, just-in-time compilation, and first-class access to expression parsing.