Tutorial
We present a typical workflow with DifferentiationInterfaceTest.jl, building on the tutorial of the DifferentiationInterface.jl documentation (which we encourage you to read first).
julia> using DifferentiationInterface, DifferentiationInterfaceTest
julia> import ForwardDiff, Zygote
Introduction
The AD backends we want to compare are ForwardDiff.jl and Zygote.jl.
backends = [AutoForwardDiff(), AutoZygote()]
2-element Vector{ADTypes.AbstractADType}:
AutoForwardDiff()
AutoZygote()
To do that, we are going to take gradients of a simple function:
f(x::AbstractArray) = sum(sin, x)
f (generic function with 1 method)
Of course we know the true gradient mapping:
∇f(x::AbstractArray) = cos.(x)
∇f (generic function with 1 method)
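Indeed, since f(x) = ∑ᵢ sin(xᵢ), each partial derivative is ∂f/∂xᵢ = cos(xᵢ). As a quick sanity check (our own addition, not part of the package), we can verify the reference at a point where the gradient is known exactly:
julia> ∇f(zeros(3)) == ones(3)  # cos(0) == 1 in every coordinate
true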
DifferentiationInterfaceTest.jl relies on so-called "scenarios", which encapsulate all the information needed for a test:
- the operator category (:gradient)
- the behavior of the operator (either :in or :out of place; an :in sketch follows the scenario definitions below)
- the function f
- the input x of the function f
- the reference first-order result res1 of the operator
xv = rand(Float32, 3)
xm = rand(Float64, 3, 2)
scenarios = [
Scenario{:gradient,:out}(f, xv; res1=∇f(xv)),
Scenario{:gradient,:out}(f, xm; res1=∇f(xm))
];
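These two scenarios only exercise the out-of-place operator. To also cover the in-place behavior mentioned above, a sketch (assuming your version of DifferentiationInterfaceTest.jl accepts :in in the same constructor, which corresponds to the mutating gradient! operator) could look like:
scenarios_inplace = [
    Scenario{:gradient,:in}(f, xv; res1=∇f(xv)),  # assumed in-place variant: exercises gradient!
];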
Testing
The main entry point for testing is the function test_differentiation. It has many options, but the main ingredients are the following:
julia> test_differentiation(
           backends,  # the backends you want to compare
           scenarios;  # the scenarios you defined
           correctness=true,  # compares values against the reference
           type_stability=:none,  # checks type stability with JET.jl
           detailed=true,  # prints a detailed test set
       )
Test Summary:                                                  |  Pass  Total  Time
Testing correctness                                            |    88     88  9.0s
  AutoForwardDiff()                                            |    44     44  5.6s
    gradient                                                   |    44     44  5.6s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32  |    22     22  3.4s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  |    22     22  2.1s
  AutoZygote()                                                 |    44     44  3.3s
    gradient                                                   |    44     44  3.3s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32  |    22     22  2.6s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  |    22     22  0.7s
Benchmarking
Once you are confident that your backends give the correct answers, you probably want to compare their performance. This is made easy by the benchmark_differentiation function, whose syntax should feel familiar:
df = benchmark_differentiation(backends, scenarios);
| Row | backend | scenario | operator | prepared | calls | samples | evals | time | allocs | bytes | gc_fraction | compile_fraction |
|-----|---------|----------|----------|----------|-------|---------|-------|------|--------|-------|-------------|------------------|
|     | Abstract… | Scenario… | Symbol | Bool | Int64 | Int64 | Int64 | Float64 | Float64 | Float64 | Float64 | Float64 |
| 1 | AutoForwardDiff() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | value_and_gradient | true | 1 | 52733 | 199 | 5.47739e-8 | 3.0 | 112.0 | 0.0 | 0.0 |
| 2 | AutoForwardDiff() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | gradient | true | 1 | 29245 | 619 | 4.51567e-8 | 2.0 | 80.0 | 0.0 | 0.0 |
| 3 | AutoForwardDiff() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | value_and_gradient | true | 1 | 28471 | 205 | 1.3982e-7 | 4.0 | 192.0 | 0.0 | 0.0 |
| 4 | AutoForwardDiff() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | gradient | true | 1 | 33181 | 197 | 1.26426e-7 | 3.0 | 160.0 | 0.0 | 0.0 |
| 5 | AutoZygote() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | value_and_gradient | true | 1 | 29506 | 34 | 8.25941e-7 | 24.0 | 672.0 | 0.0 | 0.0 |
| 6 | AutoZygote() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | gradient | true | 1 | 30013 | 45 | 6.26044e-7 | 22.0 | 608.0 | 0.0 | 0.0 |
| 7 | AutoZygote() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | value_and_gradient | true | 1 | 29449 | 83 | 3.37976e-7 | 10.0 | 464.0 | 0.0 | 0.0 |
| 8 | AutoZygote() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | gradient | true | 1 | 29181 | 87 | 3.23943e-7 | 10.0 | 464.0 | 0.0 | 0.0 |
The resulting object is a DataFrame from DataFrames.jl, whose columns correspond to the fields of DifferentiationBenchmarkDataRow.
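Since df is an ordinary DataFrame, the results can be post-processed with standard DataFrames.jl tools. As a small sketch (assuming DataFrames.jl is available in your environment), one could extract the fastest measured time per backend and operator:
julia> using DataFrames
julia> combine(groupby(df, [:backend, :operator]), :time => minimum => :min_time)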