Tutorial
We present a typical workflow with DifferentiationInterfaceTest.jl, building on the tutorial in the DifferentiationInterface.jl documentation (which we encourage you to read first).
```julia
julia> using DifferentiationInterface, DifferentiationInterfaceTest

julia> import ForwardDiff, Zygote
```
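If these packages are not yet in your environment, you can install them first; a standard Pkg invocation (not part of the original tutorial), assuming all four are available in the General registry:

```julia
using Pkg

# Install the interface, its test utilities, and the two backends.
Pkg.add([
    "DifferentiationInterface", "DifferentiationInterfaceTest",
    "ForwardDiff", "Zygote",
])
```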
Introduction
The AD backends we want to compare are ForwardDiff.jl and Zygote.jl.
```julia
backends = [AutoForwardDiff(), AutoZygote()]
```

```
2-element Vector{ADTypes.AbstractADType}:
 AutoForwardDiff()
 AutoZygote()
```
To do that, we are going to take gradients of a simple function:
```julia
f(x::AbstractArray) = sum(sin, x)
```

```
f (generic function with 1 method)
```
Of course we know the true gradient mapping:
```julia
∇f(x::AbstractArray) = cos.(x)
```

```
∇f (generic function with 1 method)
```
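As a quick sanity check (not part of the original tutorial), you can confirm that the analytical gradient agrees with what a backend computes, using the `gradient` operator exported by DifferentiationInterface.jl; the input `x0` is a hypothetical example:

```julia
# Sanity check: the backend gradient should match the analytical one.
x0 = rand(Float32, 3)
gradient(f, AutoForwardDiff(), x0) ≈ ∇f(x0)  # should return true
```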
DifferentiationInterfaceTest.jl relies on so-called "scenarios", in which you encapsulate the information needed for your test:
- the operator category (`:gradient`)
- the behavior of the operator (either `:in` or `:out` of place)
- the function `f`
- the input `x` of the function `f`
- the reference first-order result `res1` of the operator
```julia
xv = rand(Float32, 3)
xm = rand(Float64, 3, 2)

scenarios = [
    Scenario{:gradient,:out}(f, xv; res1=∇f(xv)),
    Scenario{:gradient,:out}(f, xm; res1=∇f(xm))
];
```
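Scenarios can also exercise in-place operators (the `:in` behavior mentioned in the list above). A minimal sketch, assuming the same constructor syntax carries over to the in-place case:

```julia
# Sketch: an in-place scenario, where gradient! is expected to overwrite
# a preallocated result array instead of returning a fresh one.
scenarios_inplace = [
    Scenario{:gradient,:in}(f, xv; res1=∇f(xv)),
];
```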
Testing
The main entry point for testing is the function `test_differentiation`. It has many options, but the main ingredients are the following:
```julia
julia> test_differentiation(
           backends,  # the backends you want to compare
           scenarios,  # the scenarios you defined
           correctness=true,  # compares values against the reference
           type_stability=:none,  # checks type stability with JET.jl
           detailed=true,  # prints a detailed test set
       )
```
```
Test Summary:                                                  | Pass  Total   Time
Testing correctness                                            |   96     96  10.3s
  AutoForwardDiff()                                            |   48     48   6.7s
    gradient                                                   |   48     48   6.7s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32  |   24     24   3.5s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  |   24     24   2.3s
  AutoZygote()                                                 |   48     48   3.6s
    gradient                                                   |   48     48   3.5s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32  |   24     24   2.7s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  |   24     24   0.9s
```
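The same call can also check type stability with JET.jl. The sketch below assumes that `:prepared` is a valid setting of `type_stability` in your version of the package (restricting the analysis to prepared operators):

```julia
# Sketch: re-run the tests with JET.jl type-stability checks on the
# prepared operators (assumes type_stability=:prepared is supported).
test_differentiation(
    backends,
    scenarios,
    correctness=true,
    type_stability=:prepared,
)
```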
Benchmarking
Once you are confident that your backends give the correct answers, you probably want to compare their performance. This is made easy by the `benchmark_differentiation` function, whose syntax should feel familiar:
```julia
df = benchmark_differentiation(backends, scenarios);
```
| Row | backend | scenario | operator | prepared | calls | samples | evals | time | allocs | bytes | gc_fraction | compile_fraction |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Abstract… | Scenario… | Symbol | Bool | Int64 | Int64 | Int64 | Float64 | Float64 | Float64 | Float64 | Float64 |
| 1 | AutoForwardDiff() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | value_and_gradient | true | 1 | 27248 | 510 | 4.99353e-8 | 3.0 | 112.0 | 0.0 | 0.0 |
| 2 | AutoForwardDiff() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | gradient | true | 1 | 34129 | 508 | 4.11791e-8 | 2.0 | 80.0 | 0.0 | 0.0 |
| 3 | AutoForwardDiff() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | value_and_gradient | true | 1 | 22320 | 205 | 1.26434e-7 | 4.0 | 192.0 | 0.0 | 0.0 |
| 4 | AutoForwardDiff() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | gradient | true | 1 | 38790 | 171 | 1.1407e-7 | 3.0 | 160.0 | 0.0 | 0.0 |
| 5 | AutoZygote() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | value_and_gradient | true | 1 | 28493 | 36 | 7.66417e-7 | 25.0 | 688.0 | 0.0 | 0.0 |
| 6 | AutoZygote() | Scenario{:gradient,:out} f : Vector{Float32} -> Float32 | gradient | true | 1 | 35447 | 37 | 5.88378e-7 | 23.0 | 624.0 | 0.0 | 0.0 |
| 7 | AutoZygote() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | value_and_gradient | true | 1 | 27937 | 92 | 2.93924e-7 | 10.0 | 464.0 | 0.0 | 0.0 |
| 8 | AutoZygote() | Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 | gradient | true | 1 | 32962 | 81 | 2.83864e-7 | 10.0 | 464.0 | 0.0 | 0.0 |
The resulting object is a `DataFrame` from DataFrames.jl, whose columns correspond to the fields of `DifferentiationBenchmarkDataRow`.
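Since `df` is an ordinary `DataFrame`, the usual DataFrames.jl tooling applies. For instance (an illustrative query, not from the original tutorial), you can isolate the pure `gradient` rows and rank the backends by runtime:

```julia
using DataFrames

# Keep only the rows for the plain gradient operator, then sort by time.
grad_rows = sort(df[df.operator .== :gradient, :], :time)

# Inspect the columns most relevant to a performance comparison.
grad_rows[:, [:backend, :operator, :time, :allocs]]
```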