Tutorial

We present a typical workflow with DifferentiationInterfaceTest.jl, building on the tutorial in the DifferentiationInterface.jl documentation (which we encourage you to read first).

julia> using DifferentiationInterface, DifferentiationInterfaceTest
julia> using ForwardDiff: ForwardDiff
julia> using Zygote: Zygote

Introduction

The AD backends we want to compare are ForwardDiff.jl and Zygote.jl.

backends = [AutoForwardDiff(), AutoZygote()]
2-element Vector{ADTypes.AbstractADType}:
 AutoForwardDiff()
 AutoZygote()
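
Before running any tests, you can optionally verify that each backend is functional. This sketch assumes the check_available utility from DifferentiationInterface.jl, which returns true when the backend's package is loaded and usable:

all(DifferentiationInterface.check_available, backends)  # expect true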

To compare them, we will take gradients of a simple function:

f(x::AbstractArray) = sum(sin, x)
f (generic function with 1 method)

Of course we know the true gradient mapping:

∇f(x::AbstractArray) = cos.(x)
∇f (generic function with 1 method)
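
As a quick sanity check (a sketch, not required for the rest of the tutorial), you can compare this reference against one backend using the gradient operator from DifferentiationInterface.jl. The test point xtest below is arbitrary:

xtest = rand(Float32, 3)
gradient(f, AutoForwardDiff(), xtest) ≈ ∇f(xtest)  # expect true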

DifferentiationInterfaceTest.jl relies on so-called Scenarios, which encapsulate the information needed for each test:

  • the operator category (here :gradient)
  • the behavior of the operator (either :in or :out of place; an in-place sketch follows the example below)
  • the function f
  • the input x of the function f (and possible tangents or contexts)
  • the reference first-order result res1 (and possible second-order result res2) of the operator
  • the arguments prep_args passed during preparation

xv = rand(Float32, 3)
xm = rand(Float64, 3, 2)
scenarios = [
    Scenario{:gradient,:out}(f, xv; res1=∇f(xv)),
    Scenario{:gradient,:out}(f, xm; res1=∇f(xm)),
];
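
For completeness, here is what the in-place flavor mentioned above could look like. This is an illustrative sketch, not used in the rest of the tutorial; the in-place operator overwrites a pre-allocated gradient instead of returning a new one:

# illustrative in-place scenario (not part of the tests below)
scenario_in = Scenario{:gradient,:in}(f, xv; res1=∇f(xv))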

Testing

The main entry point for testing is the function test_differentiation. It has many options, but the main ingredients are the following:

julia> test_differentiation(
           backends,  # the backends you want to compare
           scenarios;  # the scenarios you defined
           correctness=true,  # compares values against the reference
           type_stability=:none,  # disables type stability checks (otherwise done with JET.jl)
           detailed=true,  # prints a detailed test set
       )
Test Summary:                                                 | Pass  Total  Time
Testing correctness                                           |   88     88  8.4s
  AutoForwardDiff()                                           |   44     44  3.7s
    gradient                                                  |   44     44  3.6s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32 |   22     22  2.2s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 |   22     22  1.2s
  AutoZygote()                                                |   44     44  4.6s
    gradient                                                  |   44     44  4.6s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32 |   22     22  3.8s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 |   22     22  0.8s
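
Setting type_stability to a value other than :none turns on JET.jl checks; in recent versions the accepted settings include :prepared (check prepared operators only) and :full, but consult the docstring of test_differentiation for your installed version. Note that Zygote.jl is generally not type-stable, so such a check is best reserved for backends like ForwardDiff.jl:

julia> test_differentiation(
           AutoForwardDiff(),
           scenarios;
           correctness=true,
           type_stability=:prepared,  # JET.jl checks on prepared operators (assumed setting)
       )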

Benchmarking

Once you are confident that your backends give the correct answers, you probably want to compare their performance. This is made easy by the benchmark_differentiation function, whose syntax should feel familiar:

df = benchmark_differentiation(backends, scenarios);
8×12 DataFrame
 Row │ backend            scenario                                                  operator            prepared  calls  samples  evals  time        allocs  bytes   gc_fraction  compile_fraction
     │ Abstract…          Scenario…                                                 Symbol              Bool      Int64  Int64    Int64  Float64     Float64  Float64  Float64      Float64
─────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Vector{Float32} -> Float32  value_and_gradient  true          1    27151    515  4.93165e-8     3.0    112.0          0.0               0.0
   2 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Vector{Float32} -> Float32  gradient            true          1    27097    643  4.03079e-8     2.0     80.0          0.0               0.0
   3 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  value_and_gradient  true          1    41976    109  1.6122e-7      3.0    160.0          0.0               0.0
   4 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  gradient            true          1    27362    277  9.70397e-8     2.0    128.0          0.0               0.0
   5 │ AutoZygote()       Scenario{:gradient,:out} f : Vector{Float32} -> Float32  value_and_gradient  true          1    27666     33  7.81455e-7    25.0    688.0          0.0               0.0
   6 │ AutoZygote()       Scenario{:gradient,:out} f : Vector{Float32} -> Float32  gradient            true          1    28717     43  5.96465e-7    23.0    624.0          0.0               0.0
   7 │ AutoZygote()       Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  value_and_gradient  true          1    27593     26  1.00146e-6    29.0   1040.0          0.0               0.0
   8 │ AutoZygote()       Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  gradient            true          1    27909     31  8.07e-7       27.0    976.0          0.0               0.0

The resulting object is a DataFrame from DataFrames.jl, whose columns correspond to the fields of DifferentiationBenchmarkDataRow.
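
Since df is a plain DataFrame, the usual DataFrames.jl tooling applies. As a minimal sketch (assuming DataFrames.jl is installed), here is how you could keep only the gradient rows and rank backends from fastest to slowest:

using DataFrames
# select the prepared `gradient` rows and sort them by runtime
perf = sort(
    df[df.operator .== :gradient, [:backend, :scenario, :time, :allocs]],
    :time,  # fastest first
)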