Tutorial

We present a typical workflow with DifferentiationInterfaceTest.jl, building on the tutorial of the DifferentiationInterface.jl documentation (which we encourage you to read first).

julia> using DifferentiationInterface, DifferentiationInterfaceTest
julia> import ForwardDiff, Zygote

Introduction

The AD backends we want to compare are ForwardDiff.jl and Zygote.jl.

backends = [AutoForwardDiff(), AutoZygote()]
2-element Vector{ADTypes.AbstractADType}:
 AutoForwardDiff()
 AutoZygote()

To do that, we are going to take gradients of a simple function:

f(x::AbstractArray) = sum(sin, x)
f (generic function with 1 method)

Of course we know the true gradient mapping:

∇f(x::AbstractArray) = cos.(x)
∇f (generic function with 1 method)
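Before setting up scenarios, we can sanity-check by hand that both backends recover ∇f, using the gradient operator from DifferentiationInterface.jl (a quick sketch; the ≈ comparisons are expected to hold but are not part of the test suite):

x = rand(Float32, 3)
# compare each backend's gradient against the analytical one
gradient(f, AutoForwardDiff(), x) ≈ ∇f(x)
gradient(f, AutoZygote(), x) ≈ ∇f(x)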

DifferentiationInterfaceTest.jl relies on so-called "scenarios", which encapsulate the information needed for your test:

  • the operator category (:gradient)
  • the behavior of the operator (either :in or :out of place)
  • the function f
  • the input x of the function f
  • the reference first-order result res1 of the operator

xv = rand(Float32, 3)
xm = rand(Float64, 3, 2)
scenarios = [
    Scenario{:gradient,:out}(f, xv; res1=∇f(xv)),
    Scenario{:gradient,:out}(f, xm; res1=∇f(xm))
];
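
Scenarios can also describe in-place operators. As an illustrative sketch (assuming the same constructor keywords as above), a gradient that overwrites its output argument would be declared with :in instead of :out:

scenarios_inplace = [
    # same function and reference result, but the operator mutates its output
    Scenario{:gradient,:in}(f, xv; res1=∇f(xv))
];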

Testing

The main entry point for testing is the function test_differentiation. It has many options, but the main ingredients are the following:

julia> test_differentiation(
           backends,  # the backends you want to compare
           scenarios,  # the scenarios you defined
           correctness=true,  # compares values against the reference
           type_stability=:none,  # skips type stability checks (otherwise done with JET.jl)
           detailed=true,  # prints a detailed test set
       )
Test Summary:                                                 | Pass  Total  Time
Testing correctness                                           |   88     88  9.0s
  AutoForwardDiff()                                           |   44     44  5.6s
    gradient                                                  |   44     44  5.6s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32 |   22     22  3.4s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 |   22     22  2.1s
  AutoZygote()                                                |   44     44  3.3s
    gradient                                                  |   44     44  3.3s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32 |   22     22  2.6s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 |   22     22  0.7s

Benchmarking

Once you are confident that your backends give the correct answers, you probably want to compare their performance. This is made easy by the benchmark_differentiation function, whose syntax should feel familiar:

df = benchmark_differentiation(backends, scenarios);
8×12 DataFrame
 Row  backend            scenario                                                  operator            prepared  calls  samples  evals  time        allocs  bytes  gc_fraction  compile_fraction
      Abstract…          Scenario…                                                 Symbol              Bool      Int64  Int64    Int64  Float64     Float64  Float64  Float64   Float64
   1  AutoForwardDiff()  Scenario{:gradient,:out} f : Vector{Float32} -> Float32   value_and_gradient  true      1      52733    199    5.47739e-8  3.0     112.0  0.0          0.0
   2  AutoForwardDiff()  Scenario{:gradient,:out} f : Vector{Float32} -> Float32   gradient            true      1      29245    619    4.51567e-8  2.0     80.0   0.0          0.0
   3  AutoForwardDiff()  Scenario{:gradient,:out} f : Matrix{Float64} -> Float64   value_and_gradient  true      1      28471    205    1.3982e-7   4.0     192.0  0.0          0.0
   4  AutoForwardDiff()  Scenario{:gradient,:out} f : Matrix{Float64} -> Float64   gradient            true      1      33181    197    1.26426e-7  3.0     160.0  0.0          0.0
   5  AutoZygote()       Scenario{:gradient,:out} f : Vector{Float32} -> Float32   value_and_gradient  true      1      29506    34     8.25941e-7  24.0    672.0  0.0          0.0
   6  AutoZygote()       Scenario{:gradient,:out} f : Vector{Float32} -> Float32   gradient            true      1      30013    45     6.26044e-7  22.0    608.0  0.0          0.0
   7  AutoZygote()       Scenario{:gradient,:out} f : Matrix{Float64} -> Float64   value_and_gradient  true      1      29449    83     3.37976e-7  10.0    464.0  0.0          0.0
   8  AutoZygote()       Scenario{:gradient,:out} f : Matrix{Float64} -> Float64   gradient            true      1      29181    87     3.23943e-7  10.0    464.0  0.0          0.0

The resulting object is a DataFrame from DataFrames.jl, whose columns correspond to the fields of DifferentiationBenchmarkDataRow:
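Since df is an ordinary DataFrame, you can post-process it with standard DataFrames.jl functions. For instance (a sketch; the column names are those shown in the table above, and :best_time is just an illustrative output name), summarizing the fastest run per backend and operator:

using DataFrames
# group the benchmark rows and keep the minimum time in each group
combine(groupby(df, [:backend, :operator]), :time => minimum => :best_time)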