Tutorial

We present a typical workflow with DifferentiationInterfaceTest.jl, building on the tutorial in the DifferentiationInterface.jl documentation (which we encourage you to read first).

julia> import Chairmarks
julia> using DataFrames
julia> using DifferentiationInterface, DifferentiationInterfaceTest
julia> using ForwardDiff: ForwardDiff
julia> using Zygote: Zygote

Introduction

The AD backends we want to compare are ForwardDiff.jl and Zygote.jl.

backends = [AutoForwardDiff(), AutoZygote()]
2-element Vector{ADTypes.AbstractADType}:
 AutoForwardDiff()
 AutoZygote()

To do that, we are going to take gradients of a simple function:

f(x::AbstractArray) = sum(sin, x)
f (generic function with 1 method)

Of course we know the true gradient mapping:

∇f(x::AbstractArray) = cos.(x)
∇f (generic function with 1 method)
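Since we know the exact gradient, we can sanity-check it against one backend before defining any scenarios. The snippet below is a quick manual check (not part of the testing workflow itself), using the gradient operator from DifferentiationInterface.jl:

```julia
# Compare the AD gradient with the analytical gradient on a random input.
x = rand(Float32, 3)
gradient(f, AutoForwardDiff(), x) ≈ ∇f(x)  # should hold for both backends
```

The same check works with AutoZygote() in place of AutoForwardDiff().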

DifferentiationInterfaceTest.jl relies on so-called Scenarios, which encapsulate the information needed for your test:

  • the operator category (here :gradient)
  • the behavior of the operator (either :in or :out of place)
  • the function f
  • the input x of the function f (and possible tangents or contexts)
  • the reference first-order result res1 (and possible second-order result res2) of the operator
  • the arguments prep_args passed during preparation

xv = rand(Float32, 3)
xm = rand(Float64, 3, 2)
scenarios = [
    Scenario{:gradient,:out}(f, xv; res1=∇f(xv)),
    Scenario{:gradient,:out}(f, xm; res1=∇f(xm)),
];
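The scenarios above test the out-of-place operator. If you also want to exercise the in-place variant (gradient!, which writes into a preallocated array), you can declare scenarios with the :in behavior instead; a sketch reusing the same function and inputs:

```julia
# In-place scenarios test the mutating operator variants such as gradient!.
scenarios_inplace = [
    Scenario{:gradient,:in}(f, xv; res1=∇f(xv)),
    Scenario{:gradient,:in}(f, xm; res1=∇f(xm)),
];
```

Both kinds of scenarios can be concatenated into a single vector and passed together to the functions below.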

Testing

The main entry point for testing is the function test_differentiation. It has many options, but the main ingredients are the following:

julia> test_differentiation(
           backends,  # the backends you want to compare
           scenarios;  # the scenarios you defined
           correctness=true,  # compares values against the reference
           type_stability=:none,  # no type stability checks (otherwise performed with JET.jl)
           detailed=true,  # prints a detailed test set
       )
Test Summary:                                                 | Pass  Total  Time
Testing correctness                                           |   88     88  9.4s
  AutoForwardDiff()                                           |   44     44  4.1s
    gradient                                                  |   44     44  4.1s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32 |   22     22  2.5s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 |   22     22  1.4s
  AutoZygote()                                                |   44     44  5.2s
    gradient                                                  |   44     44  5.1s
      Scenario{:gradient,:out} f : Vector{Float32} -> Float32 |   22     22  4.2s
      Scenario{:gradient,:out} f : Matrix{Float64} -> Float64 |   22     22  0.9s

Benchmarking

Once you are confident that your backends give the correct answers, you probably want to compare their performance. This is made easy by the benchmark_differentiation function, whose syntax should feel familiar:

table = benchmark_differentiation(backends, scenarios);

The resulting object is a table, which can easily be converted into a DataFrame from DataFrames.jl. Its columns correspond to the fields of DifferentiationBenchmarkDataRow.

df = DataFrame(table)
8×12 DataFrame
 Row │ backend            scenario                                                  operator            prepared  calls  samples  evals  time        allocs   bytes    gc_fraction  compile_fraction
     │ Abstract…          Scenario…                                                 Symbol              Bool      Int64  Int64    Int64  Float64     Float64  Float64  Float64      Float64
─────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Vector{Float32} -> Float32  value_and_gradient      true      1    18923    531  5.09605e-8      3.0    112.0          0.0               0.0
   2 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Vector{Float32} -> Float32  gradient                true      1    37066    484  4.37169e-8      2.0     80.0          0.0               0.0
   3 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  value_and_gradient      true      1    28790    229  1.24515e-7      3.0    160.0          0.0               0.0
   4 │ AutoForwardDiff()  Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  gradient                true      1    34274    223  1.08543e-7      2.0    128.0          0.0               0.0
   5 │ AutoZygote()       Scenario{:gradient,:out} f : Vector{Float32} -> Float32  value_and_gradient      true      1    28303     35  8.25829e-7     25.0    688.0          0.0               0.0
   6 │ AutoZygote()       Scenario{:gradient,:out} f : Vector{Float32} -> Float32  gradient                true      1    29441     46  6.12891e-7     23.0    624.0          0.0               0.0
   7 │ AutoZygote()       Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  value_and_gradient      true      1    19513     26  1.06469e-6     29.0   1040.0          0.0               0.0
   8 │ AutoZygote()       Scenario{:gradient,:out} f : Matrix{Float64} -> Float64  gradient                true      1    25126     32  8.6475e-7      27.0    976.0          0.0               0.0
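Once in DataFrame form, the results can be post-processed with the usual DataFrames.jl tools. For instance (a sketch assuming the df built above), we can keep only the pure gradient rows and rank backends by measured runtime:

```julia
# Select the :gradient rows and sort them by time (fastest first).
gradient_df = df[df.operator .== :gradient, [:backend, :scenario, :time, :allocs]]
sort!(gradient_df, :time)
```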