DifferentiationInterface


| Package                      | Docs        |
|:-----------------------------|:------------|
| DifferentiationInterface     | Stable, Dev |
| DifferentiationInterfaceTest | Stable, Dev |

An interface to various automatic differentiation (AD) backends in Julia.

Goal

This package provides a unified syntax to differentiate functions, including:

  • First- and second-order operators (gradients, Jacobians, Hessians and more)
  • In-place and out-of-place differentiation
  • Preparation mechanism (e.g. to pre-allocate a cache or record a tape)
  • Built-in sparsity handling
  • Thorough validation on standard inputs and outputs (numbers, vectors, matrices)
  • Testing and benchmarking utilities accessible to users with DifferentiationInterfaceTest
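
To illustrate the operator set, here is a brief sketch (assuming ForwardDiff.jl is installed as the backend; any supported backend can be substituted) of first- and second-order calls, both out-of-place and in-place:

```julia
using DifferentiationInterface
import ForwardDiff  # any supported backend works here

f(x) = x .^ 2            # vector-to-vector function, for the Jacobian
g(x) = sum(abs2, x)      # vector-to-scalar function, for gradient and Hessian

x = [1.0, 2.0]
backend = AutoForwardDiff()

J = jacobian(f, backend, x)   # first-order, out-of-place
H = hessian(g, backend, x)    # second-order, out-of-place

# in-place variant: write the gradient into a pre-allocated array
grad = similar(x)
gradient!(g, grad, backend, x)
```

The same syntax works across backends: only the `backend` argument changes.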

Compatibility

We support the automatic differentiation backends defined by ADTypes.jl.

Note that in some cases, going through DifferentiationInterface.jl might be slower than a direct call to the backend's API. This is mostly true for Enzyme.jl, whose handling of activities and multiple arguments unlocks additional performance. We are working on this challenge, and welcome any suggestions or contributions. Meanwhile, if differentiation fails or takes too long, consider using Enzyme.jl directly.

Installation

To install the stable version of the package, run the following code in a Julia REPL:

using Pkg

Pkg.add("DifferentiationInterface")

To install the development version, run this instead:

using Pkg

Pkg.add(
    url="https://github.com/JuliaDiff/DifferentiationInterface.jl",
    subdir="DifferentiationInterface"
)

Example

using DifferentiationInterface
import ForwardDiff, Enzyme, Zygote  # AD backends you want to use 

f(x) = sum(abs2, x)

x = [1.0, 2.0]

value_and_gradient(f, AutoForwardDiff(), x) # returns (5.0, [2.0, 4.0]) with ForwardDiff.jl
value_and_gradient(f, AutoEnzyme(),      x) # returns (5.0, [2.0, 4.0]) with Enzyme.jl
value_and_gradient(f, AutoZygote(),      x) # returns (5.0, [2.0, 4.0]) with Zygote.jl

To improve performance by up to several orders of magnitude compared to this example, take a look at the tutorial and its section on operator preparation.
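
Concretely, preparation looks roughly like the sketch below (same setup as the example above; refer to the tutorial for the exact API). The idea is to pay the setup cost once, then reuse it for every subsequent call with inputs of the same type and size:

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
x = [1.0, 2.0]
backend = AutoForwardDiff()

# prepare once: pre-allocates caches (or records a tape, depending on the backend)
prep = prepare_gradient(f, backend, zero(x))

# reuse the preparation object across many calls, writing into a pre-allocated array
grad = similar(x)
y, _ = value_and_gradient!(f, grad, prep, backend, x)
```

Note that a preparation object is tied to the input type and size it was created with, and should not be reused for inputs of a different shape.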

Citation

Whenever you refer to this package or the ideas it contains, please cite:

  1. our preprint A Common Interface for Automatic Differentiation;
  2. our inspiration AbstractDifferentiation.jl.

You can use the provided CITATION.cff file or the following BibTeX entries:

@misc{dalle2025commoninterfaceautomaticdifferentiation,
      title={A Common Interface for Automatic Differentiation}, 
      author={Guillaume Dalle and Adrian Hill},
      year={2025},
      eprint={2505.05542},
      archivePrefix={arXiv},
      primaryClass={cs.MS},
      url={https://arxiv.org/abs/2505.05542}, 
}

@misc{schäfer2022abstractdifferentiationjlbackendagnosticdifferentiableprogramming,
      title={AbstractDifferentiation.jl: Backend-Agnostic Differentiable Programming in Julia}, 
      author={Frank Schäfer and Mohamed Tarek and Lyndon White and Chris Rackauckas},
      year={2022},
      eprint={2109.12449},
      archivePrefix={arXiv},
      primaryClass={cs.MS},
      url={https://arxiv.org/abs/2109.12449}, 
}

If you use the software, additionally cite us using the precise Zenodo DOI of the package version you used, or the BibTeX entry below:

@software{dalleDifferentiationInterface2025,
      author={Dalle, Guillaume and Hill, Adrian},
      title={Differentiation{I}nterface.jl},
      year={2024},
      publisher={Zenodo},
      doi={10.5281/zenodo.11092033},
      url={https://doi.org/10.5281/zenodo.11092033},
}