# ForwardDiff.jl
ForwardDiff implements methods to take derivatives, gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object) using forward-mode automatic differentiation (AD).
While performance can vary depending on the functions being evaluated, the algorithms implemented by ForwardDiff generally outperform non-AD approaches in both speed and accuracy.
The Wikipedia page on automatic differentiation is a useful resource for learning about the advantages of AD over other common differentiation methods, such as finite differencing.
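As a quick taste of the API described below, here is a minimal sketch using `ForwardDiff.derivative` on a scalar function; the result can be checked against the analytic derivative:

```julia
using ForwardDiff

# A scalar function f : ℝ → ℝ
f(x) = sin(x) + x^2

# Forward-mode AD computes the derivative at a point
df = ForwardDiff.derivative(f, 1.0)

# Agrees with the analytic derivative f'(x) = cos(x) + 2x
df ≈ cos(1.0) + 2.0   # true
```

Unlike finite differencing, the result is exact up to floating-point roundoff, since AD propagates derivative values through each elementary operation rather than approximating with a step size.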
## User Documentation
- Installation and Version Requirements
- Limitations of ForwardDiff
- Basic ForwardDiff API
- Derivatives of \(f(x) : \mathbb{R} \to \mathbb{R}^{n_1} \times \dots \times \mathbb{R}^{n_k}\)
- Gradients of \(f(x) : \mathbb{R}^{n_1} \times \dots \times \mathbb{R}^{n_k} \to \mathbb{R}\)
- Jacobians of \(f(x) : \mathbb{R}^{n_1} \times \dots \times \mathbb{R}^{n_k} \to \mathbb{R}^{m_1} \times \dots \times \mathbb{R}^{m_k}\)
- Hessians of \(f(x) : \mathbb{R}^{n_1} \times \dots \times \mathbb{R}^{n_k} \to \mathbb{R}\)
- The `AbstractConfig` Types
- Advanced Usage Guide
- Upgrading from Older Versions of ForwardDiff
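The sections above cover the four main entry points (`derivative`, `gradient`, `jacobian`, `hessian`) and the `AbstractConfig` types used to preallocate work buffers. A minimal sketch of how these fit together:

```julia
using ForwardDiff

# f : ℝⁿ → ℝ, so it has a gradient and a Hessian
f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

g = ForwardDiff.gradient(f, x)   # ∇f(x) = 2x = [2.0, 4.0, 6.0]
H = ForwardDiff.hessian(f, x)    # ∇²f(x) = 2I (3×3)

# g : ℝⁿ → ℝⁿ, so it has a Jacobian
J = ForwardDiff.jacobian(y -> y .^ 2, x)  # diagonal, entries 2x

# Reusing a config avoids reallocating internal buffers across calls
cfg = ForwardDiff.GradientConfig(f, x)
g2 = ForwardDiff.gradient(f, x, cfg)      # same result as g
```

Passing a preconstructed config is optional; it matters mainly when differentiating the same function many times in a hot loop.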
## Developer Documentation
## Publications
If you find ForwardDiff useful in your work, we kindly request that you cite the following paper:
@article{RevelsLubinPapamarkou2016,
    title = {Forward-Mode Automatic Differentiation in Julia},
    author = {{Revels}, J. and {Lubin}, M. and {Papamarkou}, T.},
    journal = {arXiv:1607.07892 [cs.MS]},
    year = {2016},
    url = {https://arxiv.org/abs/1607.07892}
}