Automatic differentiation in machine learning: a survey


Baydin, Atilim Gunes and Pearlmutter, Barak A. and Radul, Alexey Andreyevich and Siskind, Jeffrey Mark (2015) Automatic differentiation in machine learning: a survey. Working Paper. arXiv.





Abstract

Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD) is a technique for calculating derivatives of numeric functions expressed as computer programs efficiently and accurately, used in fields such as computational fluid dynamics, nuclear engineering, and atmospheric sciences. Despite its advantages and use in other fields, machine learning practitioners have been little influenced by AD and make scant use of available tools. We survey the intersection of AD and machine learning, cover applications where AD has the potential to make a big impact, and report on some recent developments in the adoption of this technique. We aim to dispel some misconceptions that we contend have impeded the use of AD within the machine learning community.
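As a minimal illustration of the technique the abstract describes, forward-mode AD can be implemented with dual numbers: each value carries a tangent alongside it, and arithmetic operators propagate derivatives by the chain rule. The sketch below is illustrative only and is not code from the paper; the `Dual` class and `sin` wrapper are hypothetical names chosen for this example.

```python
import math

class Dual:
    """A number carrying both a value and its derivative (tangent)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x*sin(x) + x at x = 2 by seeding the tangent to 1.
x = Dual(2.0, 1.0)
y = x * sin(x) + x
# y.val is f(2); y.dot is f'(2) = sin(2) + 2*cos(2) + 1,
# exact to machine precision -- no truncation error as in finite differences.
```

This captures the abstract's point that AD computes exact derivatives of a numeric program, rather than symbolic expressions or finite-difference approximations.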

Item Type: Monograph (Working Paper)
Keywords: Optimization; Gradient methods; Backpropagation
Academic Unit: Faculty of Science and Engineering > Computer Science
Item ID: 6275
Identification Number: arXiv:1502.05767
Depositing User: Barak Pearlmutter
Date Deposited: 21 Jul 2015 14:43
Publisher: arXiv
URI:
