MURAL - Maynooth University Research Archive Library



    Automatic Differentiation in Machine Learning: a Survey


    Baydin, Atilim Gunes and Pearlmutter, Barak A. and Radul, Alexey Andreyevich and Siskind, Jeffrey Mark (2018) Automatic Differentiation in Machine Learning: a Survey. Journal of Machine Learning Research, 18. pp. 1-43. ISSN 1532-4435

    Download (624kB)



    Abstract

    Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD), also called algorithmic differentiation or simply “auto-diff”, is a family of techniques similar to but more general than backpropagation for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. AD is a small but established field with applications in areas including computational fluid dynamics, atmospheric sciences, and engineering design optimization. Until very recently, the fields of machine learning and AD have largely been unaware of each other and, in some cases, have independently discovered each other’s results. Despite its relevance, general-purpose AD has been missing from the machine learning toolbox, a situation slowly changing with its ongoing adoption under the names “dynamic computational graphs” and “differentiable programming”. We survey the intersection of AD and machine learning, cover applications where AD has direct relevance, and address the main implementation techniques. By precisely defining the main differentiation techniques and their interrelationships, we aim to bring clarity to the usage of the terms “autodiff”, “automatic differentiation”, and “symbolic differentiation” as these are encountered more and more in machine learning settings.
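
    To illustrate the idea of evaluating derivatives of a function expressed as a computer program, the following minimal sketch (not taken from the paper) shows forward-mode AD via dual numbers in Python: each value carries its derivative, and the chain rule is applied operation by operation, so one evaluation of f yields both f(x) and f'(x). The class and function names here are illustrative assumptions, not an API defined by the survey.

        import math

        class Dual:
            """A number paired with its derivative (tangent)."""
            def __init__(self, value, deriv=0.0):
                self.value = value    # primal value
                self.deriv = deriv    # derivative w.r.t. the seeded input

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # product rule: (uv)' = u'v + uv'
                return Dual(self.value * other.value,
                            self.deriv * other.value + self.value * other.deriv)

            __rmul__ = __mul__

        def sin(x):
            # chain rule for sin: d/dx sin(u) = cos(u) * u'
            return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

        def f(x):
            # an ordinary numeric program: f(x) = x * sin(x) + x
            return x * sin(x) + x

        x = Dual(2.0, 1.0)          # seed the derivative dx/dx = 1
        y = f(x)
        print(y.value, y.deriv)     # f(2) and f'(2) = sin(2) + 2*cos(2) + 1

    Reverse-mode AD (of which backpropagation is a special case) instead records the operations and propagates sensitivities backwards, which is more efficient when a function has many inputs and few outputs, as is typical for machine-learning loss functions.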

    Item Type: Article
    Additional Information: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided at http://jmlr.org/papers/v18/17-468.h
    Keywords: Backpropagation; Differentiable Programming
    Academic Unit: Faculty of Arts, Celtic Studies and Philosophy > Philosophy
    Item ID: 10227
    Depositing User: Barak Pearlmutter
    Date Deposited: 19 Nov 2018 15:39
    Journal or Publication Title: Journal of Machine Learning Research
    Publisher: Microtome Publishing
    Refereed: Yes
    Funders: Science Foundation Ireland (SFI), Army Research Laboratory, National Science Foundation, Intelligence Advanced Research Projects Activity (IARPA)
    URI:
    Use Licence: This item is available under a Creative Commons Attribution Non Commercial Share Alike Licence (CC BY-NC-SA). Details of this licence are available here

