MURAL - Maynooth University Research Archive Library



    Automatic Learning Rate Maximization by On-Line Estimation of the Hessian's Eigenvectors


    LeCun, Yann and Simard, Patrice Y. and Pearlmutter, Barak A. (1993) Automatic Learning Rate Maximization by On-Line Estimation of the Hessian's Eigenvectors. In: Advances in Neural Information Processing Systems 6: Proceedings of the Annual Conference on Advances in Neural Information Processing Systems 1993. Neural Information Processing Systems (NIPS). ISBN 9781558603226

    Download (152kB)




    Abstract

    We propose a very simple and well-principled way of computing the optimal step size in gradient descent algorithms. The on-line version is very efficient computationally, and is applicable to large backpropagation networks trained on large data sets. The main ingredient is a technique for estimating the principal eigenvalue(s) and eigenvector(s) of the objective function's second derivative matrix (Hessian), which does not even require calculating the Hessian. Several other applications of this technique are proposed for speeding up learning, or for eliminating useless parameters.
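    To make the key idea concrete, the following is a minimal sketch (not the paper's exact on-line recipe) of estimating the Hessian's principal eigenvalue and eigenvector by power iteration on finite-difference Hessian-vector products, so the Hessian itself is never formed. The function and variable names (e.g. estimate_top_hessian_eigenpair, grad_fn) are illustrative assumptions; the resulting eigenvalue estimate would then set the usable step size, roughly learning rate ~ 1 / eigenvalue.

        import numpy as np

        def estimate_top_hessian_eigenpair(grad_fn, w, n_iters=100, alpha=1e-4, seed=0):
            # Sketch: power iteration using finite-difference Hessian-vector products,
            #   H v  ~  (grad(w + alpha * v) - grad(w)) / alpha,
            # so only gradient evaluations are needed, never the Hessian itself.
            rng = np.random.default_rng(seed)
            v = rng.standard_normal(w.shape)
            v /= np.linalg.norm(v)
            g0 = grad_fn(w)                      # gradient at the current parameters
            eigval = 0.0
            for _ in range(n_iters):
                hv = (grad_fn(w + alpha * v) - g0) / alpha   # approximate H v
                eigval = np.linalg.norm(hv)                  # current estimate of |lambda_max|
                v = hv / (eigval + 1e-12)                    # renormalized eigenvector estimate
            return eigval, v

        # Toy check on a quadratic loss 0.5 * w^T A w, whose Hessian is A:
        A = np.diag([5.0, 2.0, 1.0])
        lam, vec = estimate_top_hessian_eigenpair(lambda w: A @ w, np.ones(3), n_iters=50)
        # lam converges to 5.0; a step size of roughly 1 / lam would then be a safe gradient-descent rate.

    In the on-line setting described in the abstract, the same idea is applied with a running average over stochastic gradient evaluations rather than a fixed loop over the full objective; that refinement is omitted from this sketch.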

    Item Type: Book Section
    Keywords: Automatic Learning; Rate Maximization; On-Line Estimation; Hessian's Eigenvectors
    Academic Unit: Faculty of Science and Engineering > Computer Science
    Item ID: 8137
    Depositing User: Barak Pearlmutter
    Date Deposited: 07 Apr 2017 15:34
    Publisher: Neural Information Processing Systems (NIPS)
    Refereed: Yes
    URI:
    Use Licence: This item is available under a Creative Commons Attribution-NonCommercial-ShareAlike licence (CC BY-NC-SA).

