Backpropagation

Theory, Architectures, and Applications

Edited by Yves Chauvin, David E. Rumelhart

Psychology Press – 1995 – 576 pages

Series: Developments in Connectionist Theory Series

Purchasing Options:

  • Paperback: $98.95
    ISBN 978-0-8058-1259-6
    February 1st 1995
  • Hardback: $180.00
    ISBN 978-0-8058-1258-9
    February 1st 1995

Description

Composed of three sections, this book presents the most popular training algorithm for neural networks: backpropagation. The first section presents the theory and principles behind backpropagation as seen from different perspectives such as statistics, machine learning, and dynamical systems. The second presents a number of network architectures designed to match the general concepts of Parallel Distributed Processing with backpropagation learning. Finally, the third section shows how these principles can be applied to a number of fields related to the cognitive sciences, including control, speech recognition, robotics, image processing, and cognitive psychology. The volume is designed to provide both a solid theoretical foundation and a set of examples that show the versatility of the concepts. Useful to experts in the field, it should also be most helpful to students seeking to understand the basic principles of connectionist learning and to engineers wanting to add neural networks in general -- and backpropagation in particular -- to their set of problem-solving methods.
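For readers new to the algorithm the book centers on, the sketch below (not drawn from the book itself) shows backpropagation in its simplest form: a single hidden-layer network trained on XOR with plain NumPy gradient descent. The layer sizes, learning rate, and sigmoid activation are illustrative assumptions, not choices taken from any chapter.

```python
# Minimal backpropagation sketch: a 2-4-1 sigmoid network trained on XOR.
# All hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases for a 2-4-1 network
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

lr = 0.5
for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: gradients of the squared error w.r.t. each parameter
    err = out - y
    d_out = err * out * (1 - out)          # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)     # delta propagated back to the hidden layer

    # Gradient-descent updates
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(np.round(out, 3))  # should approach [[0], [1], [1], [0]]
```

The chapters in the volume build on exactly this gradient computation, extending it to recurrent networks, temporal patterns, control, and unsupervised settings.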

Contents

  • D.E. Rumelhart, R. Durbin, R. Golden, Y. Chauvin, Backpropagation: The Basic Theory.
  • A. Waibel, T. Hanazawa, G. Hinton, K. Shikano, K.J. Lang, Phoneme Recognition Using Time-Delay Neural Networks.
  • C. Schley, Y. Chauvin, V. Henkle, Automated Aircraft Flare and Touchdown Control Using Neural Networks.
  • F.J. Pineda, Recurrent Backpropagation Networks.
  • M.C. Mozer, A Focused Backpropagation Algorithm for Temporal Pattern Recognition.
  • D.H. Nguyen, B. Widrow, Nonlinear Control with Neural Networks.
  • M.I. Jordan, D.E. Rumelhart, Forward Models: Supervised Learning with a Distal Teacher.
  • S.J. Hanson, Backpropagation: Some Comments and Variations.
  • A. Cleeremans, D. Servan-Schreiber, J.L. McClelland, Graded State Machines: The Representation of Temporal Contingencies in Feedback Networks.
  • S. Becker, G.E. Hinton, Spatial Coherence as an Internal Teacher for a Neural Network.
  • J.R. Bachrach, M.C. Mozer, Connectionist Modeling and Control of Finite State Systems Given Partial State Information.
  • P. Baldi, Y. Chauvin, K. Hornik, Backpropagation and Unsupervised Learning in Linear Networks.
  • R.J. Williams, D. Zipser, Gradient-Based Learning Algorithms for Recurrent Networks and Their Computational Complexity.
  • P. Baldi, Y. Chauvin, When Neural Networks Play Sherlock Holmes.
  • P. Baldi, Gradient Descent Learning Algorithms: A Unified Perspective.

Categories: Cognitive Science, Connectionism/Neural Nets