New technologies in engineering, physics and biomedicine are demanding increasingly complex methods of digital signal processing. By presenting the latest research work, the authors demonstrate how real-time recurrent neural networks (RNNs) can be implemented to expand the range of traditional signal processing techniques and to help combat the problem of prediction. Within this text, neural networks are considered as massively interconnected nonlinear adaptive filters. The book:

- Analyses the relationships between RNNs and various nonlinear models and filters, and introduces spatio-temporal architectures together with the concepts of modularity and nesting
- Examines stability and relaxation within RNNs
- Presents on-line learning algorithms for nonlinear adaptive filters and introduces new paradigms which exploit the concepts of a priori and a posteriori errors, data-reusing adaptation, and normalisation
- Studies convergence and stability of on-line learning algorithms based upon optimisation techniques such as contraction mapping and fixed point iteration
- Describes strategies for the exploitation of inherent relationships between parameters in RNNs
- Discusses practical issues such as predictability and nonlinearity detection, and includes several practical applications in areas such as air pollutant modelling and prediction, attractor discovery and chaos, ECG signal processing, and speech processing

Recurrent Neural Networks for Prediction offers new insight into the learning algorithms, architectures and stability of recurrent neural networks and, consequently, will have instant appeal. It provides an extensive background for researchers, academics and postgraduates, enabling them to apply such networks in new applications.
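To give a flavour of the topics listed above, here is a minimal Python sketch of a single recurrent neuron used as a nonlinear adaptive filter for one-step-ahead prediction, trained on-line with an NLMS-style normalised gradient update. This is an illustrative assumption of how such an algorithm might look, not the authors' own code; all names, the test signal, and parameter values are hypothetical.

```python
import numpy as np

# Hypothetical demo: a single recurrent neuron as a nonlinear adaptive
# filter, predicting the next sample of a noisy sinusoid on-line.
rng = np.random.default_rng(0)
signal = np.sin(0.1 * np.arange(500)) + 0.05 * rng.standard_normal(500)

p = 4                    # number of external tap-delay inputs
w = np.zeros(p + 2)      # weights: bias + output feedback + p taps
y_prev = 0.0             # previous output, fed back as an input
mu, eps = 0.5, 1e-6      # learning rate; regulariser for normalisation

for n in range(p, len(signal)):
    x = np.concatenate(([1.0, y_prev], signal[n - p:n]))  # bias, feedback, taps
    y = np.tanh(w @ x)             # neuron output = prediction of signal[n]
    e = signal[n] - y              # a priori prediction error
    g = (1.0 - y**2) * x           # truncated gradient of the output w.r.t. w
    w += mu * e * g / (eps + g @ g)  # normalised (NLMS-style) update
    y_prev = y
```

Note that the recursive dependence of `y_prev` on the weights is neglected here (a truncated-gradient simplification); the book's treatment of recurrent learning algorithms handles such dependencies properly.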
Neural networks consist of interconnected groups of neurons which function as processing units and aim to emulate the operation of the human brain.
Preface.
Introduction.
Fundamentals.
Network Architectures for Prediction.
Activation Functions Used in Neural Networks.
Recurrent Neural Networks Architectures.
Neural Networks as Nonlinear Adaptive Filters.
Stability Issues in RNN Architectures.
Data-Reusing Adaptive Learning Algorithms.
A Class of Normalised Algorithms for Online Training of Recurrent Neural Networks.
Convergence of Online Learning Algorithms in Neural Networks.
Some Practical Considerations of Predictability and Learning Algorithms for Various Signals.
Exploiting Inherent Relationships Between Parameters in Recurrent Neural Networks.
Appendix A: The O Notation and Vector and Matrix Differentiation.
Appendix B: Concepts from the Approximation Theory.
Appendix C: Complex Sigmoid Activation Functions, Holomorphic Mappings and Modular Groups.
Appendix D: Learning Algorithms for RNNs.
Appendix E: Terminology Used in the Field of Neural Networks.
Appendix F: On the A Posteriori Approach in Science and Engineering.
Appendix G: Contraction Mapping Theorems.
Appendix H: Linear GAS Relaxation.
Appendix I: The Main Notions in Stability Theory.
Appendix J: Deseasonalising Time Series.
References.
Index.
Product details
ISBN: 9780471495178
Published: 2001-08-06
Publisher: John Wiley & Sons Inc
Weight: 709 g
Height: 247 mm
Width: 174 mm
Depth: 23 mm
Audience level: UU, UP, P, 05, 06
Language: English
Format: Hardback
Number of pages: 304
About the contributors
Danilo Mandic, of Imperial College London, London, UK, was named a Fellow of the Institute of Electrical and Electronics Engineers in 2013 for contributions to multivariate and nonlinear learning systems.
Jonathon A. Chambers is the co-author, with Danilo Mandic, of Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability, published by Wiley.