
Neural nets and chaotic carriers / Peter Whittle.

By: Whittle, Peter.
Material type: Text
Series: Advances in computer science and engineering. Texts ; v. 5
Publication details: London : Imperial College Press ; Hackensack, NJ : Distributed by World Scientific Pub., ©2010
Edition: 2nd ed.
Description: 1 online resource (xii, 230 pages) : illustrations
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9781848165915
  • 1848165919
Additional physical formats: Print version: Neural nets and chaotic carriers.
DDC classification:
  • 006.32 22
LOC classification:
  • QA76.87 .W46 2010eb
Online resources:
Contents:
  • 1. Introduction and aspirations
  • 2. Optimal statistical procedures: 2.1. The optimisation of actions. 2.2. Effective estimation of state. 2.3. The quadratic/Gaussian case: estimation and certainty equivalence. 2.4. The linear model, in Bayesian and classic versions
  • 3. Linear links and nonlinear knots: The basic neural net: 3.1. Neural calculations: The linear gate and the McCulloch-Pitts net. 3.2. Sigmoid and threshold functions. 3.3. Iteration. 3.4. Neural systems and feedback in continuous time. 3.5. Equilibrium excitation patterns. 3.6. Some special-purpose nets
  • 4. Bifurcations and chaos: 4.1. The Hopf bifurcation. 4.2. Chaos
  • 5. What is a memory? The Hamming and Hopfield nets: 5.1. Associative memories. 5.2. The Hamming net. 5.3. Autoassociation, feedback and storage. 5.4. The Hopfield net. 5.5. Alternative formulations of the Hopfield net
  • 6. Compound and 'spurious' traces: 6.1. Performance and trace structure. 6.2. The recognition of simple traces. 6.3. Inference for compound traces. 6.4. Network realisation of the quantised regression. 6.5. Reliability constraints for the quantised regression. 6.6. Stability constraints for the quantised regression. 6.7. The Hopfield net
  • 7. Preserving plasticity: A Bayesian approach: 7.1. A Bayesian view. 7.2. A robust estimation method. 7.3. Dynamic and neural versions of the algorithm
  • 8. The key task: the fixing of fading data. Conclusions I: 8.1. Fading data, and the need for quantisation. 8.2. The probability-maximising algorithm (PMA). 8.3. Properties of the vector activation function F(z). 8.4. Some special cases. 8.5. The network realisation of the full PMA. 8.6. Neural implementation of the PMA. 8.7. The PMA and the exponential family. 8.8. Conclusions I
  • 9. Performance of the probability-maximising algorithm: 9.1. A general formulation. 9.2. Considerations for reliable inference. 9.3. Performance of the PMA for simple stimuli. 9.4. Compound stimuli: The general pattern. 9.5. Compound stimuli in the Gaussian case
  • 10. Other memories -- other considerations: 10.1. The supervised learning of a linear relation. 10.2. Unsupervised learning: The criterion of economy
Summary: Neural Nets and Chaotic Carriers develops rational principles for the design of associative memories, with a view to applying these principles to models with the irregularly oscillatory operation so evident in biological neural systems, and necessitated by the meaninglessness of absolute signal levels. Design is based on the criterion that an associative memory must be able to cope with "fading data", i.e., to form an inference from the data even as its memory of that data degrades. The resultant net shows striking biological parallels. When these principles are combined with the Freeman specification of a neural oscillator, some remarkable effects emerge. For example, the commonly observed phenomenon of neuronal bursting appears, with gamma-range oscillation modulated by a low-frequency square-wave oscillation (the "escapement oscillation"). Bridging studies and new results on artificial and biological neural networks, the book has a strong research character. It is, on the other hand, accessible to non-specialists thanks to its concise exposition of the basics.
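Note: The summary centres on associative memories of the Hopfield type (chapters 5 and 6 in the contents above). As a rough, hypothetical illustration of that idea only, and not code drawn from the book, the Python sketch below stores a binary pattern in a Hebbian weight matrix and recovers it from a corrupted probe by iterating a sign-threshold update; the function names and the eight-bit example pattern are invented for this illustration.

    import numpy as np

    def store(patterns):
        # Hebbian outer-product rule: W = (1/n) * sum of p p^T, with zero diagonal.
        P = np.asarray(patterns, dtype=float)
        n = P.shape[1]
        W = P.T @ P / n
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, steps=20):
        # Synchronous sign-threshold updates until the state stops changing.
        s = np.sign(np.asarray(probe, dtype=float))
        for _ in range(steps):
            s_new = np.sign(W @ s)
            s_new[s_new == 0] = 1.0   # resolve ties consistently
            if np.array_equal(s_new, s):
                break
            s = s_new
        return s

    # Store one +/-1 pattern and recover it from a probe with one flipped bit.
    memory = [[1, -1, 1, 1, -1, -1, 1, -1]]
    W = store(memory)
    noisy = [1, 1, 1, 1, -1, -1, 1, -1]
    print(recall(W, noisy))   # returns the stored pattern

The "fading data" criterion described in the summary goes beyond this basic recall scheme, so the sketch should be read only as background for the associative-memory vocabulary used above.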
Holdings:
  • Item type: Electronic-Books; Home library: OPJGU Sonepat-Campus; Collection: E-Books; Call number: EBSCO; Status: Available

Includes bibliographical references and index.

Print version record.

eBooks on EBSCOhost (EBSCO eBook Subscription Academic Collection - Worldwide)
