Connectionist Models
Proceedings of the 1990 Summer School
von David S. Touretzky, Jeffrey L. Elman, Terrence J. Sejnowski
Publisher: Elsevier Science & Technology
E-Book / PDF
Copy protection: PDF with watermark

Note: After checkout, a download link is provided immediately. The link can then be opened on a PC, smartphone, or e-book reader.
E-books can be paid for via PayPal. If you would like to pay for e-books by invoice, please contact us.

ISBN: 978-1-4832-1448-1
Published: May 12, 2014
Language: English
Length: 416 pages

Price: €54.95

Description

Connectionist Models contains the proceedings of the 1990 Connectionist Models Summer School held at the University of California, San Diego. The summer school provided a forum for students and faculty to assess the state of the art in connectionist modeling. Topics range from theoretical analysis of networks to empirical investigations of learning algorithms, and include speech and image processing, cognitive psychology, computational neuroscience, and VLSI design.
Comprising 40 chapters, this book begins with an introduction to mean field, Boltzmann, and Hopfield networks, focusing on deterministic Boltzmann learning in networks with asymmetric connectivity; contrastive Hebbian learning in the continuous Hopfield model; and energy minimization and the satisfiability of propositional logic. Mean field networks that learn to discriminate temporally distorted strings are also described. Subsequent sections are devoted to reinforcement learning and genetic learning, followed by temporal processing and modularity. Cognitive modeling and symbol processing as well as VLSI implementation are also discussed.
This monograph will be of interest to both students and academicians concerned with connectionist modeling.



Table of Contents

Part I: Mean Field, Boltzmann, and Hopfield Networks
Deterministic Boltzmann Learning in Networks with Asymmetric Connectivity
Contrastive Hebbian Learning in the Continuous Hopfield Model
Mean Field Networks that Learn to Discriminate Temporally Distorted Strings
Energy Minimization and the Satisfiability of Propositional Logic

Part II: Reinforcement Learning
On the Computational Economics of Reinforcement Learning
Reinforcement Comparison
Learning Algorithms for Networks with Internal and External Feedback

Part III: Genetic Learning
Exploring Adaptive Agency I: Theory and Methods for Simulating the Evolution of Learning
The Evolution of Learning: An Experiment in Genetic Connectionism
Evolving Controls for Unstable Systems

Part IV: Temporal Processing
Back-Propagation, Weight-Elimination and Time Series Prediction
Predicting the Mackey-Glass Timeseries with Cascade-Correlation Learning
Learning in Recurrent Finite Difference Networks
Temporal Backpropagation: An Efficient Algorithm for Finite Impulse Response Neural Networks

Part V: Theory and Analysis
Optimal Dimensionality Reduction Using Hebbian Learning
Basis-Function Trees for Approximation in High-Dimensional Spaces
Effects of Circuit Parameters on Convergence of Trinary Update Back-Propagation
Equivalence Proofs for Multi-Layer Perceptron Classifiers and the Bayesian Discriminant Function
A Local Approach to Optimal Queries

Part VI: Modularity
A Modularization Scheme for Feedforward Networks
A Compositional Connectionist Architecture

Part VII: Cognitive Modeling and Symbol Processing
From Rote Learning to System Building: Acquiring Verb Morphology in Children and Connectionist Nets
Parallel Mapping Circuitry in a Phonological Model
A Modular Neural Network Model of the Acquisition of Category Names in Children
A Computational Model of Attentional Requirements in Sequence Learning
Recall of Sequences of Items by a Neural Network
Binding, Episodic Short-Term Memory, and Selective Attention, or Why are PDP Models Poor at Symbol Manipulation?
Analogical Retrieval Within a Hybrid Spreading-Activation Network
Appropriate Uses of Hybrid Systems
Cognitive Map Construction and Use: A Parallel Distributed Processing Approach

Part VIII: Speech and Vision
Unsupervised Discovery of Speech Segments Using Recurrent Networks
Feature Extraction Using an Unsupervised Neural Network
Motor Control for Speech Skills: A Connectionist Approach
Extracting Features From Faces Using Compression Networks: Face, Identity, Emotion, and Gender Recognition Using Holons
The Development of Topography and Ocular Dominance
On Modeling Some Aspects of Higher Level Vision

Part IX: Biology
Modeling Cortical Area 7a Using Stochastic Real-Valued (SRV) Units
Neuronal Signal Strength is Enhanced by Rhythmic Firing

Part X: VLSI Implementation
An Analog VLSI Neural Network Cocktail Party Processor
A VLSI Neural Network with On-Chip Learning

Index