Connectionism: A Hands-on Approach
John Wiley & Sons, 15 April 2008, 208 pages

Connectionism is a "hands-on" introduction to connectionist modeling through practical exercises in different types of connectionist architectures.
Contents
Chapter 1 | 1 |
Chapter 2 The Distributed Associative Memory | 5 |
Chapter 3 The James Program | 9 |
Chapter 4 Introducing Hebb Learning | 22 |
Chapter 5 Limitations of Hebb Learning | 30 |
Chapter 6 Introducing the Delta Rule | 37 |
Chapter 7 Distributed Networks and Human Memory | 41 |
Chapter 8 Limitations of Delta Rule Learning | 46 |
Chapter 9 The Perceptron | 48 |
Chapter 10 The Rosenblatt Program | 58 |
Chapter 11 Perceptrons and Logic Gates | 72 |
Chapter 12 Performing More Logic With Perceptrons | 81 |
Chapter 13 Value Units and Linear Nonseparability | 86 |
Chapter 14 Network By Problem Type Interactions | 91 |
Chapter 15 Perceptrons and Generalization | 94 |
Chapter 16 Animal Learning Theory and Perceptrons | 99 |
Chapter 17 The Multilayer Perceptron | 108 |
Chapter 18 The Rumelhart Program | 114 |
Chapter 19 Beyond the Perceptron's Limits | 129 |
Chapter 20 Symmetry as a Second Case Study | 133 |
Chapter 21 How Many Hidden Units? | 137 |
Chapter 22 Scaling Up With the Parity Problem | 145 |
Chapter 23 Selectionism and Parity | 151 |
Chapter 24 Interpreting a Small Network | 157 |
Chapter 25 Interpreting Networks of Value Units | 163 |
Chapter 26 Interpreting Distributed Representations | 174 |
Chapter 27 Creating Your Own Training Sets | 183 |
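The chapters on perceptrons and logic gates turn on a classic point: a single perceptron with a step activation can compute linearly separable functions such as AND, but no choice of weights solves exclusive-or. The following sketch is illustrative only (it is not code from the book, and the weight values are hand-picked assumptions):

```python
import numpy as np

# A perceptron: fire (output 1) when the weighted sum plus bias is positive.
def perceptron(x, w, b):
    return 1 if np.dot(w, x) + b > 0 else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Hand-picked weights for AND: the unit fires only when both inputs are on.
and_out = [perceptron(np.array(x), np.array([1.0, 1.0]), -1.5) for x in inputs]
print(and_out)  # [0, 0, 0, 1]
```

XOR requires output 1 for (0, 1) and (1, 0) but 0 for (0, 0) and (1, 1); no single line through pattern space separates those two classes, which is why later chapters introduce hidden units and the multilayer perceptron.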
Common terms and phrases
activation function, architecture, artificial neural networks, band, basis vectors, chapter, choose, cluster analysis, cognitive science, connection weights, connectionism, converge, created, Dawson, Default starts, define a hit, delta rule, distributed associative memory, epochs between printouts, examine, exclusive-or, exercise, explore, figure, Gaussian function, gradient descent rule, Hebb rule, hidden units, input patterns, input units, integration devices, interpretation, James program, jittered density plot, learning rate, learning rule, left-click, linearly independent, load the file, logic gates, logical operations, logistic function, maximum number, mouse, multilayer perceptron, neuron, number of epochs, number of hidden, number of sweeps, number of training, option, output unit, parity problem, pattern space, performance, procedure, represent, response, Rosenblatt program, Rumelhart program, Schopflocher, selected, simulation, solve, spreadsheet, squared error, starts for thresholds, starts for weights, stimuli, Test Recall, tool, trained network, training epochs, training set, training the network, two-valued algebra, unit's, value units
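Two of the recurring terms, the Hebb rule and the delta rule, are the learning rules used to train the distributed associative memory in the early chapters. A minimal sketch of both, assuming orthonormal cue vectors and illustrative dimensions (this is not the book's own code):

```python
import numpy as np

# Two orthonormal cue vectors and the responses to be associated with them.
cues = np.array([[1.0, 0.0], [0.0, 1.0]])
targets = np.array([[0.5, -0.5], [0.5, 0.5]])

# Hebb rule: each weight change is the outer product of response and cue.
W_hebb = np.zeros((2, 2))
for c, t in zip(cues, targets):
    W_hebb += np.outer(t, c)

# Delta rule: adjust weights in proportion to the output error, over many epochs.
W_delta = np.zeros((2, 2))
lr = 0.5  # learning rate (illustrative value)
for epoch in range(100):
    for c, t in zip(cues, targets):
        error = t - W_delta @ c
        W_delta += lr * np.outer(error, c)

# With orthonormal cues both rules recall the stored responses exactly.
print(np.allclose(W_hebb @ cues[0], targets[0]))   # True
print(np.allclose(W_delta @ cues[1], targets[1]))  # True
```

The difference matters once cues are merely linearly independent rather than orthonormal: Hebb learning then produces cross-talk between stored patterns, while the error-correcting delta rule still converges, which is the contrast the "Limitations of Hebb Learning" and "Introducing the Delta Rule" chapters explore.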