Oct 29, 2003
-------------
- More on neural networks
  - Why does the perceptron converge, while a multi-layer network may not?
    - Ans: the perceptron's weight space has only one minimum (w.r.t. error)
    - Whereas: a neural network's weight space has multiple local minima
  - printout: Error surfaces for neural networks
    - why does the perceptron converge? => single global minimum, no local
      minima (see the perceptron sketch after these notes)
    - what about multi-layer networks? => multiple local minima => nasty-looking
      surface (see the gradient-descent sketch after these notes)
- How to make neural networks work?
  - lots of engineering hacks
  - Choosing the number of hidden layers
    - watch out: some people count "layers of nodes", others count "layers
      of weights"
    - at most two layers of nodes is enough
    - sometimes you can get away with one
  - Choosing the number of nodes in each layer
    - input: as many as the problem specifies
    - output: choice of (see the encoding sketch after these notes)
      - local encoding
      - distributed encoding
      - more weights => easier to learn, but
      - more weights => more overfitting
    - hidden: can be
      - smaller than the input/output layers, or
      - greater
  - Other hacks (see the initialization sketch after these notes)
    - use 0.1 and 0.9 as targets instead of 0 and 1 (a sigmoid reaches exactly
      0 or 1 only as the weights go to infinity)
    - initialize weights to small random values, but ensure that they are
      *different* (units with equal weights receive equal updates and stay
      equal, so the symmetry is never broken)
- Real NN applications
  - autonomous driving of a vehicle: ALVINN
  - learning to recognize face pose
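
Perceptron sketch: a minimal Python illustration (the AND data, names, and
parameters are my own examples, not from the lecture). On linearly separable
data the perceptron rule provably converges, which is the practical face of
the single-global-minimum error surface noted above:

    def perceptron_train(examples, lr=0.1, max_epochs=100):
        """examples: list of (inputs, target) pairs, target in {-1, +1}."""
        n = len(examples[0][0])
        w = [0.0] * n                  # one weight per input
        b = 0.0                        # bias (threshold weight)
        for _ in range(max_epochs):
            mistakes = 0
            for x, t in examples:
                a = sum(wi * xi for wi, xi in zip(w, x)) + b
                pred = 1 if a >= 0 else -1
                if pred != t:          # misclassified: nudge toward target
                    w = [wi + lr * t * xi for wi, xi in zip(w, x)]
                    b += lr * t
                    mistakes += 1
            if mistakes == 0:          # one full clean pass: converged
                break
        return w, b

    # AND is linearly separable, so this terminates with zero mistakes.
    data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
    w, b = perceptron_train(data)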
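
Gradient-descent sketch: a toy illustration of the local-minima point (the
1-D "error surface" E below is invented for illustration). Plain gradient
descent ends up in different minima depending on the starting weight, which
is exactly why multi-layer training lacks the perceptron's guarantee:

    def E(w):             # non-convex surface: one local, one global minimum
        return w**4 - 3 * w**2 + w

    def dE(w):            # its derivative
        return 4 * w**3 - 6 * w + 1

    def descend(w, lr=0.01, steps=2000):
        for _ in range(steps):
            w -= lr * dE(w)
        return w

    print(descend(-2.0))  # reaches the global minimum, w ~ -1.30
    print(descend(+2.0))  # stuck at the local minimum, w ~ +1.13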
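
Encoding sketch: the two output encodings, with made-up class labels and a
binary code standing in for "distributed" (my interpretation, since the
lecture did not spell one out). Local encoding dedicates one output unit per
class; distributed encoding shares a code over fewer units, so there are
fewer weights to learn, but each unit takes part in several classes:

    classes = ["left", "straight", "right", "reverse"]

    # local (one-hot) encoding: 4 classes -> 4 output units
    local = {c: [1.0 if i == j else 0.0 for j in range(len(classes))]
             for i, c in enumerate(classes)}

    # distributed (binary-code) encoding: 4 classes -> 2 output units
    distributed = {c: [float((i >> 1) & 1), float(i & 1)]
                   for i, c in enumerate(classes)}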
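
Initialization sketch: the two tricks under "Other hacks" (the layer sizes
and the 0.05 scale are arbitrary assumptions on my part):

    import random

    def squash_targets(bits):
        # 0.1/0.9 instead of 0/1: a sigmoid outputs exactly 0 or 1 only in
        # the limit of infinite weights, so exact 0/1 targets would drive
        # the weights to grow without bound.
        return [0.9 if b else 0.1 for b in bits]

    def init_weights(n_in, n_out, scale=0.05):
        # small random values, all (almost surely) different: hidden units
        # that start with identical weights compute the same function,
        # receive identical gradient updates, and never become different.
        return [[random.uniform(-scale, scale) for _ in range(n_in)]
                for _ in range(n_out)]

    targets = squash_targets([0, 1, 1, 0])    # -> [0.1, 0.9, 0.9, 0.1]
    w_hidden = init_weights(n_in=8, n_out=3)  # 3 hidden units, 8 inputs each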