Authors: Kenza Bouzid, Agnieszka Miszkurka, Tobias Höppe
Solutions to the labs of the ANN course at KTH. Each lab contains implementations of neural network algorithms as well as notebooks with experiments. The labs cover:
- Perceptron learning and the delta rule (sketch below)
- Multi-Layer Perceptron trained with the generalised delta rule (backprop) (sketch below)
- Multi-layer perceptron network for chaotic time-series prediction
- Function approximation with an RBF network (sketch below)
- RBF network trained with competitive learning
- Self-organising maps used for data clustering and visualisation (sketch below)
- Hopfield network with Hebbian learning (sketch below)
- Experiments on various properties of Hopfield networks, such as:
- capacity
- convergence
- distortion resistance
- energy
- sequential vs batch update
- sparsity of patterns
- RBM trained with contrastive divergence (sketch below)
- DBN trained with greedy layer-wise pretraining of RBMs
- Using the RBM and DBN for classification and for generating new samples on the MNIST data set
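
Below are a few minimal, illustrative sketches of the listed algorithms. They are not the lab implementations; all function names, shapes, hyperparameters and toy data are assumptions made for the examples.

A sketch of batch delta-rule learning for a single linear unit, with the bias absorbed into the weight matrix, on a hypothetical two-class toy problem:

```python
import numpy as np

def train_delta_rule(X, T, eta=0.1, epochs=200, rng=np.random.default_rng(0)):
    """Batch delta rule for a linear unit: W <- W - eta * (W X - T) X^T / n."""
    n_in, n_samples = X.shape
    Xb = np.vstack([X, np.ones((1, n_samples))])   # absorb the bias into the weights
    W = 0.1 * rng.standard_normal((T.shape[0], n_in + 1))
    for _ in range(epochs):
        E = W @ Xb - T                             # prediction error on the whole batch
        W -= eta * (E @ Xb.T) / n_samples          # gradient step on the mean squared error
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # two linearly separable 2D classes with targets -1 / +1 (toy data)
    X = np.hstack([rng.normal(-2.0, 0.5, (2, 50)), rng.normal(2.0, 0.5, (2, 50))])
    T = np.hstack([-np.ones(50), np.ones(50)]).reshape(1, -1)
    W = train_delta_rule(X, T)
    preds = np.sign(W @ np.vstack([X, np.ones((1, X.shape[1]))]))
    print("training accuracy:", np.mean(preds == T))
```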
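
A sketch of a two-layer MLP trained with the generalised delta rule (batch backprop), here on the XOR problem with a tanh-like transfer function; the layer sizes and learning rate are illustrative:

```python
import numpy as np

def phi(x):
    return 2.0 / (1.0 + np.exp(-x)) - 1.0          # tanh-like transfer function

def phi_prime(y):
    return 0.5 * (1.0 + y) * (1.0 - y)             # derivative expressed via y = phi(x)

def train_mlp(X, T, n_hidden=8, eta=0.1, epochs=5000, rng=np.random.default_rng(0)):
    """Two-layer network trained with the generalised delta rule (batch backprop)."""
    n_in, n_samples = X.shape
    Xb = np.vstack([X, np.ones((1, n_samples))])
    V = 0.1 * rng.standard_normal((n_hidden, n_in + 1))        # input -> hidden weights
    W = 0.1 * rng.standard_normal((T.shape[0], n_hidden + 1))  # hidden -> output weights
    for _ in range(epochs):
        # forward pass
        H = phi(V @ Xb)
        Hb = np.vstack([H, np.ones((1, n_samples))])
        O = phi(W @ Hb)
        # backward pass: propagate the output deltas to the hidden layer
        delta_o = (O - T) * phi_prime(O)
        delta_h = (W[:, :-1].T @ delta_o) * phi_prime(H)
        W -= eta * delta_o @ Hb.T
        V -= eta * delta_h @ Xb.T
    return V, W

if __name__ == "__main__":
    # XOR with targets in {-1, +1}; outputs should approach the targets
    X = np.array([[-1, -1, 1, 1], [-1, 1, -1, 1]], dtype=float)
    T = np.array([[-1, 1, 1, -1]], dtype=float)
    V, W = train_mlp(X, T)
    Xb = np.vstack([X, np.ones((1, X.shape[1]))])
    O = phi(W @ np.vstack([phi(V @ Xb), np.ones((1, X.shape[1]))]))
    print("outputs after training:", np.round(O, 2))
```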
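
A sketch of RBF-network function approximation where the linear output weights are fitted by least squares; the Gaussian centres, width and the sin(2x) target are assumptions for the example:

```python
import numpy as np

def rbf_design_matrix(x, centres, sigma):
    """Gaussian basis activations: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 sigma^2))."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * sigma ** 2))

def fit_rbf(x, y, centres, sigma):
    """Fit the linear output weights of the RBF network in the least-squares sense."""
    Phi = rbf_design_matrix(x, centres, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

if __name__ == "__main__":
    x_train = np.arange(0.0, 2.0 * np.pi, 0.1)
    y_train = np.sin(2.0 * x_train)                 # target function for this example
    centres = np.linspace(0.0, 2.0 * np.pi, 20)     # fixed, evenly spaced Gaussian centres
    sigma = 0.5
    w = fit_rbf(x_train, y_train, centres, sigma)
    y_hat = rbf_design_matrix(x_train, centres, sigma) @ w
    print("mean absolute residual:", np.mean(np.abs(y_hat - y_train)))
```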
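
A sketch of a 1D self-organising map with a neighbourhood that shrinks over epochs; the node count, radius schedule and toy data are illustrative:

```python
import numpy as np

def train_som(data, n_nodes=100, epochs=20, eta=0.2, start_radius=50,
              rng=np.random.default_rng(0)):
    """1D SOM: drag the winner and its (shrinking) neighbourhood towards each sample."""
    n_samples, n_features = data.shape
    weights = rng.random((n_nodes, n_features))
    for epoch in range(epochs):
        radius = int(start_radius * (1.0 - epoch / epochs)) + 1
        for x in data:
            winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            lo, hi = max(0, winner - radius), min(n_nodes, winner + radius + 1)
            weights[lo:hi] += eta * (x - weights[lo:hi])
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.random((32, 4))       # toy attribute vectors to order along the map
    weights = train_som(data)
    order = [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in data]
    print("winner node per sample (topological ordering):", order)
```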
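
A sketch of a Hopfield network with Hebbian (outer-product) learning and synchronous (batch) recall of a distorted pattern; the pattern sizes and the amount of noise are arbitrary:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian (outer-product) weights: W = (1/N) * sum_mu x_mu x_mu^T, zero diagonal."""
    n_units = patterns.shape[1]
    W = patterns.T @ patterns / n_units
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, max_steps=10):
    """Synchronous (batch) update until the state stops changing."""
    for _ in range(max_steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1            # break ties towards +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(3, 64))   # 3 random bipolar patterns, 64 units
    W = train_hopfield(patterns)
    noisy = patterns[0].copy()
    flipped = rng.choice(64, size=8, replace=False)
    noisy[flipped] *= -1                            # distort 8 of the 64 units
    restored = recall(W, noisy)
    print("bits recovered:", int(np.sum(restored == patterns[0])), "/ 64")
```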
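
A sketch of a single contrastive-divergence (CD-1) update for a binary RBM with an MNIST-sized visible layer; the helper names, batch and learning rate are assumptions. A real run would loop this over mini-batches, and a DBN would be built by greedily stacking RBMs, training each new layer on the hidden activities of the one below:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_step(v0, W, b_v, b_h, rng, eta=0.05):
    """One contrastive-divergence (k=1) update on a mini-batch v0 of shape (batch, visible)."""
    # positive phase: hidden probabilities and a binary sample given the data
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # negative phase: one Gibbs step down to the visible layer and back up
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # gradient approximation: <v h>_data - <v h>_reconstruction
    batch_size = v0.shape[0]
    W += eta * (v0.T @ ph0 - pv1.T @ ph1) / batch_size
    b_v += eta * np.mean(v0 - pv1, axis=0)
    b_h += eta * np.mean(ph0 - ph1, axis=0)
    return W, b_v, b_h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_visible, n_hidden = 784, 200                  # e.g. an MNIST-sized visible layer
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)
    batch = (rng.random((20, n_visible)) < 0.1).astype(float)   # fake binary mini-batch
    W, b_v, b_h = cd1_step(batch, W, b_v, b_h, rng)
    print("weight matrix norm after one CD-1 step:", float(np.linalg.norm(W)))
```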