Machine Learning (Stanford University) - Coursera
- Linear Regression: Cost function, gradient descent for one and multiple variables, feature normalization.
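A minimal NumPy sketch of these steps (the course exercises use Octave; function and variable names here are illustrative, not the assignment's own):

```python
import numpy as np

def feature_normalize(X):
    # scale each feature to zero mean and unit standard deviation
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

def gradient_descent(X, y, alpha=0.1, iters=1000):
    # X: (m, n) design matrix whose first column is all ones; y: (m,)
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        # gradient of the squared-error cost J = (1 / 2m) * sum((X @ theta - y)^2)
        grad = X.T @ (X @ theta - y) / m
        theta -= alpha * grad
    return theta
```

The same vectorized update handles one or many features, since `X` simply gains columns.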
- Normal equations
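The closed-form alternative can be sketched the same way; a linear solve is used rather than an explicit matrix inverse for numerical stability (names are illustrative):

```python
import numpy as np

def normal_equation(X, y):
    # solves X^T X theta = X^T y exactly; no learning rate or iteration needed
    return np.linalg.solve(X.T @ X, X.T @ y)
```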
- Logistic Regression: Sigmoid function, cost function and gradient, learning parameters using fminunc, regularization.
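A sketch of the sigmoid plus the regularized cost and gradient (in NumPy rather than the course's Octave; in the exercise this pair is what fminunc minimizes):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y, lam=0.0):
    # regularized logistic regression; the bias term theta[0] is not regularized
    m = len(y)
    h = sigmoid(X @ theta)
    reg = (lam / (2.0 * m)) * np.sum(theta[1:] ** 2)
    J = -(y @ np.log(h) + (1.0 - y) @ np.log(1.0 - h)) / m + reg
    grad = X.T @ (h - y) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad
```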
- Multi-class Classification: Regularized logistic regression with cost function and gradient, one-vs-all classification, one-vs-all prediction.
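After training one regularized logistic classifier per class, the one-vs-all prediction step reduces to an argmax; a sketch (illustrative names):

```python
import numpy as np

def predict_one_vs_all(all_theta, X):
    # all_theta: (K, n), one row of logistic-regression weights per class;
    # the sigmoid is monotonic, so argmax over raw scores picks the same
    # class as argmax over sigmoid probabilities
    return np.argmax(X @ all_theta.T, axis=1)
```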
- Neural Networks: Feedforward propagation and prediction.
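Feedforward prediction for a two-layer network can be sketched as below (NumPy stand-in for the Octave exercise; `Theta1`/`Theta2` follow the course convention of one bias column per layer):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(Theta1, Theta2, X):
    # Theta1: (hidden, n + 1), Theta2: (K, hidden + 1), X: (m, n)
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])   # add bias unit to the input layer
    a2 = sigmoid(a1 @ Theta1.T)            # hidden-layer activations
    a2 = np.hstack([np.ones((m, 1)), a2])  # add bias unit to the hidden layer
    a3 = sigmoid(a2 @ Theta2.T)            # output-layer activations
    return np.argmax(a3, axis=1)           # most probable class per example
```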
- Neural Networks Learning (Hand-written digit recognition): Feedforward and regularized cost function, backpropagation including sigmoid gradient and random initialization, gradient checking, learning parameters using fmincg, visualizing the hidden layer.
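Three of those pieces (the sigmoid gradient, random initialization, and gradient checking) are small enough to sketch directly; this is a NumPy approximation of the Octave exercise, with illustrative names:

```python
import numpy as np

def sigmoid_gradient(z):
    # g'(z) = g(z) * (1 - g(z)), used in the backpropagation delta terms
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

def rand_init_weights(L_in, L_out, eps=0.12):
    # break symmetry: uniform weights in [-eps, eps], plus a bias column
    return np.random.uniform(-eps, eps, size=(L_out, L_in + 1))

def numerical_gradient(J, theta, eps=1e-4):
    # central differences; used to sanity-check backpropagation gradients
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (J(theta + e) - J(theta - e)) / (2.0 * eps)
    return grad
```

A backpropagation implementation is correct when its analytic gradient agrees with `numerical_gradient` to several decimal places on a small network.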
- Regularized Linear Regression: Regularized linear regression cost function and gradient.
- Bias vs. Variance: Learning curves.
- Polynomial regression: Learning polynomial regression, selecting λ using a cross validation set, computing test set error and plotting learning curves with randomly selected examples.
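The λ-selection loop can be sketched generically: train on the training set for each candidate λ, then keep the value with the lowest unregularized error on the cross-validation set. `train_fn` and `error_fn` below are hypothetical stand-ins for the exercise's training and cost-evaluation routines:

```python
def select_lambda(train_fn, error_fn, lambdas, X_train, y_train, X_cv, y_cv):
    # error_fn should evaluate *without* regularization, so candidates
    # are compared on pure prediction error
    best_lam, best_err = None, float("inf")
    for lam in lambdas:
        theta = train_fn(X_train, y_train, lam)
        err = error_fn(theta, X_cv, y_cv)
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam
```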
- Support Vector Machines: Linear classification, non-linear classification using a Gaussian kernel.
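The Gaussian (RBF) kernel itself is a one-liner; a NumPy sketch for two feature vectors:

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma=1.0):
    # similarity decays with squared Euclidean distance;
    # sigma controls the bandwidth (how quickly similarity falls off)
    diff = x1 - x2
    return np.exp(-(diff @ diff) / (2.0 * sigma ** 2))
```

Identical points have similarity 1; distant points approach 0.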
- Spam Classification: Preprocessing emails, extracting features from emails using a vocabulary list, training an SVM for spam classification, and predicting whether emails are spam or non-spam.
- K-means Clustering: Finding closest centroids, computing centroid means, random initialization.
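The three K-means steps listed above can be sketched in NumPy (illustrative names; the course version is in Octave):

```python
import numpy as np

def find_closest_centroids(X, centroids):
    # index of the nearest centroid for every example (squared Euclidean distance)
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return np.argmin(d, axis=1)

def compute_centroids(X, idx, K):
    # each centroid moves to the mean of the points currently assigned to it
    return np.array([X[idx == k].mean(axis=0) for k in range(K)])

def kmeans_init_centroids(X, K):
    # random initialization: pick K distinct training examples as seeds
    return X[np.random.permutation(X.shape[0])[:K]]
```

K-means alternates the first two steps until the assignments stop changing.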
- Image Compression with K-means: K-means on pixels.
- Principal Component Analysis: Implementing PCA.
- Dimensionality Reduction with PCA: Projecting the data onto the principal components, reconstructing an approximation of the data, and running PCA on a face image dataset to reduce its dimensionality.
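The PCA, projection, and reconstruction steps can be sketched via the SVD of the covariance matrix (a NumPy approximation of the Octave exercise; names are illustrative):

```python
import numpy as np

def pca(X):
    # X is assumed already mean-normalized;
    # columns of U are the principal directions, S the corresponding variances
    m = X.shape[0]
    U, S, _ = np.linalg.svd(X.T @ X / m)
    return U, S

def project_data(X, U, K):
    return X @ U[:, :K]            # coordinates along the top-K directions

def recover_data(Z, U, K):
    return Z @ U[:, :K].T          # approximate reconstruction in the original space
```

When the discarded directions carry no variance, the reconstruction is exact.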
- Anomaly Detection: Estimating parameters for a Gaussian distribution, selecting the threshold ε using a cross-validation set.
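The parameter-estimation and density steps can be sketched as below (NumPy stand-in; in the exercise, ε is then chosen to maximize F1 on the labeled cross-validation set):

```python
import numpy as np

def estimate_gaussian(X):
    # maximum-likelihood mean and variance per feature (variance divides by m)
    return X.mean(axis=0), X.var(axis=0)

def gaussian_density(X, mu, var):
    # treat features as independent: product of univariate Gaussian densities;
    # examples with density below epsilon are flagged as anomalies
    coef = 1.0 / np.sqrt(2.0 * np.pi * var)
    return np.prod(coef * np.exp(-((X - mu) ** 2) / (2.0 * var)), axis=1)
```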
- Recommender Systems (Movie Rating): Collaborative filtering learning algorithm: cost function and gradient with regularization, learning movie recommendations using fmincg.
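The regularized collaborative-filtering cost and gradients can be sketched as one vectorized function (NumPy version of what fmincg minimizes in the Octave exercise; names are illustrative):

```python
import numpy as np

def cofi_cost(X, Theta, Y, R, lam):
    # X: (movies, features), Theta: (users, features), Y: (movies, users) ratings;
    # R[i, j] = 1 only where user j actually rated movie i, so unrated
    # entries contribute nothing to the cost or gradients
    err = (X @ Theta.T - Y) * R
    J = 0.5 * np.sum(err ** 2) + (lam / 2.0) * (np.sum(X ** 2) + np.sum(Theta ** 2))
    X_grad = err @ Theta + lam * X
    Theta_grad = err.T @ X + lam * Theta
    return J, X_grad, Theta_grad
```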