# Problems

In no particular order, here is a list of the methods you will find in the notebooks. The emphasis is on understanding their limitations, benefits and constructions.

- Least Squares Regression
- Random Forests
- Boosting and Bagging
- Ensemble Methods
- Multilayer Perceptrons
- Naive Bayes
- K-means Clustering
- K-nearest Neighbours Classification
- Logistic Regression
- Decision Trees
- SVMs
- Kernel Methods
- GANs
- Stable Diffusion
- Recurrent Neural Networks
- Convolutional Neural Networks
- Transformers
- word2vec, GloVe and NLP
- LLMs

To gain proficiency in all of the above methods, I have solved classical problems that lend themselves well to each particular method:
| Dataset | Accuracy | Model |
| --- | --- | --- |
| MNIST | 92% | Logistic Regression |
| FMNIST | B% | Random Forest |
| KMNIST | C% | 2-layer CNN |
| CIFAR | D% | CNN |
| IRIS | E% | SVM |
| ImageNet | F% | ResNet50 |
| Sentiment140 | G% | LSTM |
| Boston Housing | H% | Linear Regression |
| Wine Quality | I% | Gradient Boosting |
| Pima Indians Diabetes | J% | Decision Tree |
| IMDB Reviews | K% | BERT |
| KDD Cup 1999 | L% | K-Means Clustering |
| Digits | M% | Gaussian Mixture Model |
| CartPole | N% | Deep Q-Network |
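
As a flavour of what the notebooks contain, here is a minimal sketch of the first row (logistic regression on digit images). It uses scikit-learn's small built-in 8x8 `digits` dataset as a stand-in for full MNIST, so it runs in seconds; the notebooks themselves are assumed to use the real datasets.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# 8x8 digit images as a lightweight stand-in for MNIST.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Plain multinomial logistic regression on raw pixel intensities.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

Even this untuned baseline comfortably exceeds 90% on the held-out split, which is the point of pairing each dataset with a method it suits.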