# Self Taught Machine Learning

[From Knowing Nothing To Being An AI Expert: Roadmap](https://i.am.ai/roadmap)

## CURRICULUM 1: [Cornell's CS4780 Machine Learning Course](https://www.youtube.com/playlist?list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS)
1. [Machine Learning Setup](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote01_MLsetup.html)
2. [k-Nearest Neighbors / Curse of Dimensionality](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote02_kNN.html)
#### 2.1 KNN Resources
* [Great explanation of nearest neighbor](https://alliance.seas.upenn.edu/~cis520/dynamic/2016/wiki/index.php?n=Lectures.LocalLearning)
* [Video Of KNN](https://www.youtube.com/watch?v=gdS0V35GqgQ)
* [Video explanation of K-nearest neighbor classification](http://videolectures.net/aaai07_bosch_knnc/)

Great explanations of the algorithm:
* [How kNN algorithm works](https://www.youtube.com/watch?v=UqYde-LULfs)
* [k-Nearest Neighbor classification algorithm](https://www.youtube.com/watch?v=4ObVzTuFivY)
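None of the links above come with code here, so below is a rough sketch of the underlying idea (my own minimal version; the function name, the Euclidean-distance choice, and the toy data are assumptions, not anything taken from those lectures). A brute-force kNN classifier in NumPy might look like this:

```python
# A brute-force k-nearest-neighbors classifier, as a rough sketch of the idea.
# The function name, the distance choice, and the toy data below are my own
# assumptions for illustration, not code from the lectures linked above.
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, X_test, k=3):
    """Label each row of X_test by a majority vote of its k nearest training points."""
    predictions = []
    for x in X_test:
        # Squared Euclidean distance from x to every training point.
        dists = np.sum((X_train - x) ** 2, axis=1)
        # Indices of the k closest training points.
        nearest = np.argsort(dists)[:k]
        # Majority vote among their labels.
        label = Counter(y_train[nearest]).most_common(1)[0][0]
        predictions.append(label)
    return np.array(predictions)

# Toy usage: two well-separated clusters.
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [5.0, 5.0], [5.0, 6.0], [6.0, 5.0]])
y_train = np.array([0, 0, 0, 1, 1, 1])
X_test = np.array([[0.2, 0.1], [5.5, 5.5]])
print(knn_predict(X_train, y_train, X_test, k=3))  # expected: [0 1]
```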

#### 2.2 PCA Resources
I Learned About [PCA](https://en.wikipedia.org/wiki/Principal_component_analysis) (Principal Component Analysis) To Get My KNN To Work. This Is What I Used:
* [Start With This](https://www.youtube.com/watch?v=FgakZw6K1QQ)
* [Incredible Description Of Everything You Will Need](https://drscotthawley.github.io/blog/2019/12/21/PCA-From-Scratch.html)

Other Resources To Check Out For PCA:
* [Machine Learning Mastery's PCA From Scratch](https://machinelearningmastery.com/calculate-principal-component-analysis-scratch-python/)
* [Another YouTube Explanation of PCA](https://www.youtube.com/watch?v=g-Hb26agBFg)
* [Implementing PCA In Practice](https://sebastianraschka.com/Articles/2014_pca_step_by_step.html)
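In the same spirit as the from-scratch write-ups above, a minimal PCA sketch of my own might look like the following (the function name and the eigen-decomposition route are assumptions, not code from those articles):

```python
# A from-scratch PCA sketch in the spirit of the "from scratch" articles above.
# The function name and the eigen-decomposition route are my own choices for
# illustration, not code taken from any of those resources.
import numpy as np

def pca_fit_transform(X, n_components=2):
    """Project X (n_samples x n_features) onto its top n_components principal axes."""
    # 1. Center the data: PCA assumes mean-zero features.
    X_centered = X - X.mean(axis=0)
    # 2. Covariance matrix of the features.
    cov = np.cov(X_centered, rowvar=False)
    # 3. Eigen-decomposition; eigh is used because the covariance matrix is symmetric.
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    # 4. Sort directions by decreasing eigenvalue (explained variance).
    order = np.argsort(eigenvalues)[::-1]
    components = eigenvectors[:, order[:n_components]]
    # 5. Project the centered data onto the chosen directions.
    return X_centered @ components

# Usage: shrink 4-dimensional points down to 2 dimensions, e.g. before running kNN.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
print(pca_fit_transform(X, n_components=2).shape)  # (100, 2)
```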

3. [Perceptron](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote03.html)
4. [Estimating Probabilities from data](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote04.html)
5. [Bayes Classifier and Naive Bayes](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote05.html)
6. [Logistic Regression / Maximum Likelihood Estimation / Maximum a Posteriori](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote06.html)
7. [Gradient Descent](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote07.html)
8. [Linear Regression](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote08.html)
9. [Support Vector Machine](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote09.html)
10. [Empirical Risk Minimization](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote10.html)
11. [Model Selection](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote11.html)
12. [Bias-Variance Tradeoff](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote12.html)
13. [Kernels](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote13.html)
14. [Kernels continued](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote14.html)
15. [Gaussian Processes](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote15.html)
16. [k-Dimensional Trees](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote16.html)
17. [Decision Trees](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote17.html)
18. [Bagging](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote18.html)
19. [Boosting](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote19.html)
20. [Neural Networks](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote20.html)
21. [Deep Learning / Stochastic Gradient Descent](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote20.pdf)

* Some Things That Might Help:
  * [Overview and Associated Notes For Each Chapter](http://www.cs.cornell.edu/courses/cs4780/2018fa/page18/)
  * [Each Section In Order](http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/)
  * [YouTube Version Of The Lectures](https://www.youtube.com/playlist?list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS)
  * [Machine Learning: A Probabilistic Perspective](http://noiselab.ucsd.edu/ECE228/Murphy_Machine_Learning.pdf)
  * [The Elements of Statistical Learning: Data Mining, Inference, and Prediction](https://web.stanford.edu/~hastie/Papers/ESLII.pdf)

## CURRICULUM 2: [Andrew Ng's Machine Learning Coursera Course](https://www.coursera.org/learn/machine-learning)
Week 1: Introduction, Linear Regression With One Variable, Linear Algebra Review

Week 2: Linear Regression With Multiple Variables, Octave/Matlab Tutorial

Week 3: Logistic Regression, Regularization

Week 4: Neural Networks: Representation

Week 5: Neural Networks: Learning

Week 6: Advice For Applying Machine Learning, Machine Learning System Design

Week 7: Support Vector Machine

Week 8: Unsupervised Learning, Dimensionality Reduction

Week 9: Anomaly Detection, Recommender Systems

Week 10: Large Scale Machine Learning

Week 11: Application Example: Photo OCR

* This may help you:
  * [Answers To Everything In Python](https://github.com/dibgerge/ml-coursera-python-assignments)

## CURRICULUM 3: [Yet Another Machine Learning Course (WPI CS 453X)](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/)
1. Lecture (introduction to ML, accuracy & loss functions): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture1.pdf)
2. Lecture (greedy step-wise classification, training versus testing): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture2.pdf)
3. Lecture (linear regression): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture3.pdf)
4. Lecture (more on linear regression): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture4.pdf)
5. Lecture (gradient descent): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture5.pdf)
6. Lecture (polynomial regression, overfitting): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture6.pdf)
7. Lecture (regularization, logistic regression): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture7.pdf)
8. Lecture (softmax regression, cross-entropy): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture8.pdf)
9. Lecture (stochastic gradient descent, convexity): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture9.pdf)
10. Lecture (positive semi-definiteness, constrained optimization): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture10.pdf)
11. Lecture (support vector machines): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture11.pdf)
12. Lecture (soft versus hard margin SVM, linear separability): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture12.pdf)
13. Lecture (kernelization): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture13.pdf)
14. Lecture (more on kernelization): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture14.pdf)
15. Lecture (Gaussian RBF kernel, nearest neighbors): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture15.pdf)
16. Lecture (principal component analysis): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture16.pdf)
17. Lecture (k-means): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture17.pdf)
18. Lecture (introduction to neural networks): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture18.pdf)
19. Lecture (more on neural networks, XOR problem): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture19.pdf)
20. Lecture (gradient descent for neural networks, Jacobian matrices): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture20.pdf)
21. Lecture (chain rule and backpropagation): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture21.pdf)
22. Lecture (L1 and L2 regularization, dropout): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture22.pdf)
23. Lecture (unsupervised pre-training, auto-encoders): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture23.pdf)
24. Lecture (convolution, pooling): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture24.pdf)
25. Lecture (convolutional neural networks, recurrent neural networks): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture25.pdf)
26. Lecture (practical suggestions): [PDF](https://users.wpi.edu/~jrwhitehill/CS453X_2018_Lectures/CS453X_2018_Lecture26.pdf)

## Supplemental Things I Plan To Use:
* [StatQuest YouTube Channel](https://www.youtube.com/c/joshstarmer/videos)
* [Intuitive Machine Learning YouTube Channel](https://www.youtube.com/c/IntuitiveMachineLearning/videos)
* [An Introduction To Statistical Learning With Applications In R](https://faculty.marshall.usc.edu/gareth-james/ISL/ISLR%20Seventh%20Printing.pdf)
* [Pattern Recognition and Machine Learning](https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf)
* [Python Data Science Handbook](https://jakevdp.github.io/PythonDataScienceHandbook/)
* [Hands-On Machine Learning with Scikit-Learn and TensorFlow](http://index-of.es/Varios-2/Hands%20on%20Machine%20Learning%20with%20Scikit%20Learn%20and%20Tensorflow.pdf)
* [Hands On Machine Learning github](https://github.com/ageron/handson-ml)
* [Helpful Stuff](http://tullo.ch/)

## Homework
Homework questions come from the end of each applicable chapter in "An Introduction To Statistical Learning With Applications In R" or "Pattern Recognition and Machine Learning". Ideally these are done in Python rather than R, even though ISLR itself presents everything in R.
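To give a feel for what doing the exercises in Python instead of R might look like, here is a hypothetical mini-example (not an actual exercise from either book; the data and names are made up) of fitting a simple linear regression with scikit-learn:

```python
# A hypothetical mini-example of working an ISLR-style exercise in Python instead
# of R: fit a simple linear regression and report its slope, intercept, and R^2.
# The data and names are made up; this is not a specific exercise from either book.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(100, 1))              # one predictor
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 100)    # y = 3x + 2 + noise

model = LinearRegression().fit(X, y)
print("slope:", model.coef_[0])        # should be close to 3
print("intercept:", model.intercept_)  # should be close to 2
print("R^2:", model.score(X, y))
```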

For answers to "An Introduction To Statistical Learning With Applications In R", refer to:
* https://rpubs.com/ppaquay
* http://yahwes.github.io/ISLR/
* https://github.com/yahwes/ISLR
* https://altaf-ali.github.io/ISLR/index.html
* https://blog.princehonest.com/stat-learning/
* https://github.com/asadoughi/stat-learning
* https://www.kaggle.com/lmorgan95/notebooks
* For Answers Specifically In Python:
  * https://botlnec.github.io/islp/
  * [Introduction To Statistical Learning With Applications In Python](https://github.com/tdpetrou/Machine-Learning-Books-With-Python/tree/master/Introduction%20to%20Statistical%20Learning)

For answers to "Pattern Recognition and Machine Learning", refer to:
* [Chapter Overviews and Solutions](https://tommyodland.com/files/edu/bishop_solutions.pdf)
* [GitHub Repo Of The Solutions](https://github.com/zhengqigao/PRML-Solution-Manual)
* [Solution Manual PDF From That Repo](https://github.com/zhengqigao/PRML-Solution-Manual/blob/master/Solution%20Manual%20For%20PRML.pdf)
* [Official-Looking Solution Set From Microsoft Research](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/05/prml-web-sol-2009-09-08.pdf)

For answers to "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", refer to:
* [A Solution Manual and Notes for: The Elements of Statistical Learning by Jerome Friedman, Trevor Hastie, and Robert Tibshirani](https://waxworksmath.com/Authors/G_M/Hastie/WriteUp/Weatherwax_Epstein_Hastie_Solution_Manual.pdf)
* [A GUIDE AND SOLUTION MANUAL TO THE ELEMENTS OF STATISTICAL LEARNING by JAMES CHUANBING MA](https://getd.libs.uga.edu/pdfs/ma_james_c_201412_ms.pdf)
* [Solutions to Select Problems of The Elements of Statistical Learning by talwarabhimanyu](https://github.com/talwarabhimanyu/my-solutions-The-Elements-of-Statistical-Learning)
* [Elements of Statistical Learning (Solutions) by Andrew Tulloch](http://tullo.ch/static/ESL-Solutions.pdf)

For answers to "Machine Learning: A Probabilistic Perspective", refer to:
* [Solutions by ArthurZC23](https://github.com/ArthurZC23/Machine-Learning-A-Probabilistic-Perspective-Solutions)
* [Solutions by MLAPP-solution-CN](https://github.com/MLAPP-solution-CN/Solutions-to-Machine-Learning-A-Probabilistic-Perspective-/blob/master/sol_1_to_21.pdf)

## Math Resources To Help
* Probability And Statistics
  * [Khan Academy Probability and Statistics](https://www.khanacademy.org/math/statistics-probability)
  * [Statistics](https://www.youtube.com/playlist?list=PL5102DFDC6790F3D0)
  * [Probabilistic Systems Analysis and Applied Probability](https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041sc-probabilistic-systems-analysis-and-applied-probability-fall-2013/)
* Precalculus and Calculus 1-3
  * [Precalculus](https://www.youtube.com/playlist?list=PLDesaqWTN6ESsmwELdrzhcGiRhk5DjwLP)
  * [Calculus 1](https://www.youtube.com/playlist?list=PLF797E961509B4EB5)
  * [Calculus 2](https://www.youtube.com/playlist?list=PLDesaqWTN6EQ2J4vgsN1HyBeRADEh4Cw-)
  * [Calculus 3](https://www.youtube.com/playlist?list=PLDesaqWTN6ESk16YRmzuJ8f6-rnuy0Ry7)
  * [Calculus 1-3](https://www.youtube.com/user/amarchese22/playlists?disable_polymer=1)
* Linear Algebra
  * [Linear Algebra](https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/)
  * [Matrix Methods in Data Analysis, Signal Processing, and Machine Learning](https://ocw.mit.edu/courses/mathematics/18-065-matrix-methods-in-data-analysis-signal-processing-and-machine-learning-spring-2018/)

## Frequently Asked Questions
* Where is your section on Deep Learning?
  * [Here](https://github.com/BeeGassy/CS-541-Deep_Learning)
* There is a lot of math here; how do I get into machine learning without a lot of math?
  * Unfortunately, there really is no way to truly learn machine learning without the math. You may be looking for an applied way into machine learning, which requires less math; the intent of this repository, however, is to teach the theory from the ground up.
* Oh my lord! I need a service that will efficiently compile all possible hotels near the area I'm visiting and show me the cheapest option in a single place.
  * Not a question, but I believe you are looking for [trivago](https://www.trivago.com/)

## Music I Listened To During This Journey
* Lofi-Kinda
  * [OP-1 05-28-17 (I Need U)](https://www.youtube.com/watch?v=7z4hoazra_g)
  * [code-fi / lofi beats to code/relax to](https://www.youtube.com/watch?v=f02mOEt11OQ&t=221s)
  * [RAINING IN NAGOYA (Lofi HipHop) Extended Version](https://www.youtube.com/watch?v=0te6noMKffA&list=WL&index=120&t=20s)
  * [Music good to focus to while coding / doing assignments 🌃 | 3 hour playlist | beats to coding to | lofi, jazz, hiphop |](https://www.youtube.com/watch?v=0xJxgvJO2Xo&list=WL&index=121&t=62s)

* BROWN
  * [SKSSSSHHHHHHHHHHH](https://www.youtube.com/watch?v=RqzGzwTY-6w&t=5361s)