
# machine-learning-lecture-notes

These lecture notes are the course materials used at the Data Scientist School. They cover machine learning, data science, and deep learning.

### [Lecture01 (02/20) : A Simple Tutorial for Getting Started with Machine Learning](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture01_Machine_Learning_Simple_Tutorial.ipynb)
A simple tutorial for getting started with machine learning. It predicts (block-level) housing prices in California and briefly walks through a machine learning project from start to finish. (Hands-On Machine Learning)

### [Lecture02 (02/23) : Probability Theory](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture02_Probabilities.pdf?flush_cache=true)
Mathematical statistics lecture notes on basic probability theory.

### [Lecture03 (02/23) : Probability Distributions 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture03_Probability_Distribution_01.pdf?flush_cache=true)
Mathematical statistics lecture notes on probability distributions, part 1.

### [Lecture03 (02/23) : Probability Distributions 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture03_Probability_Distribution_02.pdf?flush_cache=true)
Mathematical statistics lecture notes on probability distributions, part 2; a small sampling sketch follows the link below.
- [Probability Distribution python code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture03_Probability_Distribution.ipynb?flush_cache=true)
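Not the course notebook itself, but a minimal sketch of the same idea: drawing samples from common distributions with `scipy.stats` and comparing sample statistics with the theoretical values (the distributions and parameters below are arbitrary choices for illustration).

```python
import numpy as np
from scipy import stats

# Normal(0, 1): sample statistics should be close to the theoretical moments.
samples = stats.norm.rvs(loc=0, scale=1, size=10_000, random_state=0)
print(samples.mean(), samples.var())        # ~0, ~1
print(stats.norm.mean(), stats.norm.var())  # 0.0, 1.0

# Binomial(n=10, p=0.3): exact CDF value vs. a Monte Carlo estimate of P(X <= 3).
print(stats.binom.cdf(3, n=10, p=0.3))
draws = stats.binom.rvs(n=10, p=0.3, size=10_000, random_state=1)
print(np.mean(draws <= 3))
```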

### [Lecture04 (02/27) : Linear Algebra Basics 01 (Vectors)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture04_Linear_Algebra_Basic_Vector.pdf?flush_cache=true)
Notes on vectors in linear algebra.

### [Lecture04 (02/27) : Linear Algebra Basics 02 (Matrices)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture04_Linear_Algebra_Basic_Matrix.pdf?flush_cache=true)
Notes on matrices in linear algebra; a short NumPy sketch follows the link below.
- [Vector And Matrix Python Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture04_Sub_Vectors-and-Matrices.ipynb?flush_cache=true)
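A rough NumPy sketch of the basic vector and matrix operations the notes cover (the arrays are made-up examples, not the notebook's data):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
print(v @ w)                  # dot product
print(np.linalg.norm(v))      # Euclidean length

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
b = np.array([4.0, 5.0])
print(A @ b)                  # matrix-vector product
print(np.linalg.solve(A, b))  # solve Ax = b without forming the inverse
```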

### [Lecture05 (03/02) : Numpy](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture05_Numpy.ipynb?flush_cache=true)
NumPy basics needed for data analysis; a minimal KNN sketch follows the links below.
- [KNN Numpy Python Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture05_Sub_KNN_Using_Numpy.ipynb?flush_cache=true)
- [Numpy Problem Set](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture05_Sub_Numpy_Problem01.ipynb?flush_cache=true)
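A toy k-nearest-neighbours classifier in plain NumPy, in the spirit of the KNN notebook above; it assumes Euclidean distance and majority voting, and the data is invented for illustration.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify a single point x by majority vote among its k nearest neighbours."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to every training point
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

X_train = np.array([[0, 0], [0, 1], [5, 5], [6, 5]], dtype=float)
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.2, 0.5])))  # -> 0
print(knn_predict(X_train, y_train, np.array([5.5, 5.0])))  # -> 1
```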

### [Lecture06 (03/06) : Linear Algebra Basics 03 (Transformations)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture06_Spectral_Theorem_Transformation.pdf?flush_cache=true)
Notes on linear transformations.

### [Lecture06 (03/06) : Linear Algebra Basics 04 (Eigenvalues/Eigenvectors)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture06_Spectral_Theorem_Eigenvalue.pdf?flush_cache=true)
Notes on eigenvalues and eigenvectors; a small SVD-based LSA sketch follows the links below.
- [LSA From The Scratch Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture06_Sub_LSA.ipynb?flush_cache=true)
- [LSA From The Scratch Python Code 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture06_Sub_LSA_2.ipynb?flush_cache=true)
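A compact illustration of the LSA idea, assuming a tiny hand-made term-document count matrix rather than the notebooks' corpus: truncate the SVD and read off low-dimensional document vectors.

```python
import numpy as np

# Toy term-document count matrix (rows = terms, columns = documents).
A = np.array([
    [2, 1, 0, 0],   # "data"
    [1, 2, 0, 0],   # "model"
    [0, 0, 3, 1],   # "market"
    [0, 0, 1, 2],   # "price"
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                        # keep the top-k latent "topics"
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T    # documents in the latent space
print(np.round(doc_vectors, 2))              # documents on the same topic end up close together
```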

### [Lecture07 (03/09) : Pandas 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture07_Pandas_1.ipynb?flush_cache=true)
Pandas basics for data analysis, part 1.

### [Lecture08 (03/13) : Pandas 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture07_Pandas_2.ipynb?flush_cache=true)
Pandas basics for data analysis, part 2; a short pandas/Matplotlib sketch follows the links below.
- [Data Analysis Example](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture07_Sub_Pandas_Analysis_Examples.ipynb?flush_cache=true)
- [Pandas Problem Set](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture07_Sub_Pandas_Problem01.ipynb?flush_cache=true)
- [Matplotlib_01](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture08_Matplotlib_1.ipynb?flush_cache=true) : Matplotlib (a data visualization tool), part 1.
- [Matplotlib_02](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture08_Matplotlib_2.ipynb?flush_cache=true) : Matplotlib (a data visualization tool), part 2.
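A minimal pandas-plus-Matplotlib sketch (the DataFrame is made up; it is not the notebooks' dataset) showing selection, grouping, and a quick plot:

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.DataFrame({
    "city":  ["Seoul", "Seoul", "Busan", "Busan"],
    "year":  [2017, 2018, 2017, 2018],
    "sales": [120, 150, 80, 95],
})

print(df[df["year"] == 2018])              # row filtering
print(df.groupby("city")["sales"].mean())  # group-by aggregation

df.groupby("year")["sales"].sum().plot(kind="bar")  # quick visualization
plt.ylabel("sales")
plt.show()
```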

### [Lecture09 (03/16) : Basic Calculus](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture09_Gradient.pdf)
Lecture notes on the basic calculus used to compute gradients; a numerical-gradient sketch follows the links below.
- [Gradient Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture09_Sub_Gradient_01.ipynb)
- [Gradient Python Code 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture09_Sub_Gradient_02.ipynb)
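A small sketch of the central idea, assuming a simple quadratic objective chosen only for illustration: estimate the gradient numerically with central differences and run plain gradient descent.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-5):
    """Central-difference estimate of the gradient of f at x."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2 * h)
    return grad

f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2   # minimum at (3, -1)

x = np.zeros(2)
for _ in range(200):                   # plain gradient descent with a fixed step size
    x -= 0.1 * numerical_gradient(f, x)
print(np.round(x, 3))                  # ~[ 3. -1.]
```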

### [Lecture10 (03/20) : EDA](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_EDA.ipynb)
- Lecture notes on exploratory data analysis (EDA).

### [Lecture10 (03/20) : Statistics in Data Science 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Statistics_In_Data_Science_1.ipynb)
- Lecture notes on the statistics needed for data science, part 1.

### [Lecture10 (03/23) : Statistics in Data Science 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Statistics_In_Data_Science_2.ipynb)
- Reading material
- [Estimation Theory](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Estimation_Theory.pdf?flush_cache=true) : Mathematical statistics lecture notes on estimation theory.
- [Sampling Distribution Theory](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Sample_Distribution.pdf?flush_cache=true) : Mathematical statistics lecture notes on sampling distributions.
- [Hypothesis Testing Theory 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Hypothesis_Testing_01.pdf?flush_cache=true) : Mathematical statistics lecture notes on hypothesis testing, part 1.
- [Hypothesis Testing Theory 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Hypothesis_Testing_02.pdf?flush_cache=true) : Mathematical statistics lecture notes on hypothesis testing, part 2.
- [ANOVA](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Anova.pdf?flush_cache=true) : Mathematical statistics lecture notes on analysis of variance (ANOVA).
- [Correlation Analysis](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture10_Correlation.pdf?flush_cache=true) : Mathematical statistics lecture notes on correlation analysis.
- Python code
- [Hypothesis python code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture06_Hypothesis_Test.ipynb?flush_cache=true) : a short t-test sketch also follows below.
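Not the notebook itself, just a minimal two-sample t-test with `scipy.stats` on synthetic data (the group means and sizes are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=50.0, scale=5.0, size=30)   # e.g. scores under condition A
group_b = rng.normal(loc=53.0, scale=5.0, size=30)   # e.g. scores under condition B

# Two-sample t-test: H0 says the two groups share the same mean.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(t_stat, p_value)
print("reject H0 at the 5% level" if p_value < 0.05 else "fail to reject H0")
```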

### [Lecture11 (03/27) : Machine Learning Basics](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture11_Basic_Concept_of_Machine_Learning.pdf)
- Lecture notes on the basic concepts of machine learning.
- Reading material
- Machine Learning (Tom Mitchell)
- [The Discipline of Machine Learning](http://www.cs.cmu.edu/~tom/pubs/MachineLearning.pdf)

### [Lecture12 (03/30) : Bayesian Decision Theory](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture12_Bayesian_Decision_Thoery.pdf)
Lecture notes on Bayesian decision theory; a Gaussian naive Bayes sketch follows the list below.
- [Lecture12 sub notes: MLE/MAP](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture12_MLE_MAP.pdf) : Lecture notes on MLE and MAP estimation.
- [Naive Bayesian From The Scratch](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture11_Sentiment_Classifier_Using_Naive_Bayes_From_The_Scratch.ipynb) : Sentiment analysis with naive Bayes, part 1.
- [Naive Bayesian Sklearn Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture11_Sentiment_Classifier_Using_Naive_Bayes_With_SKlearn.ipynb) : Sentiment analysis with naive Bayes, part 2.
- [GNB Sklearn Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture11_Gaussian_Naive_Bayes.ipynb) : An example of classifying the iris data with Gaussian naive Bayes (GNB).
- Reading material
- [Stanford CS229 : Generative Learning algorithms](http://cs229.stanford.edu/notes/cs229-notes2.pdf)
- Pattern Recognition and Machine Learning (Bishop): 1.5 Decision Theory (Korean translation available)
- Pattern Classification (Duda): 2. Bayesian Decision Theory (Korean translation available)
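In the spirit of the GNB notebook above, a minimal Gaussian naive Bayes run on the iris data with scikit-learn (the train/test split and its parameters are arbitrary choices here):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = GaussianNB().fit(X_train, y_train)   # fits one Gaussian per class and feature
print(accuracy_score(y_test, clf.predict(X_test)))
print(clf.predict_proba(X_test[:3]))       # posterior class probabilities
```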

### [Lecture13 (04/03, 04/06) : Linear Regression](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture13_Linear_Regression.pdf?flush_cache=true)
Lecture notes on linear regression; a short fitting sketch follows the links below.
- [Linear Regression Statistical models](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture13_Linear_Regression_Stat_Model.ipynb)
- [Linear Regression ML models](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture13_Linear_Regression_ML_Model.ipynb)
- Reading material
- [A Few Useful Things to Know about Machine Learning](https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf) : A very useful paper on machine learning.
- [Stanford CS229 : Linear Regression](http://cs229.stanford.edu/notes/cs229-notes1.pdf)
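A compact sketch that mirrors the statistical-model/ML-model split above, fitting the same synthetic line with statsmodels OLS and scikit-learn (the data-generating coefficients are made up):

```python
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.5 * X[:, 0] + 1.0 + rng.normal(scale=1.0, size=100)   # y = 2.5x + 1 + noise

# Statistical view: OLS with an explicit intercept term.
ols = sm.OLS(y, sm.add_constant(X)).fit()
print(ols.params)                 # ~[1.0, 2.5]

# ML view: the same fit via scikit-learn.
lr = LinearRegression().fit(X, y)
print(lr.intercept_, lr.coef_)
```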

### [Lecture14 (04/10) : Classification 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture14_Binary_Classification_MNIST.ipynb)
Lecture notes on binary classification. (Hands-On Machine Learning)

### [Lecture14 (04/10) : Classification 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture14_Multiclass_Classification_MNIST.ipynb)
Lecture notes on multiclass classification. (Hands-On Machine Learning)

### [Lecture15 (04/13, 04/24) : Logistic Regression](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture15_Logistic_Regression.pdf)
Lecture notes on logistic regression; a short scikit-learn sketch follows the links below.
- [Logistic Regression Statsmodels Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture15_Logistic_Regression_Stat.ipynb)
- [Logistic Regression Sklearn Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture15_Logistic_Regression_ML.ipynb)
- Reading material
- [Stanford CS229 : Logistic Regression](http://cs229.stanford.edu/notes/cs229-notes1.pdf)
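A minimal scikit-learn sketch, assuming the breast-cancer dataset and default-ish hyperparameters purely for illustration (not necessarily what the notebooks use):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Scaling helps the solver converge; C controls the regularization strength.
model = make_pipeline(StandardScaler(), LogisticRegression(C=1.0, max_iter=1000))
model.fit(X_train, y_train)
print(model.score(X_test, y_test))        # accuracy
print(model.predict_proba(X_test[:3]))    # P(y=0), P(y=1) for the first three test samples
```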

### [Lecture16 (04/27) : Information Theory and Decision Trees](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture16_Decision_Tree_Information_Theory.pdf)
Lecture notes on decision trees; an entropy/information-gain sketch follows the links below.
- [Decision Tree Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture16_Decision_Tree_Python_1.ipynb)
- [Decision Tree Python Code 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture16_Decision_Tree_Python_2.ipynb)
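The quantity a decision tree optimizes at each split, sketched in plain NumPy (the labels and feature values below are invented to show a good split versus a poor one):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, mask):
    """Entropy reduction from splitting labels with a boolean mask."""
    n = len(labels)
    left, right = labels[mask], labels[~mask]
    children = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - children

y = np.array([1, 1, 1, 0, 0, 0, 1, 0])
feature = np.array([5, 6, 7, 1, 2, 3, 8, 2])
print(information_gain(y, feature > 4))   # perfect split -> gain of 1.0 bit
print(information_gain(y, feature > 7))   # poor split -> small gain
```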

### [Lecture17 (05/04) : Ensemble 1 (Bagging, Random forest)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture17_Ensemble_Models_1.pdf)
Lecture notes on bagging and random forests among the ensemble methods; a short comparison sketch follows the links below.
- [Ensemble - Bagging, Random Forest Python Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture17_Ensemble_1.ipynb)
- [Creating simple Random Forest](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture17_Creating_Random_Forest.ipynb)
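A rough comparison of bagged trees and a random forest with scikit-learn; the dataset, estimator counts, and cross-validation setup are arbitrary illustration choices.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Bagging: many trees on bootstrap samples, combined by vote.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0)
# Random forest: bagging plus a random feature subset considered at each split.
forest = RandomForestClassifier(n_estimators=100, random_state=0)

for name, model in [("bagging", bagging), ("random forest", forest)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(name, round(scores.mean(), 3))
```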

### [Lecture17 (05/08) : Ensemble 2 (Boosting)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture17_Ensemble_Models_2.pdf)
Lecture notes on boosting and stacking among the ensemble methods.
- [Ensemble - Boosting Python Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture17_Ensemble_2.ipynb)
- [Ensemble - Stacking Python Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture17_Stacking.ipynb)
- Reading material
- [Stanford lecture note](https://web.stanford.edu/~hastie/TALKS/boost.pdf)
- [Northeastern Univ. lecture note](http://www.ccs.neu.edu/home/vip/teach/MLcourse/4_boosting/slides/gradient_boosting.pdf)
- [Stanford CS229 : Ensemble](http://cs229.stanford.edu/notes/cs229-notes-ensemble.pdf)

### [Lecture18 (05/15) : Support Vector Machine (SVM)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture18_SVM.pdf)
Lecture notes on SVMs; a short scikit-learn sketch follows the links below.
- [SVM Python Code](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture18_SVM.ipynb) : SVM Python code for the lecture.
- Reading material
- [Stanford CS229 : SVM](http://cs229.stanford.edu/notes/cs229-notes3.pdf) : Reference material on SVMs.
- [Stanford lecture note for SMO](http://cs229.stanford.edu/materials/smo.pdf) : Reference material on SMO.
- [Convex Optimization Lecture Note](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture18_Convex_optimization.pdf) : A summary of convex optimization.
- [Convex Optimization Book](https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf)
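A minimal RBF-kernel SVM with scikit-learn, assuming the iris data and default-ish `C`/`gamma` values chosen only for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# C trades margin width against training errors; gamma sets how far one sample's influence reaches.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print(model.score(X_test, y_test))   # test accuracy
```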

### [Lecture19 (05/22) : Principal Component Analysis (PCA)](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture19_PCA.pdf)
Lecture notes on PCA; a short eigendecomposition sketch follows the links below.
- [PCA Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture19_PCA_By_Hands.ipynb) : PCA from scratch, implemented by hand.
- [PCA Python Code 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture19_PCA.ipynb) : Visualizing PCA with sklearn and choosing an appropriate number of principal components.
- [PCA Python Code 3](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture19_Dimesional_Reduction_With_PCA.ipynb) : Solving Kaggle's Benz problem with the help of PCA.
- Reading materials
- [Stanford CS229 : PCA](http://cs229.stanford.edu/notes/cs229-notes10.pdf)
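A small sketch in the spirit of the by-hand notebook: PCA via an eigendecomposition of the covariance matrix, checked against scikit-learn (the iris data is just a convenient stand-in here).

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
Xc = X - X.mean(axis=0)                      # centre the data

# "By hand": eigendecomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
print(np.round(eigvals[order] / eigvals.sum(), 3))   # explained variance ratio
Z = Xc @ eigvecs[:, order[:2]]               # project onto the top two components
print(Z.shape)

# Same quantities via scikit-learn.
pca = PCA(n_components=2).fit(X)
print(np.round(pca.explained_variance_ratio_, 3))
```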

### [Lecture20 (06/01) : K-means clustering](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture20_K_means_clustering.pdf)
Lecture notes on the K-means algorithm; an elbow-method sketch follows the links below.
- [K-means Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture20_K_Means_Clustering.ipynb) : K-means clustering and how to choose an appropriate number of clusters.
- [K-means Python Code 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture21_Clustering.ipynb) : K-means Python code.
- Reading materials
- [Stanford CS229 : K-means](http://cs229.stanford.edu/notes/cs229-notes7a.pdf)
- [Stanford CS229 : Gaussian Mixture Model](http://cs229.stanford.edu/notes/cs229-notes7b.pdf)
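A quick elbow-method sketch with scikit-learn's KMeans on synthetic blobs (the data and the range of k are arbitrary illustration choices):

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans

X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

# Elbow method: within-cluster sum of squares (inertia) versus the number of clusters.
for k in range(1, 8):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(k, round(km.inertia_, 1))
# The curve flattens sharply after k=4, the true number of blobs.
```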

### Lecture21 (06/05) : Clustering
- [Segmentations 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture20_Customer_Segmentation_Easy_version.ipynb) : A simple exercise in user segmentation with the UCI e-commerce data.
- [Segmentations 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture20_Customer_Segmentation_Full_Version.ipynb) : A more involved user segmentation exercise with the UCI e-commerce data.

### [Lecture22 (06/08) : Deep Neural Network Basics + Keras](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture22_DNN.pdf)
Lecture notes on perceptrons and deep neural networks; a minimal Keras sketch follows the links below.
- [DNN Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture22_DNN.ipynb) : Solving basic classification and regression problems with a DNN in Keras.
- [DNN and Sampling Python Code 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture22_CardFraudDetection.ipynb) : Tackling Kaggle's card fraud detection problem, including the class-imbalance issue.
- Reading material
- [Stanford CS229 : Perceptron](http://cs229.stanford.edu/notes/cs229-notes-deep_learning.pdf)
- [Stanford CS229 : Backpropagation](http://cs229.stanford.edu/notes/cs229-notes-backprop.pdf)
- [Google Dev Demo](https://google-developers.appspot.com/machine-learning/crash-course/backprop-scroll/)
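A minimal Keras binary classifier, assuming TensorFlow's bundled Keras is installed; the synthetic data and the layer sizes stand in for whatever the notebooks actually use.

```python
import numpy as np
from tensorflow import keras

# Synthetic binary-classification data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),   # one unit + sigmoid for a binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))             # [loss, accuracy]
```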

### [Lecture23 (06/12) : Feature selection](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture23_Feature_Selection.pdf)
Lecture notes on feature selection; a short scikit-learn sketch follows the link below.
- [Feature selection Python Code 1](https://scikit-learn.org/stable/modules/feature_selection.html)
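A minimal univariate feature-selection sketch with scikit-learn's `SelectKBest` (the dataset and `k` are arbitrary choices for illustration):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_breast_cancer(return_X_y=True)

# Keep the 5 features with the highest ANOVA F-score against the target.
selector = SelectKBest(score_func=f_classif, k=5).fit(X, y)
X_reduced = selector.transform(X)
print(X.shape, "->", X_reduced.shape)
print(selector.get_support(indices=True))   # indices of the selected features
```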

### Lecture24 (06/15) : Time Series Analysis
- [ARIMA 1](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture24_Time_Series_01.ipynb)
- [ARIMA 2](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture24_Time_Series_02.ipynb)
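Not taken from the notebooks above, just a small ARIMA sketch with statsmodels on a synthetic monthly series; the (1, 1, 1) order and the data are illustration-only assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly series: linear trend plus noise.
rng = np.random.default_rng(0)
index = pd.date_range("2015-01-01", periods=120, freq="MS")
y = pd.Series(np.linspace(10, 50, 120) + rng.normal(scale=2.0, size=120), index=index)

# ARIMA(p, d, q): AR order 1, first differencing, MA order 1.
model = ARIMA(y, order=(1, 1, 1)).fit()
print(model.params)             # estimated coefficients
print(model.forecast(steps=6))  # forecast six months ahead
```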

### [Lecture25 : Association Rule Mining](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture25_Association_Rule_Mining.pdf)
Lecture notes on association rules.

### [Lecture26 : Topic modeling](https://nbviewer.jupyter.org/github/jeonghunyoon/machine-learning-lecture-notes/blob/master/Lecture26_Topic_models.pdf)
Lecture notes on topic modeling.