Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ChanChiChoi/awesome-automl
collecting related resources of automated machine learning here
List: awesome-automl
- Host: GitHub
- URL: https://github.com/ChanChiChoi/awesome-automl
- Owner: ChanChiChoi
- Created: 2018-05-14T03:51:37.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2022-06-08T08:40:28.000Z (over 2 years ago)
- Last Synced: 2024-10-23T13:32:44.443Z (12 days ago)
- Topics: autodl, automated-machine-learning, automl
- Homepage:
- Size: 132 KB
- Stars: 56
- Watchers: 10
- Forks: 16
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-automated-machine-learning - ChanChiChoi/awesome-automl
- ultimate-awesome - awesome-automl - Collecting related resources of automated machine learning here. (Other Lists / PowerShell Lists)
README
# awesome-automl
[Papers](https://github.com/ChanChiChoi/awesome-automl#papers)
[blogs & articles & book](https://github.com/ChanChiChoi/awesome-automl#blogs--articles--book)
[Libraries](https://github.com/ChanChiChoi/awesome-automl#libraries)
[Projects](https://github.com/ChanChiChoi/awesome-automl#projects)
[benchmark](https://github.com/ChanChiChoi/awesome-automl#benchmark)

Collecting related resources of automated machine learning here. Some links were drawn from the lists below; keywords: "automl, autodl, automated machine learning, hyperparameter optimization, neural architecture search"
- [ ] [hibayesian/awesome-automl-papers](https://github.com/hibayesian/awesome-automl-papers)
- [x] [literature-on-neural-architecture-search](https://www.ml4aad.org/automl/literature-on-neural-architecture-search/)
- [x] [Algorithm Configuration Literature](http://aclib.net/acbib/)
- [ ] [windmaple/awesome-AutoML](https://github.com/windmaple/awesome-AutoML)
- [ ] [DataSystemsGroupUT/AutoML_Survey](https://github.com/DataSystemsGroupUT/AutoML_Survey)
- [ ] [D-X-Y/Awesome-NAS](https://github.com/D-X-Y/Awesome-NAS)
- [ ] [D-X-Y/AutoDL-Projects](https://github.com/D-X-Y/AutoDL-Projects)
- [ ] [HuaizhengZhang/Awesome-System-for-Machine-Learning](https://github.com/HuaizhengZhang/Awesome-System-for-Machine-Learning)
- [ ] [dragen1860/awesome-AutoML](https://github.com/dragen1860/awesome-AutoML)
- [ ] [lihanghang/Deep-learning-And-Paper](https://github.com/lihanghang/Deep-learning-And-Paper)
- [ ] [anonymone/Neural-Architecture-Search](https://github.com/anonymone/Neural-Architecture-Search)
- [ ] [ssheikholeslami/automl-resources](https://github.com/ssheikholeslami/automl-resources)
- [ ] [guan-yuan/awesome-AutoML-and-Lightweight-Models](https://github.com/guan-yuan/awesome-AutoML-and-Lightweight-Models)
- [ ] [JaimeTang/AutoML](https://github.com/JaimeTang/AutoML)
- [ ] [Yipeng-Sun/AutoML-NAS-papers](https://github.com/Yipeng-Sun/AutoML-NAS-papers)
- [ ] [Yejining/Survey](https://github.com/Yejining/Survey)
- [ ] [lidderupk/automl-intro](https://github.com/lidderupk/automl-intro)
- [ ] [oskar-j/awesome-auto-ml](https://github.com/oskar-j/awesome-auto-ml)
- [ ] [BruceQFWang/Meta-learning-Paper-List](https://github.com/BruceQFWang/Meta-learning-Paper-List)
- [ ] [YIWEI-CHEN/awesome-automated-machine-learning](https://github.com/YIWEI-CHEN/awesome-automated-machine-learning)
- [ ] [pbiecek/automl_resources](https://github.com/pbiecek/automl_resources)
- [ ] [jphall663/automl_resources](https://github.com/jphall663/automl_resources)
- [ ] [LevineHuang/AutoML-Tutorials](https://github.com/LevineHuang/AutoML-Tutorials)
- [ ] [eug/ai-resources](https://github.com/eug/ai-resources)
- [ ] [alvesmarcos/research-thesis](https://github.com/alvesmarcos/research-thesis)

---
keyword: "meta learning"
- [ ] [floodsung/Meta-Learning-Papers](https://github.com/floodsung/Meta-Learning-Papers)
- [ ] [sudharsan13296/Awesome-Meta-Learning](https://github.com/sudharsan13296/Awesome-Meta-Learning)
- [ ] [dragen1860/awesome-meta-learning](https://github.com/dragen1860/awesome-meta-learning)
- [ ] [oneHuster/Meta-Learning-Papers](https://github.com/oneHuster/Meta-Learning-Papers)
- [ ] [csyanbin/Few-shot-Meta-learning-papers](https://github.com/csyanbin/Few-shot-Meta-learning-papers)
- [ ] [ha-lins/MetaLearning4NLP-Papers](https://github.com/ha-lins/MetaLearning4NLP-Papers)
- [ ] [johnnyasd12/awesome-few-shot-meta-learning](https://github.com/johnnyasd12/awesome-few-shot-meta-learning)
- [ ] [jarvisWang0903/Meta-Learning-PaperReading](https://github.com/jarvisWang0903/Meta-Learning-PaperReading)
- [ ] [Deepest-Project/meta-learning-study](https://github.com/Deepest-Project/meta-learning-study)
- [ ] [metarl/awesome-metarl](https://github.com/metarl/awesome-metarl)
- [ ] [BruceQFWang/Meta-learning-Paper-List](https://github.com/BruceQFWang/Meta-learning-Paper-List)
- [ ] [rootlu/MetaLearningPapers](https://github.com/rootlu/MetaLearningPapers)
- [ ] [Meta-Learning/Awesome-Meta-Learning](https://github.com/Meta-Learning/Awesome-Meta-Learning)
- [ ] [anthonysicilia/MetaLearningPapers](https://github.com/anthonysicilia/MetaLearningPapers)
- [ ] [dragen1860/Meta-Learning-Papers](https://github.com/dragen1860/Meta-Learning-Papers)
- [ ] [sorrowyn/awesome-metarl-2](https://github.com/sorrowyn/awesome-metarl-2)
- [ ] [Alro10/meta-learning-resources](https://github.com/Alro10/meta-learning-resources)
- [ ] [clxiao/Meta-Learning-Papers](https://github.com/clxiao/Meta-Learning-Papers)
- [ ] [adityagupte95/Meta-Learning-Papers](https://github.com/adityagupte95/Meta-Learning-Papers)

You can take part in the [AutoML Challenge](http://automl.chalearn.org/) or the [AutoDL Challenge](https://autodl.chalearn.org/),
find competitions on [Kaggle](https://www.kaggle.com),
search [reddit](https://www.reddit.com/search?q=automl), [bing](https://cn.bing.com/search?q=automated+machine+learning&FORM=BESBTB&ensearch=1), or [quora](https://www.quora.com/search?q=automl) (useful keywords include "automatic machine learning", "automl", "meta learning", and "automated machine learning"),
visit the [automl](http://www.ml4aad.org) website,
or search for your keywords in the [arxiv papers info](https://github.com/ChanChiChoi/tiny-crawler/tree/master/paperMeta4arxiv) crawler to find more resources.

---
The papers, books, and slides below are ordered by year, and each entry is prefixed with the theme(s) it belongs to. To browse a single theme, e.g. "Architecture Search", use **_Ctrl+F_** and step through the matching entries.
Themes are as follows:
- 1.【Architecture Search】:
【Random Search】; 【Evolutionary Algorithms】;【Transfer Learning】;【Reinforcement Learning】;【Local Search】;
- 2.【Hyperparameter Optimization】:
【Bayesian Optimization】;【Meta Learning】;【Particle Swarm Optimization】;【Lipschitz Functions】;【Random Search】;【Transfer Learning】;【Local Search】;
- 3.【Multi-Objective NAS】;
- 4.【Automated Feature Engineering】;【Reinforcement Learning】;【Meta Learning】;
- 5.【Frameworks】;
- 6.【Meta Learning】;
- 7.【Miscellaneous】

P.S. The theme labels are a bit confusing and will be revised later.
---
## Papers

#### 1990
- 【Architecture Search】Fahlman, Scott E and Lebiere, Christian. [The cascade correlation learning architecture](http://papers.nips.cc/paper/207-the-cascade-correlation-learning-architecture.pdf). In NIPS, pp. 524–532, 1990.

#### 2002
- 【Architecture Search】【Evolutionary Algorithms】Stanley K O, Miikkulainen R. [Evolving neural networks through augmenting topologies](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.28.5457&rep=rep1&type=pdf)[J]. Evolutionary computation, 2002, 10(2): 99-127.

#### 2008
- 【Tutorials】【Meta Learning】[Metalearning - A Tutorial](https://pdfs.semanticscholar.org/5794/1a4891f673cadf06fba02419372aad85c3bb.pdf)
- 【Hyperparameter Optimization】【Particle Swarm Optimization】Lin S W, Ying K C, Chen S C, et al. [Particle swarm optimization for parameter determination and feature selection of support vector machines](http://www.sciencedirect.com/science/article/pii/S0957417407003752)[J]. Expert systems with applications, 2008, 35(4): 1817-1824.
- 【Hyperparameter Optimization】【Meta Learning】Smith-Miles K A. [Cross-disciplinary perspectives on meta-learning for algorithm selection](https://dl.acm.org/citation.cfm?id=1456656)[J]. ACM Computing Surveys (CSUR), 2009, 41(1): 6.
- 【Architecture Search】【Evolutionary Algorithms】Floreano, Dario, Dürr, Peter, and Mattiussi, Claudio. [Neuroevolution: from architectures to learning](https://infoscience.epfl.ch/record/112676/files/FloreanoDuerrMattiussi2008.pdf). Evolutionary Intelligence, 1(1):47–62, 2008.

#### 2009
- 【Hyperparameter Optimization】【Local Search】Hutter F, Hoos H H, Leyton-Brown K, et al. [ParamILS: an automatic algorithm configuration framework](https://arxiv.org/pdf/1401.3492.pdf)[J]. Journal of Artificial Intelligence Research, 2009, 36: 267-306.
- 【Architecture Search】【Evolutionary Algorithms】Stanley, Kenneth O, D'Ambrosio, David B, and Gauci, Jason. [A hypercube-based encoding for evolving large-scale neural networks](http://axon.cs.byu.edu/Dan/778/papers/NeuroEvolution/stanley3**.pdf). Artificial life, 15(2):185–212, 2009.

#### 2010
- 【Bayesian Optimization】Brochu E, Cora V M, De Freitas N. [A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning](https://arxiv.org/pdf/1012.2599v1.pdf)[J]. arXiv preprint arXiv:1012.2599, 2010.
- 【Automated Feature Engineering】【Reinforcement Learning】Gaudel R, Sebag M. [Feature selection as a one-player game](https://hal.inria.fr/docs/00/48/40/49/PDF/fuse_icml10.pdf)[C]//International Conference on Machine Learning. 2010: 359-366.

#### 2011
- 【Hyperparameter Optimization】【Random Search】Bergstra J S, Bardenet R, Bengio Y, et al. [Algorithms for hyper-parameter optimization](https://dl.acm.org/citation.cfm?id=2986743)[C]//Advances in neural information processing systems. 2011: 2546-2554.
- 【Hyperparameter Optimization】【Bayesian Optimization】Hutter F, Hoos H H, Leyton-Brown K. [Sequential model-based optimization for general algorithm configuration](https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf)[C]//International Conference on Learning and Intelligent Optimization. Springer, Berlin, Heidelberg, 2011: 507-523.

#### 2012
- 【Hyperparameter Optimization】【Random Search】Bergstra J, Bengio Y. [Random search for hyper-parameter optimization](http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf)[J]. Journal of Machine Learning Research, 2012, 13(Feb): 281-305.
- 【Hyperparameter Optimization】【Bayesian Optimization】Snoek J, Larochelle H, Adams R P. [Practical bayesian optimization of machine learning algorithms](https://papers.nips.cc/paper/4522-practical-bayesian-optimization-of-machine-learning-algorithms.pdf)[C]//Advances in neural information processing systems. 2012: 2951-2959.

#### 2013
- 【Hyperparameter Optimization】【Transfer Learning】Bardenet R, Brendel M, Kégl B, et al. [Collaborative hyperparameter tuning](http://proceedings.mlr.press/v28/bardenet13.pdf)[C]//International Conference on Machine Learning. 2013: 199-207.
- 【Hyperparameter Optimization】【Bayesian Optimization】Bergstra J, Yamins D, Cox D D. [Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures](http://proceedings.mlr.press/v28/bergstra13.pdf)[J]. 2013.
- 【Hyperparameter Optimization】【Bayesian Optimization】Thornton C, Hutter F, Hoos H H, et al. [Auto-WEKA: Combined selection and hyperparameter optimization of classification algorithms](http://www.cs.ubc.ca/labs/beta/Projects/autoweka/papers/autoweka.pdf)[C]//Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2013: 847-855.
- 【Hyperparameter Optimization】James Bergstra, David D. Cox. [Hyperparameter Optimization and Boosting for Classifying Facial Expressions: How good can a "Null" Model be?](https://arxiv.org/abs/1306.3476)[J]. arXiv preprint arXiv:1306.3476, 2013.

#### 2014
- 【Hyperparameter Optimization】【Transfer Learning】Yogatama D, Mann G. [Efficient transfer learning method for automatic hyperparameter tuning](https://pdfs.semanticscholar.org/75f2/6734972ebaffc6b43d45abd3048ef75f15a5.pdf)[C]//Artificial Intelligence and Statistics. 2014: 1077-1085.

#### 2015
- 【Hyperparameter Optimization】Dougal Maclaurin, David Duvenaud, Ryan P. Adams. [Gradient-based Hyperparameter Optimization through Reversible Learning](https://arxiv.org/abs/1502.03492)[J]. arXiv preprint arXiv:1502.03492, 2015.
- 【Hyperparameter Optimization】Kevin Jamieson, Ameet Talwalkar. [Non-stochastic Best Arm Identification and Hyperparameter Optimization](https://arxiv.org/abs/1502.07943)[J]. arXiv preprint arXiv:1502.07943, 2015.
- 【Architecture Search】【Evolutionary Algorithms】Young S R, Rose D C, Karnowski T P, et al. [Optimizing deep learning hyper-parameters through an evolutionary algorithm](https://www.researchgate.net/profile/Steven_Young11/publication/301463804_Optimizing_deep_learning_hyper-parameters_through_an_evolutionary_algorithm/links/57ac9b7c08ae3765c3bac448/Optimizing-deep-learning-hyper-parameters-through-an-evolutionary-algorithm.pdf)[C]//Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments. ACM, 2015: 4.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. [Sequential model-free hyperparameter tuning](http://ieeexplore.ieee.org/abstract/document/7373431/)[C]//Data Mining (ICDM), 2015 IEEE International Conference on. IEEE, 2015: 1033-1038.
- 【Hyperparameter Optimization】【Bayesian Optimization】Snoek J, Rippel O, Swersky K, et al. [Scalable bayesian optimization using deep neural networks](https://dl.acm.org/citation.cfm?id=3045349)[C]//International conference on machine learning. 2015: 2171-2180.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. [Learning hyperparameter optimization initializations](http://ieeexplore.ieee.org/abstract/document/7344817/)[C]//Data Science and Advanced Analytics (DSAA), 2015 IEEE International Conference on. IEEE, 2015: 1-10.
- 【Hyperparameter Optimization】【Bayesian Optimization】Schilling N, Wistuba M, Drumond L, et al. [Joint model choice and hyperparameter optimization with factorized multilayer perceptrons](http://ieeexplore.ieee.org/abstract/document/7372120/)[C]//Tools with Artificial Intelligence (ICTAI), 2015 IEEE 27th International Conference on. IEEE, 2015: 72-79.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. [Hyperparameter search space pruning–a new component for sequential model-based hyperparameter optimization](https://dl.acm.org/citation.cfm?id=2991491)[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, Cham, 2015: 104-119.
- 【Hyperparameter Optimization】【Bayesian Optimization】Schilling N, Wistuba M, Drumond L, et al. [Hyperparameter optimization with factorized multilayer perceptrons](https://link.springer.com/chapter/10.1007/978-3-319-23525-7_6)[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, Cham, 2015: 87-103.
- 【Hyperparameter Optimization】【Bayesian Optimization】【more efficient】Feurer M, Klein A, Eggensperger K, et al. [Efficient and robust automated machine learning](https://papers.nips.cc/paper/5872-efficient-and-robust-automated-machine-learning.pdf)[C]//Advances in Neural Information Processing Systems. 2015: 2962-2970.
- 【Frameworks】Thakur A, Krohn-Grimberghe A.[AutoCompete: A Framework for Machine Learning Competition](https://arxiv.org/pdf/1507.02188.pdf)[J]. arXiv preprint arXiv:1507.02188, 2015.
- 【Automated Feature Engineering】【Expand Reduce】Kanter J M, Veeramachaneni K. [Deep feature synthesis: Towards automating data science endeavors](http://www.jmaxkanter.com/static/papers/DSAA_DSM_2015.pdf)[C]//Data Science and Advanced Analytics (DSAA), 2015 IEEE International Conference on. IEEE, 2015: 1-10.

#### 2016
- 【Architecture Search】Mendoza H, Klein A, Feurer M, et al. [Towards automatically-tuned neural networks](http://proceedings.mlr.press/v64/mendoza_towards_2016.html)[C]//Workshop on Automatic Machine Learning. 2016: 58-65.
- 【Hyperparameter Optimization】Fabian Pedregosa. [Hyperparameter optimization with approximate gradient](https://arxiv.org/abs/1602.02355)[J]. arXiv preprint arXiv:1602.02355, 2016.
- 【Hyperparameter Optimization】【Random Search】Li L, Jamieson K, DeSalvo G, et al. [Hyperband: A novel bandit-based approach to hyperparameter optimization](https://arxiv.org/abs/1603.06560)[J]. arXiv preprint arXiv:1603.06560, 2016.
- 【Architecture Search】Loshchilov I, Hutter F. [CMA-ES for hyperparameter optimization of deep neural networks](https://arxiv.org/abs/1604.07269)[J]. arXiv preprint arXiv:1604.07269, 2016.
- 【Hyperparameter Optimization】Julien-Charles Lévesque, Christian Gagné, Robert Sabourin. [Bayesian Hyperparameter Optimization for Ensemble Learning](https://arxiv.org/abs/1605.06394)[J]. arXiv preprint arXiv:1605.06394, 2016.
- 【Make it more efficient】Klein A, Falkner S, Bartels S, et al. [Fast bayesian optimization of machine learning hyperparameters on large datasets](http://proceedings.mlr.press/v54/klein17a.html)[J]. arXiv preprint arXiv:1605.07079, 2016.
- 【Architecture Search】【Meta Learning】Li K, Malik J. [Learning to optimize](https://arxiv.org/pdf/1606.01885.pdf)[J]. arXiv preprint arXiv:1606.01885, 2016.
- 【Architecture Search】Saxena S, Verbeek J. [Convolutional neural fabrics](https://arxiv.org/abs/1606.02492)[C]//Advances in Neural Information Processing Systems. 2016: 4053-4061.
- 【Architecture Search】【Reinforcement Learning】Cortes, Corinna, Gonzalvo, Xavi, Kuznetsov, Vitaly, Mohri, Mehryar, and Yang, Scott. [Adanet: Adaptive structural learning of artificial neural networks](https://arxiv.org/abs/1607.01097). arXiv preprint arXiv:1607.01097, 2016.
- 【Hyperparameter Optimization】Ilija Ilievski, Taimoor Akhtar, Jiashi Feng, Christine Annette Shoemaker. [Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates](https://arxiv.org/abs/1607.08316)[J]. arXiv preprint arXiv:1607.08316, 2016.
- 【Hyperparameter Optimization】【Transfer Learning】Ilija Ilievski, Jiashi Feng. [Hyperparameter Transfer Learning through Surrogate Alignment for Efficient Deep Neural Network Training](https://arxiv.org/abs/1608.00218). arXiv preprint arXiv:1608.00218, 2016.
- 【Architecture Search】【Reinforcement Learning】Zoph B, Le Q V. [Neural architecture search with reinforcement learning](https://arxiv.org/abs/1611.01578)[J]. arXiv preprint arXiv:1611.01578, 2016.
- 【Architecture Search】Baker B, Gupta O, Naik N, et al. [Designing neural network architectures using reinforcement learning](https://arxiv.org/abs/1611.02167)[J]. arXiv preprint arXiv:1611.02167, 2016.
- 【Make it more efficient】Klein A, Falkner S, Springenberg J T, et al. [Learning curve prediction with Bayesian neural networks](http://ml.informatik.uni-freiburg.de/papers/17-ICLR-LCNet.pdf)[J]. 2016.
- 【Hyperparameter Optimization】【Transfer Learning】Wistuba M, Schilling N, Schmidt-Thieme L. [Hyperparameter optimization machines](http://ieeexplore.ieee.org/abstract/document/7796889/)[C]//Data Science and Advanced Analytics (DSAA), 2016 IEEE International Conference on. IEEE, 2016: 41-50.
- 【Hyperparameter Optimization】【Transfer Learning】Joy T T, Rana S, Gupta S K, et al. [Flexible transfer learning framework for bayesian optimisation](https://link.springer.com/chapter/10.1007/978-3-319-31753-3_9)[C]//Pacific-Asia Conference on Knowledge Discovery and Data Mining. Springer, Cham, 2016: 102-114.
- 【Hyperparameter Optimization】【Bayesian Optimization】Wistuba M, Schilling N, Schmidt-Thieme L. [Two-stage transfer surrogate model for automatic hyperparameter optimization](https://link.springer.com/chapter/10.1007/978-3-319-46128-1_13)[C]//Joint European conference on machine learning and knowledge discovery in databases. Springer, Cham, 2016: 199-214.
- 【Hyperparameter Optimization】【Bayesian Optimization】Mendoza H, Klein A, Feurer M, et al. [Towards automatically-tuned neural networks](http://aad.informatik.uni-freiburg.de/papers/16-AUTOML-AutoNet.pdf)[C]//Workshop on Automatic Machine Learning. 2016: 58-65.
- 【Hyperparameter Optimization】【Bayesian Optimization】Shahriari B, Swersky K, Wang Z, et al. [Taking the human out of the loop: A review of bayesian optimization](http://ieeexplore.ieee.org/document/7352306/)[J]. Proceedings of the IEEE, 2016, 104(1): 148-175.
- 【Hyperparameter Optimization】【Bayesian Optimization】Schilling N, Wistuba M, Schmidt-Thieme L. [Scalable hyperparameter optimization with products of gaussian process experts](https://link.springer.com/chapter/10.1007/978-3-319-46128-1_3)[C]//Joint European conference on machine learning and knowledge discovery in databases. Springer, Cham, 2016: 33-48.
- 【Hyperparameter Optimization】【Bayesian Optimization】Springenberg J T, Klein A, Falkner S, et al. [Bayesian optimization with robust bayesian neural networks](https://papers.nips.cc/paper/6117-bayesian-optimization-with-robust-bayesian-neural-networks.pdf)[C]//Advances in Neural Information Processing Systems. 2016: 4134-4142.
- 【Automated Feature Engineering】【Hierarchical Organization of Transformations】Khurana U, Turaga D, Samulowitz H, et al. [Cognito: Automated feature engineering for supervised learning](http://ieeexplore.ieee.org/document/7836821/)[C]//Data Mining Workshops (ICDMW), 2016 IEEE 16th International Conference on. IEEE, 2016: 1304-1307.
- 【Automated Feature Engineering】【Expand Reduce】Katz G, Shin E C R, Song D. [Explorekit: Automatic feature generation and selection](http://ieeexplore.ieee.org/document/7837936/)[C]//Data Mining (ICDM), 2016 IEEE 16th International Conference on. IEEE, 2016: 979-984.
- 【Automated Feature Engineering】【Expand Reduce】Khurana U, Nargesian F, Samulowitz H, et al. [Automating Feature Engineering](http://workshops.inf.ed.ac.uk/nips2016-ai4datasci/papers/NIPS2016-AI4DataSci_paper_13.pdf)[J]. Transformation, 2016, 10(10): 10.

#### 2017
- 【Architecture Search】【Evolutionary Algorithms】【more efficient】Miikkulainen, Risto, Liang, Jason, Meyerson, Elliot,Rawal, Aditya, Fink, Dan, Francon, Olivier, Raju,Bala, Navruzyan, Arshak, Duffy, Nigel, and Hodjat,Babak. [Evolving deep neural networks](https://arxiv.org/abs/1703.00548). arXiv preprint arXiv:1703.00548, 2017
- 【Architecture Search】【Hyperparameter Optimization】【Evolutionary Algorithms】Real E, Moore S, Selle A, et al. [Large-scale evolution of image classifiers](https://arxiv.org/abs/1703.01041)[J]. arXiv preprint arXiv:1703.01041, 2017.
- 【Architecture Search】【Evolutionary Algorithms】Xie, Lingxi and Yuille, Alan. [Genetic cnn](https://arxiv.org/abs/1703.01513). arXiv preprint arXiv:1703.01513, 2017.
- 【Hyperparameter Optimization】Luca Franceschi, Michele Donini, Paolo Frasconi, Massimiliano Pontil. [Forward and Reverse Gradient-Based Hyperparameter Optimization](https://arxiv.org/abs/1703.01785)[J]. arXiv preprint arXiv:1703.01785, 2017.
- 【Hyperparameter Optimization】【Lipschitz Functions】Malherbe C, Vayatis N. [Global optimization of Lipschitz functions](https://arxiv.org/pdf/1703.02628.pdf)[J]. arXiv preprint arXiv:1703.02628, 2017.
- 【Hyperparameter Optimization】【Meta Learning】Ben Goertzel, Nil Geisweiller, Chris Poulin. [Metalearning for Feature Selection](https://arxiv.org/abs/1703.06990). arXiv preprint arXiv:1703.06990, 2017.
- 【Architecture Search】Suganuma, Masanori, Shirakawa, Shinichi, and Nagao, Tomoharu. [A genetic programming approach to designing convolutional neural network architectures](https://arxiv.org/abs/1704.00764). arXiv preprint arXiv:1704.00764, 2017.
- 【Architecture Search】Negrinho, Renato and Gordon, Geoff. [Deeparchitect: Automatically designing and training deep architectures](https://arxiv.org/abs/1704.08792). arXiv preprint arXiv:1704.08792, 2017.
- 【Hyperparameter Optimization】Gonzalo Diaz, Achille Fokoue, Giacomo Nannicini, Horst Samulowitz. [An effective algorithm for hyperparameter optimization of neural networks](https://arxiv.org/abs/1705.08520)[J]. arXiv preprint arXiv:1705.08520, 2017.
- 【Automated Feature Engineering】【Expand Reduce】Lam H T, Thiebaut J M, Sinn M, et al. [One button machine for automating feature engineering in relational databases](https://arxiv.org/pdf/1706.00327.pdf)[J]. arXiv preprint arXiv:1706.00327, 2017.
- 【Hyperparameter Optimization】Hazan E, Klivans A, Yuan Y. [Hyperparameter Optimization: A Spectral Approach](https://arxiv.org/abs/1706.00764)[J]. arXiv preprint arXiv:1706.00764, 2017.
- 【Hyperparameter Optimization】Jesse Dodge, Kevin Jamieson, Noah A. Smith. [Open Loop Hyperparameter Optimization and Determinantal Point Processes Machine Learning](https://arxiv.org/abs/1706.01566)[J]. arXiv preprint arXiv:1706.01566, 2017.
- 【Architecture Search】Huang, Furong, Ash, Jordan, Langford, John, and Schapire, Robert. [Learning deep resnet blocks sequentially using boosting theory](https://arxiv.org/abs/1706.04964). arXiv preprint arXiv:1706.04964, 2017
- 【Hyperparameter Optimization】【Meta Learning】Fábio Pinto, Vítor Cerqueira, Carlos Soares, João Mendes-Moreira. [autoBagging: Learning to Rank Bagging Workflows with Metalearning](https://arxiv.org/abs/1706.09367)[J]. arXiv preprint arXiv:1706.09367, 2017.
- 【Architecture Search】Cai H, Chen T, Zhang W, et al. [Efficient Architecture Search by Network Transformation](https://arxiv.org/abs/1707.04873)[J]. arXiv preprint arXiv:1707.04873, 2017.
- 【Architecture Search】【Transfer Learning】Zoph B, Vasudevan V, Shlens J, et al. [Learning transferable architectures for scalable image recognition](https://arxiv.org/abs/1707.07012)[J]. arXiv preprint arXiv:1707.07012, 2017.
- 【Architecture Search】【more efficient】Brock A, Lim T, Ritchie J M, et al. [SMASH: one-shot model architecture search through hypernetworks](https://arxiv.org/abs/1708.05344)[J]. arXiv preprint arXiv:1708.05344, 2017.
- 【Architecture Search】【Reinforcement Learning】Zhong, Zhao, Yan, Junjie, and Liu, Cheng-Lin. [Practical network blocks design with q-learning](https://arxiv.org/abs/1708.05552). arXiv preprint arXiv:1708.05552, 2017.
- 【Architecture Search】【Reinforcement Learning】Bello I, Zoph B, Vasudevan V, et al. [Neural optimizer search with reinforcement learning](https://arxiv.org/abs/1709.07417)[J]. arXiv preprint arXiv:1709.07417, 2017.
- 【Automated Feature Engineering】【Reinforcement Learning】Khurana U, Samulowitz H, Turaga D. [Feature Engineering for Predictive Modeling using Reinforcement Learning](https://arxiv.org/pdf/1709.07150.pdf)[J]. arXiv preprint arXiv:1709.07150, 2017.
- 【Hyperparameter Optimization】Jungtaek Kim, Saehoon Kim, Seungjin Choi. [Learning to Warm-Start Bayesian Hyperparameter Optimization](https://arxiv.org/abs/1710.06219)[J]. arXiv preprint arXiv:1710.06219, 2017.
- 【Architecture Search】【Evolutionary Algorithms】Liu, Hanxiao, Simonyan, Karen, Vinyals, Oriol, Fernando,Chrisantha, and Kavukcuoglu, Koray. [Hierarchical representations for efficient architecture search](https://arxiv.org/abs/1711.00436). arXiv preprint arXiv:1711.00436, 2017b.
- 【Architecture Search】【Local Search】Elsken T, Metzen J H, Hutter F. [Simple and efficient architecture search for convolutional neural networks](https://arxiv.org/abs/1711.04528)[J]. arXiv preprint arXiv:1711.04528, 2017.
- 【Architecture Search】Max Jaderberg, Valentin Dalibard, Simon Osindero, Wojciech M. Czarnecki, Jeff Donahue, Ali Razavi, Oriol Vinyals, Tim Green, Iain Dunning, Karen Simonyan, Chrisantha Fernando, Koray Kavukcuoglu. [Population Based Training of Neural Networks](https://arxiv.org/abs/1711.09846)[J]. arXiv preprint arXiv:1711.09846, 2017.
- 【Architecture Search】【more efficient】Liu C, Zoph B, Shlens J, et al. [Progressive neural architecture search](https://arxiv.org/abs/1712.00559)[J]. arXiv preprint arXiv:1712.00559, 2017.
- 【Architecture Search】Wistuba M. [Finding Competitive Network Architectures Within a Day Using UCT](https://arxiv.org/abs/1712.07420)[J]. arXiv preprint arXiv:1712.07420, 2017.
- 【Hyperparameter Optimization】【Particle Swarm Optimization】Lorenzo P R, Nalepa J, Kawulok M, et al. [Particle swarm optimization for hyper-parameter selection in deep neural networks](https://dl.acm.org/citation.cfm?id=3071208)[C]//Proceedings of the Genetic and Evolutionary Computation Conference. ACM, 2017: 481-488.
- 【Frameworks】Swearingen T, Drevo W, Cyphers B, et al. [ATM: A distributed, collaborative, scalable system for automated machine learning](https://cyphe.rs/static/atm.pdf)[C]//IEEE International Conference on Big Data. 2017.
- 【Frameworks】Golovin D, Solnik B, Moitra S, et al. [Google vizier: A service for black-box optimization](https://static.googleusercontent.com/media/research.google.com/zh-CN//pubs/archive/46180.pdf)[C]//Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2017: 1487-1495.
- 【Automated Feature Engineering】【Meta Learning】Nargesian F, Samulowitz H, Khurana U, et al. [Learning feature engineering for classification](https://www.ijcai.org/proceedings/2017/0352.pdf)[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. AAAI Press, 2017: 2529-2535.
- 【Miscellaneous】Martin Wistuba, et al. [Automatic Frankensteining: Creating Complex Ensembles Autonomously](http://epubs.siam.org/doi/pdf/10.1137/1.9781611974973.83)

#### 2018
- 【Architecture Search】【Evolutionary Algorithms】Real E, Aggarwal A, Huang Y, et al. [Regularized Evolution for Image Classifier Architecture Search](https://arxiv.org/abs/1802.01548)[J]. arXiv preprint arXiv:1802.01548, 2018.
- 【Architecture Search】【Reinforcement Learning】Pham H, Guan M Y, Zoph B, et al. [Efficient Neural Architecture Search via Parameter Sharing](https://arxiv.org/abs/1802.03268)[J]. arXiv preprint arXiv:1802.03268, 2018.
- 【Architecture Search】Kandasamy K, Neiswanger W, Schneider J, et al. [Neural Architecture Search with Bayesian Optimisation and Optimal Transport](https://arxiv.org/abs/1802.07191)[J]. arXiv preprint arXiv:1802.07191, 2018.
- 【Hyperparameter Optimization】Lorraine, Jonathan, and David Duvenaud. [Stochastic Hyperparameter Optimization through Hypernetworks](https://arxiv.org/abs/1802.09419) arXiv preprint arXiv:1802.09419 (2018).
- 【Hyperparameter Optimization】【Evolutionary Algorithms】Chen B, Wu H, Mo W, et al. [Autostacker: A Compositional Evolutionary Learning System](https://arxiv.org/pdf/1803.00684.pdf)[J]. arXiv preprint arXiv:1803.00684, 2018.
- 【more efficient】Wong C, Houlsby N, Lu Y, et al. [Transfer Automatic Machine Learning](https://arxiv.org/abs/1803.02780)[J]. arXiv preprint arXiv:1803.02780, 2018.
- 【Architecture Search】Kamath P, Singh A, Dutta D. [Neural Architecture Construction using EnvelopeNets](https://arxiv.org/abs/1803.06744)[J]. arXiv preprint arXiv:1803.06744, 2018.
- 【Hyperparameter Optimization】Cui, Henggang, Gregory R. Ganger, and Phillip B. Gibbons. [MLtuner: System Support for Automatic Machine Learning Tuning](https://arxiv.org/abs/1803.07445) arXiv preprint arXiv:1803.07445 (2018).
- 【more efficient】Bennani-Smires K, Musat C, Hossmann A, et al. [GitGraph-from Computational Subgraphs to Smaller Architecture Search Spaces](https://openreview.net/pdf?id=rkiO1_1Pz)[J]. 2018.
- 【Multi-Objective NAS】Dong J D, Cheng A C, Juan D C, et al. [PPP-Net: Platform-aware Progressive Search for Pareto-optimal Neural Architectures](https://openreview.net/pdf?id=B1NT3TAIM)[J]. 2018.
- 【more efficient】Baker B, Gupta O, Raskar R, et al. [Accelerating Neural Architecture Search using Performance Prediction](https://openreview.net/pdf?id=BJypUGZ0Z)[J]. 2018.
- 【Architecture Search】Huang, Siyu, et al. [GNAS: A Greedy Neural Architecture Search Method for Multi-Attribute Learning](https://arxiv.org/abs/1804.06964). arXiv preprint arXiv:1804.06964 (2018).

---
## SURVEY
- Elsken T, Metzen J H, Hutter F. [Neural architecture search: A survey](http://www.jmlr.org/papers/volume20/18-598/18-598.pdf)[J]. arXiv preprint arXiv:1808.05377, 2018.
- Yao Q, Wang M, Chen Y, et al. [Taking human out of learning applications: A survey on automated machine learning](https://arxiv.org/pdf/1810.13306)[J]. arXiv preprint arXiv:1810.13306, 2018.
- Zöller M A, Huber M F. [Benchmark and Survey of Automated Machine Learning Frameworks](https://arxiv.org/pdf/1904.12054)[J]. arXiv preprint arXiv:1904.12054, 2019.
- Wistuba M, Rawat A, Pedapati T. [A survey on neural architecture search](https://arxiv.org/pdf/1905.01392)[J]. arXiv preprint arXiv:1905.01392, 2019.
- Elshawi R, Maher M, Sakr S. [Automated machine learning: State-of-the-art and open challenges](https://arxiv.org/pdf/1906.02287)[J]. arXiv preprint arXiv:1906.02287, 2019.
- Chen Y W, Song Q, Hu X. [Techniques for Automated Machine Learning](https://arxiv.org/pdf/1907.08908)[J]. arXiv preprint arXiv:1907.08908, 2019.
- He X, Zhao K, Chu X. [AutoML: A Survey of the State-of-the-Art](https://arxiv.org/pdf/1908.00709.pdf)[J]. arXiv preprint arXiv:1908.00709, 2019.
- Truong A, Walters A, Goodsitt J, et al. [Towards automated machine learning: Evaluation and comparison of automl approaches and tools](https://arxiv.org/pdf/1908.05557.pdf)[J]. arXiv preprint arXiv:1908.05557, 2019.
- Ono J P, Castelo S, Lopez R, et al. [PipelineProfiler: A Visual Analytics Tool for the Exploration of AutoML Pipelines](https://arxiv.org/pdf/2005.00160)[J]. arXiv preprint arXiv:2005.00160, 2020.
- Halvari T, Nurminen J K, Mikkonen T. [Testing the Robustness of AutoML Systems](https://arxiv.org/pdf/2005.02649)[J]. arXiv preprint arXiv:2005.02649, 2020.
- Rippel O, Weninger L, Merhof D. [AutoML Segmentation for 3D Medical Image Data: Contribution to the MSD Challenge 2018](https://arxiv.org/pdf/2005.09978)[J]. arXiv preprint arXiv:2005.09978, 2020.
- Ren P, Xiao Y, Chang X, et al. [A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions](https://arxiv.org/pdf/2006.02903)[J]. arXiv preprint arXiv:2006.02903, 2020.
- Liu Z, Bousquet O, Elisseeff A, et al. [AutoDL Challenge Design and Beta Tests-Towards automatic deep learning](https://hal.archives-ouvertes.fr/hal-01906226/file/AutoDL_challenge_design_and_beta_tests_____towards_Automatic_Deep_Learning.pdf)[C]. 2018.

---
## blogs & articles & book

#### 2008
- 【book】【Meta Learning】Brazdil P, Carrier C G, Soares C, et al. [Metalearning: Applications to data mining](http://www.springer.com/la/book/9783540732624)[M]. Springer Science & Business Media, 2008.

#### 2016
- 【Articles】【Bayesian Optimization】[Bayesian Optimization for Hyperparameter Tuning](https://arimo.com/data-science/2016/bayesian-optimization-hyperparameter-tuning/)

#### 2017
- 【Articles】【Meta Learning】[Learning to learn](http://bair.berkeley.edu/blog/2017/07/18/learning-to-learn/)
- 【Articles】【Meta Learning】[Why Meta-learning is Crucial for Further Advances of Artificial Intelligence?](https://chatbotslife.com/why-meta-learning-is-crucial-for-further-advances-of-artificial-intelligence-c2df55959adf)
- 【articles】[automl_aws_data_science](https://alexisperrier.com/aws/2017/12/04/automl_aws_data_science.html)
- 【news】[what-is-automl-promises-vs-reality](https://www.iotforall.com/what-is-automl-promises-vs-realityauto/)

#### 2018
- 【book】Sibanjan Das, Umit Mert Cakmak - [Hands-On Automated Machine Learning](https://libgen.io/search.php?req=+automated+machine+learning+&open=0&res=25&view=simple&phrase=1&column=def) (2018, Packt Publishing)

---
## Libraries
[S: Structured Data; I: Image; A: Audio; N: NLP]

- [mlpapers/automl](https://github.com/mlpapers/automl)
- [shukwong/awesome_automl_libraries](https://github.com/shukwong/awesome_automl_libraries)
- [Rakib091998/Auto_ML](https://github.com/Rakib091998/Auto_ML)
- [DeepHiveMind/AutoML_AutoKeras_HPO](https://github.com/DeepHiveMind/AutoML_AutoKeras_HPO)
- [theainerd/automated-machine-learning](https://github.com/theainerd/automated-machine-learning): libraries
- [SIAN][awslabs/autogluon](https://github.com/awslabs/autogluon): AutoGluon automates machine learning tasks, enabling you to easily achieve strong predictive performance in your applications. With just a few lines of code, you can train and deploy high-accuracy deep learning models on tabular, image, and text data, as in the sketch below.
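  A minimal sketch of the advertised few-lines workflow, assuming the tabular API and a CSV with a "class" label column (file paths and column name are illustrative):

  ```python
  from autogluon.tabular import TabularDataset, TabularPredictor

  train_data = TabularDataset("train.csv")   # any pandas-compatible table; path is illustrative
  # AutoGluon picks models, tunes them, and builds an ensemble automatically.
  predictor = TabularPredictor(label="class").fit(train_data)
  predictions = predictor.predict(TabularDataset("test.csv"))
  ```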
- [awslabs/adatune](https://github.com/awslabs/adatune): AdaTune is a library for gradient-based hyperparameter tuning when training deep neural networks. AdaTune currently supports tuning the learning_rate parameter, but some of the methods implemented here can be extended to other hyperparameters such as momentum or weight_decay. AdaTune provides the following gradient-based hyperparameter tuning algorithms: HD, RTHO, and their newly proposed algorithm, MARTHE. The repository also contains commonly used non-adaptive learning_rate adaptation strategies such as staircase decay, exponential decay, and cosine annealing with restarts. The library is implemented in PyTorch.
- [pycaret/pycaret](https://github.com/pycaret/pycaret): PyCaret is an open source, low-code machine learning library in Python that aims to reduce the hypothesis-to-insights cycle time in an ML experiment. It enables data scientists to perform end-to-end experiments quickly and efficiently. Compared with other open source machine learning libraries, PyCaret is a low-code alternative that can perform complex machine learning tasks with only a few lines of code (see the sketch below). PyCaret is essentially a Python wrapper around several machine learning libraries and frameworks such as scikit-learn, XGBoost, Microsoft LightGBM, spaCy, and many more.
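  A hedged sketch of the low-code flow using PyCaret's bundled demo data (the "diabetes" dataset and its "Class variable" target follow PyCaret's own docs):

  ```python
  from pycaret.datasets import get_data
  from pycaret.classification import setup, compare_models

  data = get_data("diabetes")                # bundled demo dataset
  s = setup(data, target="Class variable")   # one call configures the full preprocessing pipeline
  best_model = compare_models()              # trains and ranks a library of estimators
  ```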
- [IN][microsoft/nni](https://github.com/microsoft/nni): An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
- [SIAN][uber/ludwig](https://github.com/uber/ludwig): Ludwig is a toolbox built on top of TensorFlow that allows users to train and test deep learning models without the need to write code
- [FeatureLabs/Featuretools](https://github.com/FeatureLabs/featuretools): a good library for automatically engineering features from relational and transactional data
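  A small Deep Feature Synthesis sketch on the library's bundled mock dataset; note the target argument is `target_dataframe_name` in Featuretools 1.x and `target_entity` in older releases:

  ```python
  import featuretools as ft

  es = ft.demo.load_mock_customer(return_entityset=True)  # demo relational dataset
  # DFS stacks aggregation/transform primitives across table relationships automatically.
  feature_matrix, feature_defs = ft.dfs(entityset=es, target_dataframe_name="customers")
  print(feature_matrix.head())
  ```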
- [automl/auto-sklearn](https://github.com/automl/auto-sklearn): effectively a drop-in replacement for scikit-learn estimators (see the sketch below).
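  A minimal sketch of the drop-in usage; the 120-second budget and the digits dataset are illustrative choices:

  ```python
  import autosklearn.classification
  from sklearn.datasets import load_digits
  from sklearn.model_selection import train_test_split

  X, y = load_digits(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
  automl = autosklearn.classification.AutoSklearnClassifier(time_left_for_this_task=120)
  automl.fit(X_train, y_train)               # same fit/predict contract as any sklearn estimator
  print(automl.score(X_test, y_test))
  ```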
- [automl/HPOlib2](https://github.com/automl/HPOlib2): HPOlib2 is a library for hyperparameter optimization and black box optimization benchmarks.
- [automl/Auto-Pytorch](https://github.com/automl/Auto-PyTorch): Automatic architecture search and hyperparameter optimization for PyTorch
- [automl/RoBO](https://github.com/automl/RoBO): RoBO uses the Gaussian processes library george and the random forests library pyrfr.
- [automl/Auto-WEKA](https://github.com/automl/autoweka): Repository for Auto-WEKA, which provides automatic selection of models and hyperparameters for WEKA.
- [automl/SMAC3](https://github.com/automl/SMAC3): SMAC is a tool for algorithm configuration, optimizing the parameters of arbitrary algorithms across a set of instances. This also includes hyperparameter optimization of ML algorithms. The main core consists of Bayesian optimization combined with an aggressive racing mechanism to efficiently decide which of two configurations performs better.
- [NVIDIA/Milano](https://github.com/NVIDIA/Milano): Milano (Machine learning autotuner and network optimizer) is a tool enabling machine learning researchers and practitioners to perform massive hyperparameter and architecture searches.
- [facebook/AX](https://ax.dev/);[github](https://github.com/facebook/Ax): Ax is an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments. Adaptive experimentation is the machine-learning guided process of iteratively exploring a (possibly infinite) parameter space in order to identify optimal configurations in a resource-efficient manner. Ax currently supports Bayesian optimization and bandit optimization as exploration strategies. Bayesian optimization in Ax is powered by BoTorch, a modern library for Bayesian optimization research built on PyTorch
- [pytorch/BOTORCH](https://botorch.org);[github](https://github.com/pytorch/botorch): BoTorch is a library for Bayesian Optimization built on PyTorch
- [google-research/automl_zero](https://github.com/google-research/google-research/tree/master/automl_zero): AutoML-Zero aims to automatically discover computer programs that can solve machine learning tasks, starting from empty or random programs and using only basic math operations. The goal is to simultaneously search for all aspects of an ML algorithm—including the model structure and the learning strategy—while employing minimal human bias.
- [kubeflow/katib](https://github.com/kubeflow/katib): Katib is a Kubernetes-based system for Hyperparameter Tuning and Neural Architecture Search. Katib supports a number of ML frameworks, including TensorFlow, Apache MXNet, PyTorch, XGBoost, and others
- [I][keras-team/AutoKeras](https://github.com/keras-team/autokeras): An AutoML system based on Keras. It is developed by DATA Lab at Texas A&M University. The goal of AutoKeras is to make machine learning accessible for everyone
- [keras-team/keras-tuner](https://github.com/keras-team/keras-tuner): Hyperparameter tuning for humans
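  A hedged KerasTuner sketch; the hypermodel, units range, and trial budget are all illustrative choices:

  ```python
  import keras_tuner as kt
  from tensorflow import keras

  def build_model(hp):
      # The search space is declared inline via the `hp` object.
      model = keras.Sequential([
          keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
          keras.layers.Dense(10, activation="softmax"),
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      return model

  tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=5)
  # tuner.search(x_train, y_train, validation_data=(x_val, y_val))  # supply your own data
  ```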
- [HDI-Project/AutoBazaar](https://github.com/HDI-Project/AutoBazaar): AutoBazaar is an AutoML system created using The Machine Learning Bazaar, a research project and framework for building ML and AutoML systems by the Data To AI Lab at MIT.
- [HDI-Project/BTB](https://github.com/HDI-Project/BTB): BTB ("Bayesian Tuning and Bandits") is a simple, extensible backend for developing auto-tuning systems such as AutoML systems. It provides an easy-to-use interface for tuning and selection
- [tensorflow/adanet](https://github.com/tensorflow/adanet): AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. AdaNet builds on recent AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework for not only learning a neural network architecture, but also for learning to ensemble to obtain even better models
- [IBM/lale](https://github.com/IBM/lale): Lale is a Python library for semi-automated data science. Lale makes it easy to automatically select algorithms and tune hyperparameters of pipelines that are compatible with scikit-learn, in a type-safe fashion. If you are a data scientist who wants to experiment with automated machine learning, this library is for you! Lale adds value beyond scikit-learn along three dimensions: automation, correctness checks, and interoperability. For automation, Lale provides a consistent high-level interface to existing pipeline search tools including Hyperopt, GridSearchCV, and SMAC. For correctness checks, Lale uses JSON Schema to catch mistakes when there is a mismatch between hyperparameters and their type, or between data and operators. And for interoperability, Lale has a growing library of transformers and estimators from popular libraries such as scikit-learn, XGBoost, PyTorch etc. Lale can be installed just like any other Python package and can be edited with off-the-shelf Python tools such as Jupyter notebooks
- [CiscoAI/amla](https://github.com/CiscoAI/amla): AMLA is a common framework to run different AutoML algorithms for neural networks without changing the underlying systems needed to configure, train and evaluate the generated networks.
- [ARM-software/mango](https://github.com/ARM-software/mango): Mango is a Python library for parallel optimization over complex search spaces. Currently, Mango is intended to find the optimal hyperparameters for machine learning algorithms. Check out the quick 12-second demo of Mango approximating a complex decision boundary of an SVM.
- [mindsdb/mindsdb](https://github.com/mindsdb/mindsdb): MindsDB is an explainable AutoML framework for developers built on top of PyTorch. It enables you to build, train, and test state-of-the-art ML models in as little as one line of code.
- [EpistasisLab/TPOT](https://github.com/EpistasisLab/tpot): uses genetic programming to find the best-performing ML pipelines; it is built on top of scikit-learn (see the sketch below).
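  A minimal TPOT sketch; the small generation/population sizes are illustrative:

  ```python
  from tpot import TPOTClassifier
  from sklearn.datasets import load_digits
  from sklearn.model_selection import train_test_split

  X, y = load_digits(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
  tpot = TPOTClassifier(generations=5, population_size=20, random_state=42, verbosity=2)
  tpot.fit(X_train, y_train)                 # evolves preprocessing + model pipelines
  tpot.export("best_pipeline.py")            # emits the winning pipeline as plain sklearn code
  ```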
- [Neuraxio/Neuraxle](https://github.com/Neuraxio/Neuraxle): A Sklearn-like Framework for Hyperparameter Tuning and AutoML in Deep Learning projects. Finally have the right abstractions and design patterns to properly do AutoML. Let your pipeline steps have hyperparameter spaces. Enable checkpoints to cut duplicate calculations. Go from research to production environment easily.
- [deephyper/deephyper](https://github.com/deephyper/deephyper): DeepHyper is an automated machine learning (AutoML) package for deep neural networks. It comprises two components: 1) neural architecture search, an approach for automatically searching a space of deep neural network architectures for high-performing ones, and 2) hyperparameter search, an approach for automatically finding high-performing hyperparameters for a given deep neural network. DeepHyper provides an infrastructure that targets experimental research in neural architecture and hyperparameter search methods, scalability, and portability across HPC systems. It comprises three modules: benchmarks, a collection of extensible and diverse benchmark problems; search, a set of search algorithms for neural architecture search and hyperparameter search; and evaluators, a common interface for evaluating hyperparameter configurations on HPC platforms.
- [dataloop-ai/zazuml](https://github.com/dataloop-ai/ZazuML): an easy open-source AutoML framework for object detection. Currently this project contains a model & hyperparameter tuner, auto augmentations, a trial manager, and a prediction trigger, already loaded with your top-performing model checkpoint. A working pipeline ready to be plugged into your product, simple as that.
- [Ashton-Sidhu/aethos](https://github.com/Ashton-Sidhu/aethos): Aethos is a library/platform that automates your data science and analytical tasks at any stage in the pipeline. Aethos is, at its core, a uniform API that helps automate analytical techniques from various libraries such as pandas, scikit-learn, gensim, etc.
- [hyperopt/Hyperopt-sklearn](https://github.com/hyperopt/hyperopt-sklearn): Hyperopt-sklearn is Hyperopt-based model selection among machine learning algorithms in scikit-learn.
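  A sketch following the pattern in hyperopt-sklearn's README; the evaluation budget and timeout are illustrative:

  ```python
  from hpsklearn import HyperoptEstimator, any_classifier
  from hyperopt import tpe

  # Search over any supported sklearn classifier and its hyperparameters with TPE.
  estim = HyperoptEstimator(classifier=any_classifier("clf"),
                            algo=tpe.suggest, max_evals=25, trial_timeout=60)
  # estim.fit(X_train, y_train); estim.score(X_test, y_test)  # standard sklearn contract
  ```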
- [SigOpt](https://sigopt.com/): SigOpt is a standardized, scalable, enterprise-grade optimization platform and API designed to unlock the potential of your modeling pipelines. This fully agnostic software solution accelerates, amplifies, and scales the model development process.
- [S][H2O-offical website](https://www.h2o.ai/); [H2O-github](https://github.com/h2oai): Open Source Fast Scalable Machine Learning Platform For Smarter Applications: Deep Learning, Gradient Boosting & XGBoost, Random Forest, Generalized Linear Modeling (Logistic Regression, Elastic Net), K-Means, PCA, Stacked Ensembles, Automatic Machine Learning (AutoML), etc
- [S][MLJAR](https://mljar.com/);[github](https://github.com/mljar/mljar-supervised): An Automated Machine Learning (AutoML) python package for tabular data. It can handle: Binary Classification, MultiClass Classification and Regression. It provides explanations and markdown reports.
- [autogoal/autogoal](https://github.com/autogoal/autogoal): AutoGOAL is a Python library for automatically finding the best way to solve a given task. It has been designed mainly for Automated Machine Learning (aka AutoML) but it can be used in any scenario where you have several possible ways (i.e., programs) to solve a given task.
- [optuna/optuna](https://github.com/optuna/optuna): Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to our define-by-run API, the code written with Optuna enjoys high modularity, and the user of Optuna can dynamically construct the search spaces for the hyperparameters
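  A minimal define-by-run sketch; the toy quadratic objective stands in for a real training loop:

  ```python
  import optuna

  def objective(trial):
      x = trial.suggest_float("x", -10.0, 10.0)  # search space declared inside the objective
      return (x - 2.0) ** 2                      # value to minimize

  study = optuna.create_study(direction="minimize")
  study.optimize(objective, n_trials=100)
  print(study.best_params)
  ```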
- [DataCanvasIO/Hypernets](https://github.com/DataCanvasIO/Hypernets): Hypernets is a general AutoML framework on which automatic optimization tools can be implemented for various machine learning frameworks and libraries, including deep learning frameworks such as tensorflow, keras, and pytorch, and machine learning libraries like sklearn, lightgbm, xgboost, etc. It introduces an abstract search space representation, taking into account the requirements of hyperparameter optimization and neural architecture search (NAS), making Hypernets a general framework that can adapt to various automated machine learning needs.
- [LGE-ARC-AdvancedAI/Auptimizer](https://github.com/LGE-ARC-AdvancedAI/auptimizer): Auptimizer is an optimization tool for Machine Learning (ML) that automates many of the tedious parts of the model building process. Currently, Auptimizer helps with: 1) Automating tedious experimentation - Start using Auptimizer by changing just a few lines of your code. It will run and record sophisticated hyperparameter optimization (HPO) experiments for you, resulting in effortless consistency and reproducibility. 2) Making the best use of your compute resources - Whether you are using a couple of GPUs or AWS, Auptimizer will help you orchestrate compute resources for faster hyperparameter tuning. 3) Getting the best models in minimum time - Generate optimal models and achieve better performance by employing state-of-the-art HPO techniques. Auptimizer provides a single seamless access point to top-notch HPO algorithms, including Bayesian optimization and multi-armed bandits. You can even integrate your own proprietary solution.
- [fmfn/BayesianOptimization](https://github.com/fmfn/BayesianOptimization): This is a constrained global optimization package built upon bayesian inference and gaussian process, that attempts to find the maximum value of an unknown function in as few iterations as possible. This technique is particularly suited for optimization of high cost functions, situations where the balance between exploration and exploitation is important
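  A sketch of the package's maximize-an-unknown-function interface; the toy function and bounds are illustrative:

  ```python
  from bayes_opt import BayesianOptimization

  def black_box(x, y):
      return -x ** 2 - (y - 1) ** 2 + 1          # pretend we cannot see this function

  optimizer = BayesianOptimization(f=black_box,
                                   pbounds={"x": (-2, 2), "y": (-3, 3)},
                                   random_state=1)
  optimizer.maximize(init_points=2, n_iter=10)   # a few random probes, then GP-guided steps
  print(optimizer.max)
  ```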
- [rmcantin/BayesOpt](https://github.com/rmcantin/bayesopt): BayesOpt is an efficient implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and hyperparameter tunning
- [Angel-ML/automl](https://github.com/Angel-ML/automl): Angel-AutoML provides automatic hyperparameter tuning and feature engineering operators. It is developed in Scala. As a stand-alone library, Angel-AutoML can be easily integrated into Java and Scala projects.
- [auto-flow/auto-flow](https://github.com/auto-flow/auto-flow): automatic machine learning workflow modeling platform
- [scikit-optimize/Scikit-Optimize](https://github.com/scikit-optimize/scikit-optimize): Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt aims to be accessible and easy to use in many contexts
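  A minimal skopt sketch; the noisy 1-D objective is illustrative:

  ```python
  import numpy as np
  from skopt import gp_minimize

  def f(x):
      # Stand-in for an expensive black-box function; x is a list with one coordinate.
      return float(np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2)))

  res = gp_minimize(f, [(-2.0, 2.0)], n_calls=20, random_state=0)
  print(res.x, res.fun)                          # best point and best observed value
  ```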
- [cod3licious/autofeat](https://github.com/cod3licious/autofeat): Linear Prediction Models with Automated Feature Engineering and Selection
- [S][Alex-Lekov/AutoML_Alex](https://github.com/Alex-Lekov/AutoML_Alex): State-of-the art Automated Machine Learning python library for Tabular Data
- [joeddav/DEvol](https://github.com/joeddav/devol): DEvol (DeepEvolution) is a basic proof of concept for genetic architecture search in Keras. The current setup is designed for classification problems, though this could be extended to include any other output type as well.
- [AutoViML/auto_ts](https://github.com/AutoViML/Auto_TS): auto-ts is an Automated ML library for time series data. auto-ts enables you to build and select multiple time series models using techniques such as ARIMA, SARIMAX, VAR, decomposable (trend+seasonality+holidays) models, and ensemble machine learning models.
- [gfluz94/automl-gfluz](https://github.com/gfluz94/automl-gfluz): a library developed to incorporate useful properties and methods from relevant data science packages, such as scikit-learn and pycaret, in order to provide a pipeline that suits every supervised problem. Data scientists can therefore spend less time building pipelines and use this time more wisely to create new features and tune the best model.
- [SoftwareAG/mlw](https://github.com/SoftwareAG/MLW): ML Workbench is an open source machine learning and artificial intelligence platform that helps data scientists solve business problems faster, build prototypes, and convert them to actual projects. The modeler helps from data preparation through model building and deployment, and the tool supports a large variety of algorithms that can be run without a single line of code. The web-based tool has various components which help data scientists of different skill levels perform several model-building tasks, and it provides deployment-ready PMML files which can be hosted as REST services. ML Workbench lets its users cover a wide variety of algorithms and deep neural network architectures in a minimal- or no-code environment. It is also one of the few deep-learning platforms to support the Predictive Model Markup Language (PMML) format, which allows different statistical and data mining tools to speak the same language.
- [souryadey/deep-n-cheap](https://github.com/souryadey/deep-n-cheap): This repository implements Deep-n-Cheap – an AutoML framework to search for deep learning models
- [deil87/automl-genetic](https://github.com/deil87/automl-genetic): employs evolutionary algorithms and concepts to search the space of classifiers. In particular, it focuses on automatic construction of ensembles of classifiers, because nowadays they have proved to be very efficient.
- [CleverInsight/cognito](https://github.com/CleverInsight/cognito): Cognito is an exclusive python data preprocessing library and command line utility that helps any developer to transform raw data into a machine-learning format. We at CleverInsight Open Ai Foundation took the initiative to build a better automated data preprocessing library and here it is
- [kxsystems/automl](https://github.com/KxSystems/automl): The automated machine learning library described here is built largely on the tools available within the machine learning toolkit available here. The purpose of this framework is to provide users with the ability to automate the process of applying machine learning techniques to real-world problems. In the absence of expert machine learning engineers this handles the following processes within a traditional workflow
- [Media-Smart/volkstuner](https://github.com/Media-Smart/volkstuner): volkstuner is an open source hyperparameter tuner.
- [mihaianton/automl](https://github.com/MihaiAnton/AutoML): An automated Machine Learning pipeline for faster Data Science projects. Using Evolutionary Algorithms for Neural Architecture Search and State-Of-The-Art data engineering techniques towards building an off the box machine learning solution
- [epeters3/skplumber](https://github.com/epeters3/skplumber): An AutoML tool and lightweight ML framework for scikit-learn.
- [tristandeleu/pytorch-meta](https://github.com/tristandeleu/pytorch-meta): A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch. Torchmeta contains popular meta-learning benchmarks, fully compatible with both torchvision and PyTorch's DataLoader
- [learnables/learn2learn](https://github.com/learnables/learn2learn): PyTorch Meta-learning Library for Researchers
- [dragonfly/Dragonfly](https://github.com/dragonfly/dragonfly): An open source python library for scalable Bayesian optimisation.
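  A minimal sketch, assuming the `minimise_function(func, domain, max_capital)` helper shown in Dragonfly's README; the objective is a toy stand-in for an expensive black box:

  ```python
  # Hedged sketch of Dragonfly's top-level optimisation helper.
  from dragonfly import minimise_function

  def objective(x):
      # toy 2-D quadratic standing in for an expensive evaluation
      return (x[0] - 0.3) ** 2 + (x[1] + 0.2) ** 2

  domain = [[-1, 1], [-1, 1]]  # box bounds, one pair per dimension
  min_val, min_pt, history = minimise_function(objective, domain,
                                               max_capital=50)
  print(min_val, min_pt)
  ```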
- [starlibs/AILibs](https://github.com/starlibs/AILibs): AILibs is a modular collection of Java libraries related to automated decision making. Its highlighted functionalities are: 1)Graph Search (jaicore-search): AStar, BestFirst, Branch & Bound, DFS, MCTS, and more;2)Logic (jaicore-logic): represent and reason about propositional and simple first-order logic formulas;3)Planning (jaicore-planning): state-space planning (STRIPS, PDDL) and hierarchical planning (HTN, ITN, PTN);4)Reproducible Experiments (jaicore-experiments): design and efficiently conduct experiments in a highly parallelized manner;5)Automated Software Configuration (HASCO): hierarchical configuration of software systems;6)Automated Machine Learning (ML-Plan): automatically find optimal machine learning pipelines in WEKA or sklearn
- [societe-generale/aikit](https://github.com/societe-generale/aikit): Automatic Tool Kit for Machine Learning and Data Science. The objective is to provide tools that ease the repetitive parts of the data scientist's job so that he/she can focus on modelization. This package is still in alpha and more features will be added. Its main features are: 1)improved and new "scikit-learn like" transformers;2)GraphPipeline: an extension of the sklearn Pipeline that handles more generic chains of transformations;3)an AutoML that automatically searches through several transformers and models.
- [PGijsbers/gama](https://github.com/PGijsbers/gama):GAMA is an AutoML package for end-users and AutoML researchers. It generates optimized machine learning pipelines given specific input data and resource constraints. A machine learning pipeline contains data preprocessing (e.g. PCA, normalization) as well as a machine learning algorithm (e.g. Logistic Regression, Random Forests), with fine-tuned hyperparameter settings (e.g. number of trees in a Random Forest).To find these pipelines, multiple search procedures have been implemented. GAMA can also combine multiple tuned machine learning pipelines together into an ensemble, which on average should help model performance. At the moment, GAMA is restricted to classification and regression problems on tabular data. In addition to its general use AutoML functionality, GAMA aims to serve AutoML researchers as well. During the optimization process, GAMA keeps an extensive log of progress made. Using this log, insight can be obtained on the behaviour of the search procedure.
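  A minimal sketch of the scikit-learn-style interface GAMA's documentation describes, assuming the `GamaClassifier` entry point; the time budget is illustrative:

  ```python
  # Hedged sketch: GAMA's fit/predict workflow on a toy dataset.
  from sklearn.datasets import load_breast_cancer
  from sklearn.metrics import accuracy_score
  from sklearn.model_selection import train_test_split
  from gama import GamaClassifier

  X, y = load_breast_cancer(return_X_y=True)
  X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

  automl = GamaClassifier(max_total_time=180)  # 3-minute search budget
  automl.fit(X_tr, y_tr)                       # evolutionary pipeline search
  print(accuracy_score(y_te, automl.predict(X_te)))
  ```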
- [BartekPog/modelcreator](https://github.com/BartekPog/modelcreator): Simple Python package for creating predictive models. This package contains a Machine which is meant to do the learning for you; it can automatically create a fitting predictive model for given data
- [microsoft/EconML](https://github.com/microsoft/EconML):EconML is a Python package for estimating heterogeneous treatment effects from observational data via machine learning. This package was designed and built as part of the ALICE project at Microsoft Research with the goal to combine state-of-the-art machine learning techniques with econometrics to bring automation to complex causal inference problems. The promise of EconML:1)Implement recent techniques in the literature at the intersection of econometrics and machine learning;2)Maintain flexibility in modeling the effect heterogeneity (via techniques such as random forests, boosting, lasso and neural nets), while preserving the causal interpretation of the learned model and often offering valid confidence intervals;3)Use a unified API;4)Build on standard Python packages for Machine Learning and Data Analysis.
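  A hedged sketch of the double machine learning workflow, assuming EconML's `LinearDML` estimator; the synthetic data has a known heterogeneous effect so the output can be sanity-checked:

  ```python
  # Hedged sketch: estimating heterogeneous treatment effects with LinearDML.
  import numpy as np
  from econml.dml import LinearDML

  rng = np.random.default_rng(0)
  X = rng.normal(size=(1000, 3))                    # effect modifiers
  T = rng.binomial(1, 0.5, size=1000)               # binary treatment
  Y = (1.0 + X[:, 0]) * T + rng.normal(size=1000)   # true effect = 1 + X0

  est = LinearDML(discrete_treatment=True)
  est.fit(Y, T, X=X)
  print(est.effect(X[:5]))  # estimated CATE for the first five units
  ```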
- 【Commercial】[AutoCross](https://www.4paradigm.com/): by 4Paradigm (第四范式)
- [Yelp/MOE](https://github.com/Yelp/MOE): MOE (Metric Optimization Engine) is an efficient way to optimize a system's parameters, when evaluating parameters is time-consuming or expensive
- [flytxtds/AutoGBT](https://github.com/flytxtds/AutoGBT): AutoGBT stands for Automatically Optimized Gradient Boosting Trees, and is used for AutoML in a lifelong machine learning setting to classify large-volume, high-cardinality data streams under concept drift. AutoGBT was developed by a joint team ('autodidact.ai') from Flytxt, Indian Institute of Technology Delhi and CSIR-CEERI as a part of the NIPS 2018 AutoML Challenge (The 3rd AutoML Challenge: AutoML for Lifelong Machine Learning).
- [MainRo/xgbtune](https://github.com/MainRo/xgbtune): XGBTune is a library for automated XGBoost model tuning. Tuning an XGBoost model is as simple as a single function call.
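  A minimal sketch, assuming the `tune_xgb_model` entry point from XGBTune's README; the data is synthetic:

  ```python
  # Hedged sketch: one-call XGBoost tuning with XGBTune.
  import numpy as np
  from xgbtune import tune_xgb_model

  rng = np.random.default_rng(0)
  x_train = rng.normal(size=(500, 10))
  y_train = (x_train[:, 0] > 0).astype(int)  # toy binary target

  params = {"objective": "binary:logistic", "eval_metric": "logloss"}
  tuned_params, round_count = tune_xgb_model(params, x_train, y_train)
  print(tuned_params, round_count)
  ```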
- [autonomio/talos](https://github.com/autonomio/talos): Talos radically changes the ordinary Keras workflow by fully automating hyperparameter tuning and model evaluation. Talos exposes Keras functionality entirely, and there is no new syntax or templates to learn.
- [HunterMcGushion/hyperparameter_hunter](https://github.com/HunterMcGushion/hyperparameter_hunter): Automatically save and learn from Experiment results, leading to long-term, persistent optimization that remembers all your tests. HyperparameterHunter provides a wrapper for machine learning algorithms that saves all the important data. Simplify the experimentation and hyperparameter tuning process by letting HyperparameterHunter do the hard work of recording, organizing, and learning from your tests — all while using the same libraries you already do. Don't let any of your experiments go to waste, and start doing hyperparameter optimization the way it was meant to be
- [ja-thomas/autoxgboost](https://github.com/ja-thomas/autoxgboost):autoxgboost aims to find an optimal xgboost model automatically using the machine learning framework mlr and the bayesian optimization framework mlrMBO.
- [ScottfreeLLC/AlphaPy](https://github.com/ScottfreeLLC/AlphaPy): AlphaPy is a machine learning framework for both speculators and data scientists. It is written in Python with the scikit-learn, pandas, and Keras libraries, as well as many other helpful libraries for feature engineering and visualization
- [gdikov/hypertunity](https://github.com/gdikov/hypertunity): A toolset for black-box hyperparameter optimisation.
- [laic-ufmg/recipe](https://github.com/laic-ufmg/Recipe): Automated machine learning (AutoML) with grammar-based genetic programming
- [thomas-young-2013/alpha-ml](https://github.com/thomas-young-2013/alpha-ml): Alpha-ML is a high-level AutoML toolkit, written in Python
- [produvia/ai-platform](https://github.com/produvia/ai-platform):AI Platform aims to automate AI R&D tasks. Our vision is to create machine learning models to solve various computer science tasks. Our mission is to achieve automation of AI technologies.We are developing service-centered or task-focused machine learning models. These models, or AI services, solve distinct tasks or functions.Examples of AI tasks include:1)semantic segmentation (computer visions);2)machine translation (natural language processing);3)word embeddings (methodology);4)recommendation systems (miscellaneous);5)speech recognition (speech);6)atari games (playing games);7)link prediction (graphs);8)time series classification (time series);9)audio generation (audio);10)visual odometry (robots);11)music information retrieval (music);12)dimensionality reduction (computer code);13)decision making (reasoning);14)knowledge graphs (knowledge base);15)adversarial attack (adversarial).
- [wywongbd/autocluster](https://github.com/wywongbd/autocluster): autocluster is an automated machine learning (AutoML) toolkit for performing clustering tasks
- [ksachdeva/scikit-nni](https://github.com/ksachdeva/scikit-nni): Microsoft NNI (Neural Network Intelligence) is a toolkit to help users run automated machine learning (AutoML) experiments. The tool dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyperparameters in different environments, such as a local machine, remote servers, or the cloud
- [SaltWaterStudio/modgen](https://github.com/SaltWaterStudio/modgen): This program was created for rapid feature engineering without the need to optimize each model. Modgen is designed to give a quick overview of how your updated features will react to each model. You can use one specific algorithm or a wide variety (depending on your interests) with a random feature range that can be easily changed at any time by the user
- [gomerudo/automl](https://github.com/gomerudo/auto-ml): The automated machine learning process is intended to automatically discover well-performing pipelines that solve a machine learning problem such as classification or regression
- [crawles/automl_service](https://github.com/crawles/automl_service): Deploy automated machine learning (AutoML) as a service using Flask, for both pipeline training and pipeline serving. The framework implements a fully automated time series classification pipeline, automating both feature engineering and model selection and optimization using the Python libraries TPOT and tsfresh
- [georgianpartners/foreshadow](https://github.com/georgianpartners/foreshadow): Foreshadow is an automatic pipeline generation tool that makes creating, iterating, and evaluating machine learning pipelines a fast and intuitive experience allowing data scientists to spend more time on data science and less time on code
- [ypeleg/HungaBunga](https://github.com/ypeleg/HungaBunga): Brute-force all scikit-learn models and all scikit-learn parameters with fit/predict
- [onepanelio/automl](https://github.com/onepanelio/automl): Onepanel's AutoML framework was built to improve the accuracy of your machine learning models and make them more accessible by automatically creating a data analysis pipeline that can include data pre-processing, feature selection, and feature engineering methods along with machine learning methods and parameter settings that are optimized for your data
- [accurat/ackeras](https://github.com/accurat/ackeras): AutoML library for Accurat, based on AutoKeras and Scikit-Learn
- [bhat-prashant/reinforceML](https://github.com/bhat-prashant/reinforceML): A handy data science assistant for beginners and experts alike. ReinforceML is a Python automated machine learning tool that optimizes machine learning pipelines using genetic programming and reinforcement learning
- [reiinakano/xcessiv](https://github.com/reiinakano/xcessiv): A web-based application for quick, scalable, and automated hyperparameter tuning and stacked ensembling in Python
- [minimaxir/automl-gs](https://github.com/minimaxir/automl-gs): automl-gs is an AutoML tool which, unlike Microsoft's NNI, Uber's Ludwig, and TPOT, offers a zero-code/model-definition interface for getting an optimized model and data transformation pipeline in multiple popular ML/DL frameworks, with minimal Python dependencies (pandas + scikit-learn + your framework of choice). automl-gs is designed for citizen data scientists and engineers without a deep statistical background, under the philosophy that you don't need to know any modern data preprocessing and machine learning engineering techniques to create a powerful prediction workflow
- [cc-hpc-itwm/PHS](https://github.com/cc-hpc-itwm/PHS): phs is an ergonomic framework for performing hyperparameter searches of an arbitrary Python function on numerous compute instances, achieved with minimal modifications inside your target function. Possible applications include expensive-to-evaluate numerical computations which strongly depend on hyperparameters, such as machine learning
- [tobegit3hub/advisor](https://github.com/tobegit3hub/advisor): Advisor is a hyperparameter tuning system for black-box optimization
- [HIPS/Spearmint](https://github.com/HIPS/Spearmint): Spearmint is a software package to perform Bayesian optimization. The Software is designed to automatically run experiments (thus the code name spearmint) in a manner that iteratively adjusts a number of parameters so as to minimize some objective in as few runs as possible
- [claesenm/Optunity](https://github.com/claesenm/optunity): Optunity is a library containing various optimizers for hyperparameter tuning. Hyperparameter tuning is a recurrent problem in many machine learning tasks, both supervised and unsupervised. Tuning examples include optimizing regularization or kernel parameters
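  A minimal sketch, assuming `optunity.maximize` with per-parameter box constraints passed as keyword arguments; the score function stands in for cross-validated model performance:

  ```python
  # Hedged sketch: black-box hyperparameter search with Optunity.
  import optunity

  def score(x, y):
      # toy surrogate for a cross-validation score to maximize
      return -((x - 1.0) ** 2) - ((y + 0.5) ** 2)

  optimal_pars, details, _ = optunity.maximize(score, num_evals=100,
                                               x=[-2, 2], y=[-2, 2])
  print(optimal_pars)  # expected near {'x': 1.0, 'y': -0.5}
  ```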
- [cmccarthy1/automl](https://github.com/cmccarthy1/automl): An automated machine learning library built largely on the tools available within the (Kx) machine learning toolkit. The purpose of this framework is to give users the ability to automate the process of applying machine learning techniques to real-world problems; in the absence of expert machine learning engineers, it handles the core processes of a traditional workflow
- [zygmuntz/HyperBand](https://github.com/zygmuntz/hyperband): The goal is to provide a fully functional implementation of Hyperband, as well as a number of ready to use functions for a number of models (classifiers and regressors)
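  For reference, a compact generic sketch of the Hyperband schedule from Li et al. (2017), not this repo's exact interface: each bracket trades off the number of configurations against the budget each one gets, with successive halving inside the bracket.

  ```python
  # Generic Hyperband sketch (Li et al. 2017); evaluate(config, budget) -> loss.
  import math

  def hyperband(sample_config, evaluate, max_iter=81, eta=3):
      s_max = int(math.log(max_iter) / math.log(eta) + 1e-9)
      B = (s_max + 1) * max_iter
      best_config, best_loss = None, float("inf")
      for s in range(s_max, -1, -1):               # one bracket per s
          n = int(math.ceil(B / max_iter * eta ** s / (s + 1)))
          r = max_iter * eta ** (-s)               # initial budget per config
          configs = [sample_config() for _ in range(n)]
          for i in range(s + 1):                   # successive halving rounds
              budget = int(r * eta ** i)
              losses = [evaluate(c, budget) for c in configs]
              ranked = sorted(zip(losses, configs), key=lambda p: p[0])
              if ranked[0][0] < best_loss:
                  best_loss, best_config = ranked[0]
              keep = max(1, int(len(configs) / eta))
              configs = [c for _, c in ranked[:keep]]
      return best_config, best_loss
  ```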
- [ClimbsRocks/auto_ml](https://github.com/ClimbsRocks/auto_ml): Automates the whole machine learning process, making it super easy to use for both analytics and getting real-time predictions in production. A quick overview of buzzwords: this project automates 1)Analytics (pass in data, and auto_ml will tell you the relationship of each variable to what it is you're trying to predict).2)Feature Engineering (particularly around dates, and NLP).3)Robust Scaling (turning all values into their scaled versions between the range of 0 and 1, in a way that is robust to outliers, and works with sparse data).4)Feature Selection (picking only the features that actually prove useful).5)Data formatting (turning a DataFrame or a list of dictionaries into a sparse matrix, one-hot encoding categorical variables, taking the natural log of y for regression problems, etc).6)Model Selection (which model works best for your problem: we try roughly a dozen apiece for classification and regression problems, including favorites like XGBoost if it's installed on your machine).7)Hyperparameter Optimization (what hyperparameters work best for that model).8)Big Data (feed it lots of data; it's fairly efficient with resources).9)Unicorns (you could conceivably train it to predict what is a unicorn and what is not).10)Ice Cream (mmm, tasty...).11)Hugs (this makes it much easier to do your job, hopefully leaving you more time to hug those you care about).
- [jgreenemi/Parris](https://github.com/jgreenemi/Parris):Parris is a tool for automating the training of machine learning algorithms. If you're the kind of person that works on ML algorithms and spends too much time setting up a server to run it on, having to log into it to monitor its progress, etc., then you will find this tool helpful. No need to SSH into instances to get your training jobs done
- [ziyuw/rembo](https://github.com/ziyuw/rembo): Bayesian optimization in high-dimensions via random embedding.
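  An illustrative sketch of the random-embedding idea behind REMBO (not this package's API): optimize in a low-dimensional space and map into the high-dimensional box through a fixed random matrix, which works well when only a few dimensions actually matter.

  ```python
  # Sketch of the REMBO trick: high-dimensional objective, low-dimensional search.
  import numpy as np

  D, d = 100, 2                          # ambient vs. effective dimensionality
  rng = np.random.default_rng(0)
  A = rng.standard_normal((D, d))        # fixed random embedding

  def f_high(x):
      return np.sum((x[:2] - 0.5) ** 2)  # black box; only 2 dims matter

  def f_low(z):
      x = np.clip(A @ z, -1.0, 1.0)      # map back into the feasible box
      return f_high(x)

  # Any low-dimensional optimizer works here; random search for brevity.
  zs = rng.uniform(-1, 1, size=(200, d))
  best_z = min(zs, key=f_low)
  print(f_low(best_z))
  ```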
- [kootenpv/xtoy](https://github.com/kootenpv/xtoy):Automated Machine Learning: go from 'X' to 'y' without effort
- [jesse-toftum/cash_ml](https://github.com/jesse-toftum/cash_ml):Automates the whole machine learning process, making it super easy to use for both analytics, and getting real-time predictions in production
- [CCQC/PES-Learn](https://github.com/CCQC/PES-Learn):PES-Learn is a Python library designed to fit system-specific Born-Oppenheimer potential energy surfaces using modern machine learning models. PES-Learn assists in generating datasets, and features Gaussian process and neural network model optimization routines. The goal is to provide high-performance models for a given dataset without requiring user expertise in machine learning.
- [AlexIoannides/ml-workflow-automation](https://github.com/AlexIoannides/ml-workflow-automation):Python Machine Learning (ML) project that demonstrates the archetypal ML workflow within a Jupyter notebook, with automated model deployment as a RESTful service on Kubernetes.
- [yeticloud/dama](https://github.com/yeticloud/dama):a simplified machine learning container platform that helps teams get started with an automated workflow
- [lai-bluejay/diego](https://github.com/lai-bluejay/diego): Diego: Data in, IntElliGence Out. A fast framework that supports the rapid construction of automated learning tasks: simply create an automated learning study (Study) and generate correlated trials (Trial), then run the code and get a machine learning model. Implemented using the scikit-learn API glossary, with Bayesian optimization and genetic algorithms.
- 【Commercial】[DarwinML](http://iqubic.net/): by iQubic (探智立方)
- 【Commercial】[Cloud AutoML](https://cloud.google.com/automl/):
- 【Commercial】[MateLabs](http://matelabs.ai/):
- 【Commercial】[DataRobot](https://www.datarobot.com/): A commercial enterprise AI platform that automates building, deploying, and maintaining machine learning models
- [mb706/automlr](https://github.com/mb706/automlr): automlr is an R-package for automatically configuring mlr machine learning algorithms so that they perform well. It is designed for simplicity of use and able to run with minimal user intervention
- [XanderHorn/autoML](https://github.com/XanderHorn/autoML):Automated machine learning in R
- [DataSystemsGroupUT/SmartML](https://github.com/DataSystemsGroupUT/SmartML): SmartML is an R-Package representing a meta learning-based framework for automated selection and hyperparameter tuning for machine learning algorithms. Being meta-learning based, the framework is able to simulate the role of the machine learning expert. In particular, the framework is equipped with a continuously updated knowledge base that stores information about the meta-features of all processed datasets along with the associated performance of the different classifiers and their tuned parameters. Thus, for any new dataset, SmartML automatically extracts its meta features and searches its knowledge base for the best performing algorithm to start its optimization process. In addition, SmartML makes use of the new runs to continuously enrich its knowledge base to improve its performance and robustness for future runs
- [PaddlePaddle/AutoDL](https://github.com/PaddlePaddle/AutoDL): The open-sourced AutoDL Design is one implementation of the AutoDL technique.
- [linxihui/lazyML](https://github.com/linxihui/lazyML): An R package aims to automatically select models and tune parameters, built upon the popular package caret. The main function mpTune can tune hyper-parameters of a list of models simultaneously with parallel support. It also has functionality to give an unbiased performance estimate of the mpTune procedure. Currently, classification, regression and survival models are supported.
- [darvis-ai/Brainless](https://github.com/darvis-ai/Brainless): Automated machine learning library using random search and the CASH (combined algorithm selection and hyperparameter optimization) technique.
- [r-tensorflow/autokeras](https://github.com/r-tensorflow/autokeras): AutoKeras is an open source software library for automated machine learning (AutoML). It is developed by DATA Lab at Texas A&M University and community contributors. The ultimate goal of AutoML is to provide easily accessible deep learning tools to domain experts with limited data science or machine learning background. AutoKeras provides functions to automatically search for architecture and hyperparameters of deep learning models.
- [IBM/AutoMLPipeline.jl](https://github.com/IBM/AutoMLPipeline.jl): A package that makes it trivial to create complex ML pipeline structures using simple expressions. It leverages the built-in macro programming features of Julia to symbolically process and manipulate pipeline expressions, and makes it easy to discover optimal structures for machine learning prediction and classification.
- [SciML/ModelingToolkit.jl](https://github.com/SciML/ModelingToolkit.jl):A modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
- [SciML/DataDrivenDiffEq.jl](https://github.com/SciML/DataDrivenDiffEq.jl):Data driven modeling and automated discovery of dynamical systems for the SciML Scientific Machine Learning organization
- [ClimbsRocks/machineJS](https://github.com/ClimbsRocks/machineJS): A fully-featured default process for machine learning: all the parts are here and have functional default values in place. Modify to your heart's delight so you can focus on the important parts for your dataset, or run it all the way through with the default values to have fully automated machine learning
- [automl-js/automl-js](https://github.com/automl-js/automl-js): Automated machine learning, done locally in the browser or on a server with Node.js. Ground-up implementation of ML algorithms for both regression and classification, such as decision trees, linear models and gradient boosting with decision trees. The implementation is benchmarked against the excellent scikit-learn library and gives quite close, albeit somewhat lower (at most 1 percent of classification accuracy on average), scores.
- [duckladydinh/KotlinML](https://github.com/duckladydinh/KotlinML)
- [paper][AutoStacker](https://arxiv.org/abs/1803.00684):
- [paper][AlphaD3M](https://www.cs.columbia.edu/~idrori/AlphaD3M.pdf):
- [paper][VDS](https://confer.csail.mit.edu/sigmod2019/papers):
- [paper][ExploreKit](https://people.eecs.berkeley.edu/~dawnsong/papers/icdm-2016.pdf):

### Distributed Frameworks
- [intel-analytics/analytics-zoo](https://github.com/intel-analytics/analytics-zoo): Analytics Zoo seamlessly scales TensorFlow, Keras and PyTorch to distributed big data (using Spark, Flink & Ray).
- [databricks/automl-toolkit](https://github.com/databrickslabs/automl-toolkit): This package provides a number of different levels of API interaction, from the highest-level "default only" FamilyRunner to low-level APIs that allow for highly customizable workflows to be created for automated ML tuning and Inference
- [salesforce/TransmogrifAI](https://github.com/salesforce/TransmogrifAI): TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library written in Scala that runs on top of Apache Spark. It was developed with a focus on accelerating machine learning developer productivity through machine learning automation, and an API that enforces compile-time type-safety, modularity, and reuse. Through automation, it achieves accuracies close to hand-tuned models with almost 100x reduction in time.
- [hyperopt/Hyperopt](https://github.com/hyperopt/hyperopt): Distributed Asynchronous Hyperparameter Optimization in Python, for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.
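  A minimal sketch of Hyperopt's `fmin` over a mixed search space; the objective stands in for a real validation loss:

  ```python
  # Sketch: TPE search with hyperopt over continuous and discrete dimensions.
  from hyperopt import fmin, tpe, hp

  space = {
      "lr": hp.loguniform("lr", -7, 0),        # continuous, log scale
      "depth": hp.choice("depth", [3, 5, 7]),  # discrete choice
  }

  def objective(params):
      # stand-in for training a model and returning its validation loss
      return (params["lr"] - 0.01) ** 2 + params["depth"] * 1e-3

  best = fmin(objective, space, algo=tpe.suggest, max_evals=100)
  print(best)
  ```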
- [nusdbsystem/singa-auto](https://github.com/nusdbsystem/singa-auto): SINGA-AUTO is a distributed system that trains machine learning (ML) models and deploys trained models, built with ease of use in mind. To do so, it leverages automated machine learning (AutoML).
- [DataSystemsGroupUT/D-SmartML](https://github.com/DataSystemsGroupUT/Distributed-SmartML): A distributed, Apache Spark-based version of the SmartML framework for automated algorithm selection and hyperparameter tuning.
- [UCBerkeley/MLBase](http://www.mlbase.org/): Implementing and consuming machine learning at scale are difficult tasks. MLbase is a platform addressing both issues, and consists of three components -- MLlib, MLI, ML Optimizer. 1)ML Optimizer: this layer aims to automate the task of ML pipeline construction. The optimizer solves a search problem over feature extractors and ML algorithms included in MLI and MLlib, and is currently under active development.2)MLI: an experimental API for feature extraction and algorithm development that introduces high-level ML programming abstractions. A prototype of MLI has been implemented against Spark, and serves as a testbed for MLlib.3)MLlib: Apache Spark's distributed ML library. MLlib was initially developed as part of the MLbase project, and the library is currently supported by the Spark community. Many features in MLlib have been borrowed from ML Optimizer and MLI, e.g., the model and algorithm APIs, multimodel training, sparse data support, design of local / distributed matrices, etc.
- 【Commercial】[Databricks/AutoML](https://databricks.com/product/automl-on-databricks#resource-link): The library receives a dataset as input and produces an optimized model as output. It extracts some characteristics of the dataset and uses an internal knowledge base to determine the best algorithm, then uses a hyperband method to find the best hyperparameters for the selected algorithm.
- [AxeldeRomblay/MLBox](https://github.com/AxeldeRomblay/MLBox): MLBox is another AutoML library; it supports distributed data processing, cleaning, formatting, and state-of-the-art algorithms such as LightGBM and XGBoost. It also supports model stacking, which allows you to combine an ensemble of models to generate a new model aiming to perform better than the individual models.
- [HDI-Project/ATM](https://github.com/HDI-Project/ATM): Auto Tune Models (ATM) is an AutoML system designed with ease of use in mind. In short, you give ATM a classification problem and a dataset as a CSV file, and ATM will try to build the best model it can. ATM is based on a paper of the same name, and the project is part of the Human-Data Interaction (HDI) Project at MIT.
- [HDI-Project/ATMSeer](https://github.com/HDI-Project/ATMSeer): ATMSeer is an interactive visualization tool for automated machine learning (AutoML). It lets users monitor an ongoing AutoML process, analyze the searched models, and refine the search space in real time through a multi-granularity visualization. This instantiation builds on top of the ATM AutoML system
- [logicalclocks/maggy](https://github.com/logicalclocks/maggy): Maggy is a framework for efficient asynchronous optimization of expensive black-box functions on top of Apache Spark. Compared to existing frameworks, maggy is not bound to stage based optimization algorithms and therefore it is able to make extensive use of early stopping in order to achieve efficient resource utilization.
- [automl/HpBandSter](https://github.com/automl/HpBandSter):a distributed Hyperband implementation on Steroids
- [giantcroc/featuretoolsOnSpark](https://github.com/giantcroc/featuretoolsOnSpark): Featuretools is a Python library for automated feature engineering. This repo is a simplified version of featuretools that reuses its automatic feature generation framework; instead of featuretools' fussy back-end architecture, it mainly uses Spark DataFrames to achieve a faster feature generation process (10x+ speedup)
- [automl/bohb](https://www.automl.org/automl/bohb/): a distributed Hyperband implementation on steroids. This Python 3 package is a framework for distributed hyperparameter optimization. It started out as a simple implementation of Hyperband (Li et al. 2017), and contains an implementation of BOHB (Falkner et al. 2018)
- [ray-project/ray](https://github.com/ray-project/ray): A fast and simple framework for building and running distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library
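  A minimal sketch of Tune's function-trainable API as it looked in the Ray 1.x era (`tune.report` / `tune.run`); treat the exact entry points as assumptions, since the API has evolved across releases:

  ```python
  # Hedged sketch: hyperparameter search with Ray Tune's function API.
  from ray import tune

  def trainable(config):
      # stand-in for a training loop; report a metric back to Tune
      loss = (config["lr"] - 0.01) ** 2
      tune.report(loss=loss)

  analysis = tune.run(
      trainable,
      config={"lr": tune.loguniform(1e-5, 1e-1)},  # search space
      num_samples=20,                              # number of trials
      metric="loss",
      mode="min",
  )
  print(analysis.best_config)
  ```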
- [tqichun/distributed-SMAC3](https://github.com/tqichun/distributed-SMAC3): Distributed Sequential Model-based Algorithm Configuration, forked from https://github.com/automl/SMAC3. This package is a re-implementation of the original SMAC tool, and it slightly differs from the original SMAC. For comparisons against the original, a stable release of SMAC (v2) in Java can be found [here](http://www.cs.ubc.ca/labs/beta/Projects/SMAC/).
- [ccnt-glaucus/glaucus](https://github.com/ccnt-glaucus/glaucus/blob/master/README_CN.md): Glaucus is a data-flow-based machine learning suite that incorporates an automated machine learning pipeline, simplifies the complex processes of machine learning algorithms, and applies excellent distributed data-processing engines. It helps non-data-science professionals across domains get the benefits of powerful machine learning tools in a simple way. The platform integrates many excellent data processing engines, including Spark, TensorFlow and scikit-learn, with a set of easy-to-use design processes built on them. The user only needs to upload data, do simple configuration and algorithm selection, and train the algorithm by automatic or manual parameter adjustment. The platform also provides a wealth of evaluation indicators for the trained model, so that non-professionals can maximize the role of machine learning in their fields.
- [pierre-chaville/automlk](https://github.com/pierre-chaville/automlk): Automated and distributed machine learning toolkit. This toolkit is designed to be integrated within a Python project, but can also be used independently through the interface of its app
- [takezoe/predictionio-template-automl](https://github.com/takezoe/predictionio-template-automl): This is an Apache PredictionIO engine template which offers AutoML capability using TransmogrifAI. You can launch a prediction web API service without any coding
- [nginyc/rafiki](https://github.com/nginyc/rafiki): Rafiki is a distributed system that trains machine learning (ML) models and deploys trained models, built with ease of use in mind. To do so, it leverages automated machine learning (AutoML).

## Projects
- [DeepWisdom/AutoDL-cn](https://github.com/DeepWisdom/AutoDL)
- [AI-HPC-Research-Team/AAH](https://github.com/AI-HPC-Research-Team/AAH):Automated Machine Learning as an AI-HPC benchmark
- [xiaomi-automl/fairdarts](https://github.com/xiaomi-automl/FairDARTS): This is the official implementation of the FairDARTS paper
- [RasaHQ/rasa](https://github.com/RasaHQ/rasa): Rasa is an open source machine learning framework to automate text- and voice-based conversations. With Rasa, you can build contextual assistants on: 1)Facebook Messenger;2)Slack;3)Google Hangouts;4)Webex Teams;5)Microsoft Bot Framework;6)Rocket.Chat;7)Mattermost;8)Telegram;9)Twilio; your own custom conversational channels; or voice assistants such as: 1)Alexa Skills;2)Google Home Actions
- [google-research/morphnet](https://github.com/google-research/morph-net): MorphNet is a method for learning deep network structure during training. The key principle is continuous relaxation of the network-structure learning problem. Specifically, activation sparsity is induced by adding regularizers that target the consumption of specific resources such as FLOPs or model size. When the regularizer loss is added to the training loss and their sum is minimized via stochastic gradient descent or a similar optimizer, the learning problem becomes a constrained optimization of the structure of the network, under the constraint represented by the regularizer. The method was first introduced in the CVPR 2018 paper "MorphNet: Fast & Simple Resource-Constrained Learning of Deep Network Structure".
- [kakaobrain/fast-autoaugment](https://github.com/kakaobrain/fast-autoaugment): Official Implementation of 'Fast AutoAugment' in PyTorch.
- [naszilla/bananas](https://github.com/naszilla/bananas): BANANAS is a neural architecture search (NAS) algorithm which uses Bayesian optimization with a meta neural network to predict the validation accuracy of neural architectures. We use a path-based encoding scheme to featurize the neural architectures that are used to train the neural network model. After training on just 200 architectures, we are able to predict the validation accuracy of new architectures to within one percent on average. The full NAS algorithm beats the state of the art on the NASBench and the DARTS search spaces. On the NASBench search space, BANANAS is over 100x more efficient than random search, and 3.8x more efficient than the next-best algorithm we tried. On the DARTS search space, BANANAS finds an architecture with a test error of 2.57%.
- [quark0/DARTS](https://github.com/quark0/darts): Differentiable architecture search for convolutional and recurrent networks. PyTorch 0.4 is not supported at this moment and would lead to OOM
- [microsoft/petridishnn](https://github.com/microsoft/petridishnn): Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
- [mit-han-lab/once-for-all](https://github.com/mit-han-lab/once-for-all)
- [NoamRosenberg/autodeeplab](https://github.com/NoamRosenberg/autodeeplab)
- [microsoft/forecasting](https://github.com/microsoft/forecasting)
- [nextml/NEXT](https://github.com/nextml/NEXT):
- [developers-cosmos/ML-CICD-GitHubActions](https://github.com/developers-cosmos/ML-CICD-GitHubActions):You can automate the process of building, testing, delivering, or deploying your Machine Learning models into production using GitHub Actions
- [lightforever/mlcomp](https://github.com/lightforever/mlcomp)
- [zhengying-liu/autodl](https://github.com/zhengying-liu/autodl):A machine learning competition in Automated Deep Learning (AutoDL), co-organized by ChaLearn, Google and 4Paradigm. Accepted at NeurIPS 2019.
- [e2its/gdayf-core](https://github.com/e2its/gdayf-core)
- [AutoViML/AutoViz](https://github.com/AutoViML/AutoViz)
- [AutoViML/Auto_ViML_WHO](https://github.com/AutoViML/Auto_ViML_WHO):Automated Variant Interpretable Machine Learning project with Hyper Opt (WHO). Build Multiple, Interpretable, ML Models Fast. Now using Hyper Opt
- [mikewlange/KETTLE](https://github.com/mikewlange/KETTLE)
- [dstreamsai/AutoML](https://github.com/dstreamsai/AutoML)
- [Yatoom/Optimus](https://github.com/Yatoom/Optimus)
- [loaiabdalslam/AUL](https://github.com/loaiabdalslam/AUL)
- [AlexImb/automl-streams](https://github.com/AlexImb/automl-streams)
- [Jwuthri/Mozinor](https://github.com/Jwuthri/Mozinor)
- [kakaobrain/autoclint](https://github.com/kakaobrain/autoclint)
- [cmusatyalab/opentpod](https://github.com/cmusatyalab/OpenTPOD)
- [pfnet-research/autogbt-alt](https://github.com/pfnet-research/autogbt-alt)
- [arberzela/efficientnas](https://github.com/arberzela/EfficientNAS)
- [positron1/amlb](https://github.com/positron1/amlb)
- [ealcobaca/pymfe](https://github.com/ealcobaca/pymfe)
- [TAMU-VITA/autospeech](https://github.com/TAMU-VITA/AutoSpeech)
- [u1234x1234/AutoSpeech2020](https://github.com/u1234x1234/AutoSpeech2020):1st place solution to Automated Machine Learning https://www.automl.ai/competitions/2
- [mittajithendra/Automated-Machine-Learning](https://github.com/mittajithendra/Automated-Machine-Learning): This project deals only with supervised learning problems. It automatically does the job of a data scientist, up to developing models, and is designed around different types of datasets
- [aiorhiroki/farmer](https://github.com/aiorhiroki/farmer):You can train Classification and Segmentation tasks semi-automatically
- [NCC-dev/farmer](https://github.com/NCC-dev/farmer):You can train Classification and Segmentation tasks as best practice
- [java][fmohr/AILibs](https://github.com/fmohr/AILibs)
- [DAI-Lab/cardea](https://github.com/DAI-Lab/Cardea)
- [datamllab/autokaggle](https://github.com/datamllab/autokaggle)
- [chasedehan/diaml](https://github.com/chasedehan/diaml):
- [signals-dev/greenguard](https://github.com/signals-dev/GreenGuard)
- [a-hanf/mlr3automl](https://github.com/a-hanf/mlr3automl): In this repository we are developing mlr3automl, an AutoML package for mlr3. The project started in April 2020 and is supposed to be working in October 2020
- [A2Amir/Machine-Learning-Pipelines](https://github.com/A2Amir/Machine-Learning-Pipelines): In this repo, I implement a message classifier by automating the machine-learning workflow with pipelines, using a corporate messaging dataset as a case study
- [mattjhayes/amle](https://github.com/mattjhayes/amle):AMLE is a simple unopinionated framework for experimenting with machine learning (ML). I built it to help me learn ML, and to reduce my workload running ML experiments, by automating repeatable tasks.
- [uncharted-distil/distil-auto-ml](https://github.com/uncharted-distil/distil-auto-ml):Distil Automated Machine Learning Server
- [MaximilianJohannesObpacher/automl_server](https://github.com/MaximilianJohannesObpacher/automl_server)
- [yangfenglong/mAML1.0](https://github.com/yangfenglong/mAML1.0):Automated machine learning model building pipeline for microbiome data
- [matheusccouto/autolearn](https://github.com/matheusccouto/autolearn):An uncomplicated API for simple problems.
- [gabrieljaguiar/mtlExperiment](https://github.com/gabrieljaguiar/mtlExperiment):Automated Machine Learning Experiments.
- [thomas-young-2013/soln-ml](https://github.com/thomas-young-2013/soln-ml)
- [rsheth80/pmf-automl](https://github.com/rsheth80/pmf-automl):
- [BeelGroup/auto-surprise](https://github.com/BeelGroup/Auto-Surprise)
- [melodyguan/ENAS](https://github.com/melodyguan/enas)
- [renqianluo/NAO](https://github.com/renqianluo/NAO)
- [laic-ufmg/automlc](https://github.com/laic-ufmg/automlc)
- [knowledge-learning/hp-optimization](https://github.com/knowledge-learning/hp-optimization)
- [magnusax/AutoML](https://github.com/magnusax/AutoML)
- [nitishkthakur/automlib](https://github.com/nitishkthakur/automlib)
- [DataSystemsGroupUT/iSmartML](https://github.com/DataSystemsGroupUT/ismartml)
- [udellgroup/Oboe](https://github.com/udellgroup/oboe)
- [plabig/Dino](https://github.com/plabig/Dino)
- [fillassuncao/automl-dsge](https://github.com/fillassuncao/automl-dsge)
- [piyushpathak03/Automated-Machine-Learning](https://github.com/piyushpathak03/Automated-Machine-Learning)
- [yaswanthpalaghat/Chatbot-using-machine-learning-and-flask](https://github.com/yaswanthpalaghat/Chatbot-using-machine-learning-and-flask): A Python project that generates automated responses to a user's input, using a selection of machine learning algorithms to produce different types of responses
- [paypal/autosklearn-zeroconf](https://github.com/paypal/autosklearn-zeroconf): The autosklearn-zeroconf file takes a dataframe of any size and trains an auto-sklearn binary classifier ensemble. No configuration is needed, as the name suggests.
- [AnyObject/OAT](https://github.com/AnyObject/OAT):Open Automatic Trading - A fully automated trading platform with machine learning capabilities
- [jwmueller/KDD20-tutorial](https://github.com/jwmueller/KDD20-tutorial)
- [mlaskowski17/Feature-Engineering](https://github.com/mlaskowski17/Feature-Engineering)
- [mstaddon/GraniteAI](https://github.com/mstaddon/GraniteAI)
- [EricCacciavillani/eFlow](https://github.com/EricCacciavillani/eFlow)
- [htoukour/AutoML](https://github.com/htoukour/AutoML)
- [aarontuor/antk](https://github.com/aarontuor/antk)
- [raalesir/automated_environment](https://github.com/raalesir/automated_environment)
- [wcneill/data-science-at-home](https://github.com/wcneill/data-science-at-home)
- [CodeSpaceHQ/MENGEL](https://github.com/CodeSpaceHQ/MENGEL)
- [TrixiaBelleza/Automated-Text-Classification](https://github.com/TrixiaBelleza/Automated-Text-Classification)
- [TwoRavens/TwoRavensSolver](https://github.com/TwoRavens/TwoRavensSolver)
- [shoprunback/openflow](https://github.com/shoprunback/openflow)
- [rahul1471/mlops](https://github.com/rahul1471/mlops)
- [flaviassantos/dashboard](https://github.com/flaviassantos/dashboard)
- [mattlm0831/AutoAI](https://github.com/mattlm0831/AutoAI)
- [RadheTians/Automated-Data-Augmentation-Software](https://github.com/RadheTians/Automated-Data-Augmentation-Software): A tool for data augmentation that can also generate XML annotation files for machine learning or deep learning models

## benchmark
- [www.automl.ai](https://www.automl.ai/)
- [Alex-Lekov/automl-benchmark](https://github.com/Alex-Lekov/AutoML-Benchmark)
- [jonathankrauss/Automl-benchmark](https://jonathankrauss.github.io/AutoML-Benchmark/)
- [openml/automlbenchmark](https://github.com/openml/automlbenchmark)
- [jessecui/automl-benchmarking](https://github.com/jessecui/automl-benchmarking)
- [google-research/nasbench](https://github.com/google-research/nasbench)
- [automl/nas_benchmarks](https://github.com/automl/nas_benchmarks)
- [gaocegege.com/Blog/katib-new](http://gaocegege.com/Blog/%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0/katib-new#%E6%80%BB%E7%BB%93%E4%B8%8E%E5%88%86%E6%9E%90)
- [yash1994/auto-awesome-list](https://github.com/yash1994/auto-awesome-list): An automated list of Machine Learning and Data Science tools from research organizations
- [MaratSaidov/automl](https://github.com/MaratSaidov/automl)