awesome-AutoML
Curating a list of AutoML-related research, tools, projects and other resources
https://github.com/windmaple/awesome-AutoML
Research papers
AutoML survey
- Neural architecture search: a survey 深度神经网络结构搜索综述
- AutoML to Date and Beyond: Challenges and Opportunities
- A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions
- On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice
- Benchmark and Survey of Automated Machine Learning Frameworks
- AutoML: A Survey of the State-of-the-Art
- A Survey on Neural Architecture Search
- Taking Human out of Learning Applications: A Survey on Automated Machine Learning
Neural Architecture Search
- Speedy Performance Estimation for Neural Architecture Search
- LayerNAS: Neural Architecture Search in Polynomial Complexity
- EvoPrompting: Language Models for Code-Level Neural Architecture Search
- Neural Architecture Search using Property Guided Synthesis
- Data-Free Neural Architecture Search via Recursive Label Calibration
- Searching for Efficient Neural Architectures for On-Device ML on Edge TPUs
- Resource-Constrained Neural Architecture Search on Tabular Datasets
- Searching for Fast Model Families on Datacenter Accelerators
- Towards the co-design of neural networks and accelerators
- Neural Architecture Search for Energy Efficient Always-on Audio Models
- KNAS: Green Neural Architecture Search
- Primer: Searching for Efficient Transformers for Language Modeling
- NAS-BERT: Task-Agnostic and Adaptive-Size BERT Compression with Neural Architecture Search
- Accelerating Neural Architecture Search for Natural Language Processing with Knowledge Distillation and Earth Mover's Distance
- AlphaNet: Improved Training of Supernets with Alpha-Divergence
- AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling
- AutoFormer: Searching Transformers for Visual Recognition
- NAAS: Neural Accelerator Architecture Search
- ModularNAS: Towards Modularized and Reusable Neural Architecture Search
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
- AutoReCon: Neural Architecture Search-based Reconstruction for Data-free Compression
- AutoSpace: Neural Architecture Search with Less Human Interference
- ReNAS: Relativistic Evaluation of Neural Architecture Search
- Zen-NAS: A Zero-Shot NAS for High-Performance Deep Image Recognition
- PyGlove: Symbolic Programming for Automated Machine Learning
- DARTS-: Robustly Stepping out of Performance Collapse Without Indicators
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search
- AttentionNAS: Spatiotemporal Attention Cell Search for Video Classification
- CurveLane-NAS: Unifying Lane-Sensitive Architecture Search and Adaptive Point Blending
- SpArSe: Sparse Architecture Search for CNNs on Resource-Constrained Microcontrollers
- Few-shot Neural Architecture Search
- Efficient Neural Architecture Search via Proximal Iterations
- Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search
- How Does Supernet Help in Neural Architecture Search?
- APQ: Joint Search for Network Architecture, Pruning and Quantization Policy
- MCUNet: Tiny Deep Learning on IoT Devices
- FBNetV2: Differentiable Neural Architecture Search for Spatial and Channel Dimensions
- MobileDets: Searching for Object Detection Architectures for Mobile Accelerators
- Neural Architecture Transfer
- When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks
- Semi-Supervised Neural Architecture Search
- MixPath: A Unified Approach for One-shot Neural Architecture Search
- AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
- Generative Teaching Networks: Accelerating Neural Architecture Search by Learning to Generate Synthetic Training Data
- CARS: Continuous Evolution for Efficient Neural Architecture Search
- Meta-Learning of Neural Architectures for Few-Shot Learning
- Up to two billion times acceleration of scientific simulations with deep neural architecture search
- Efficient Forward Architecture Search
- Towards Oracle Knowledge Distillation with Neural Architecture Search
- Blockwisely Supervised Neural Architecture Search with Knowledge Distillation
- NAS-FPN: Learning Scalable Feature Pyramid Architecture for Object Detection
- Improving Keyword Spotting and Language Identification via Neural Architecture Search at Scale
- SpineNet: Learning Scale-Permuted Backbone for Recognition and Localization
- Efficient Neural Interaction Function Search for Collaborative Filtering
- Evaluating the Search Phase of Neural Architecture Search
- MixConv: Mixed Depthwise Convolutional Kernels
- Multinomial Distribution Learning for Effective Neural Architecture Search
- SNR: Sub-Network Routing for Flexible Parameter Sharing in Multi-task Learning
- PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search - [code](https://github.com/yuhuixu1993/PC-DARTS)
- Single Path One-Shot Neural Architecture Search with Uniform Sampling
- AutoGAN: Neural Architecture Search for Generative Adversarial Networks
- Tiny Video Networks
- AssembleNet: Searching for Multi-Stream Neural Connectivity in Video Architectures
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
- MoGA: Searching Beyond MobileNetV3 - [code](https://github.com/xiaomi-automl/MoGA)
- Searching for MobileNetV3
- Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic Image Segmentation
- DetNAS: Backbone Search for Object Detection
- Graph HyperNetworks for Neural Architecture Search
- Dynamic Distribution Pruning for Efficient Network Architecture Search
- FairNAS: Rethinking Evaluation Fairness of Weight Sharing Neural Architecture Search
- EENA: Efficient Evolution of Neural Architecture
- InstaNAS: Instance-aware Neural Architecture Search
- ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware
- Evolutionary Neural AutoML for Deep Learning
- Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search
- The Evolved Transformer
- SNAS: Stochastic Neural Architecture Search
- NeuNetS: An Automated Synthesis Engine for Neural Network Design
- EAT-NAS: Elastic Architecture Transfer for Accelerating Large-scale Neural Architecture Search
- Understanding and Simplifying One-Shot Architecture Search
- Evolving Space-Time Neural Architectures for Videos
- IRLAS: Inverse Reinforcement Learning for Architecture Search
- Neural Architecture Search with Bayesian Optimisation and Optimal Transport
- Path-Level Network Transformation for Efficient Architecture Search
- BlockQNN: Efficient Block-wise Neural Network Architecture Generation
- Stochastic Adaptive Neural Architecture Search for Keyword Spotting
- Task-Driven Convolutional Recurrent Models of the Visual System
- Neural Architecture Optimization
- MnasNet: Platform-Aware Neural Architecture Search for Mobile
- MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning
- NetAdapt: Platform-Aware Neural Network Adaptation for Mobile Applications
- Auto-Meta: Automated Gradient Based Meta Learner Search
- MorphNet: Fast & Simple Resource-Constrained Structure Learning of Deep Networks
- DPP-Net: Device-aware Progressive Search for Pareto-optimal Neural Architectures
- Searching Toward Pareto-Optimal Device-Aware Neural Architectures
- DARTS: Differentiable Architecture Search (see the mixed-op sketch at the end of this list)
- Regularized Evolution for Image Classifier Architecture Search
- Efficient Architecture Search by Network Transformation
- Large-Scale Evolution of Image Classifiers
- Progressive Neural Architecture Search
- AdaNet: Adaptive Structural Learning of Artificial Neural Networks
- Learning Transferable Architectures for Scalable Image Recognition
- EfficientNet-EdgeTPU: Creating Accelerator-Optimized Neural Networks with AutoML
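Of the methods above, the continuous relaxation at the heart of DARTS is compact enough to sketch: every edge of the search cell computes a softmax-weighted sum over a set of candidate operations, so the architecture parameters (alpha) are trained by gradient descent alongside the ordinary network weights. A minimal PyTorch sketch; the candidate set is illustrative, not the full DARTS search space:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One DARTS-style edge: a softmax-weighted mixture of candidate ops."""

    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),  # 5x5 conv
            nn.MaxPool2d(3, stride=1, padding=1),         # 3x3 max pool
            nn.Identity(),                                # skip connection
        ])
        # one architecture parameter (alpha) per candidate operation
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        # the weighted sum keeps the edge differentiable w.r.t. alpha
        return sum(w * op(x) for w, op in zip(weights, self.ops))

edge = MixedOp(channels=16)
out = edge(torch.randn(1, 16, 32, 32))
# after search, the edge is discretized to its strongest candidate:
best_op = edge.ops[edge.alpha.argmax().item()]
```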
Federated Neural Architecture Search
Neural Optimizer Search
Activation function Search
AutoAugment
- MetaAugment: Sample-Aware Data Augmentation Policy Learning
- SpecAugment: A Simple Data Augmentation Method for Automatic Speech Recognition
- RandAugment: Practical automated data augmentation with a reduced search space (see the usage sketch after this list)
- Learning Data Augmentation Strategies for Object Detection
- Fast AutoAugment
- AutoAugment: Learning Augmentation Policies from Data
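Of the augmentation-search papers above, RandAugment is the cheapest to try: the policy search collapses to two scalars, the number of transforms N and a global magnitude M. torchvision ships an implementation; a minimal usage sketch, assuming torchvision >= 0.11:

```python
from torchvision import transforms

# RandAugment reduces augmentation policy search to two knobs: num_ops (N)
# transforms are applied to each image, each at strength magnitude (M).
train_transform = transforms.Compose([
    transforms.RandAugment(num_ops=2, magnitude=9),  # values from the paper
    transforms.ToTensor(),
])
# Pass train_transform as the `transform` argument of any torchvision
# dataset, e.g. torchvision.datasets.CIFAR10(..., transform=train_transform).
```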
Graph neural network
AutoDropout
AutoDistill
Learning to learn/Meta-learning
- ES-MAML: Simple Hessian-Free Meta Learning
- Learning to Learn with Gradients
- On First-Order Meta-Learning Algorithms (the Reptile paper; see the sketch after this list)
- Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
- A Simple Neural Attentive Meta-Learner
- Learning to Learn without Gradient Descent by Gradient Descent
- Learning to learn by gradient descent by gradient descent
- Learning to reinforcement learn
- RL^2: Fast Reinforcement Learning via Slow Reinforcement Learning
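Of the papers above, Reptile ("On First-Order Meta-Learning Algorithms") fits in a few lines: adapt a copy of the model to a sampled task with ordinary SGD, then nudge the shared initialization toward the adapted weights. A minimal PyTorch sketch on a toy sine-regression task distribution; the hyperparameters and task distribution are illustrative:

```python
import copy
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

def sample_task(n=32):
    """A random sine-regression task y = a * sin(x + b)."""
    a, b = torch.rand(1) * 4.9 + 0.1, torch.rand(1) * math.pi
    x = torch.rand(n, 1) * 10 - 5
    return x, a * torch.sin(x + b)

model = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
meta_lr, inner_lr, inner_steps = 0.1, 0.01, 5

for _ in range(1000):                          # outer (meta) iterations
    x, y = sample_task()
    fast = copy.deepcopy(model)                # task-specific copy of the weights
    opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
    for _ in range(inner_steps):               # inner loop: adapt to the task
        opt.zero_grad()
        F.mse_loss(fast(x), y).backward()
        opt.step()
    with torch.no_grad():                      # Reptile meta-update:
        for p, q in zip(model.parameters(), fast.parameters()):
            p += meta_lr * (q - p)             # move the init toward adapted weights
```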
Hyperparameter optimization
- Frugal Optimization for Cost-related Hyperparameters
- Economical Hyperparameter Optimization With Blended Search Strategy
- ChaCha for Online AutoML
- Using a thousand optimization tasks to learn hyperparameter search strategies
- AutoNE: Hyperparameter Optimization for Massive Network Embedding
- Google Vizier: A Service for Black-Box Optimization
- Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
- Practical Bayesian Optimization of Machine Learning Algorithms
- Random Search for Hyper-Parameter Optimization (see the sketch after this list)
- Population Based Training of Neural Networks
- OptFormer: Towards Universal Hyperparameter Optimization with Transformers
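The baseline behind everything in this list, random search (Bergstra & Bengio, above), takes a dozen lines: sample configurations independently from the search space and keep the best. A minimal sketch; `train_and_eval` is a hypothetical stand-in for the user's training routine, and the search space is illustrative:

```python
import random

def random_search(train_and_eval, n_trials=50, seed=0):
    """Sample configs i.i.d. from the search space; return the best one."""
    rng = random.Random(seed)
    best_score, best_config = float("-inf"), None
    for _ in range(n_trials):
        config = {
            "lr": 10 ** rng.uniform(-5, -1),         # log-uniform learning rate
            "batch_size": rng.choice([32, 64, 128]),
            "dropout": rng.uniform(0.0, 0.5),
        }
        score = train_and_eval(config)               # e.g. validation accuracy
        if score > best_score:
            best_score, best_config = score, config
    return best_config, best_score
```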
Automatic feature selection
Recommendation systems
- Rankitect: Ranking Architecture Search Battling World-class Engineers at Meta Scale
- AutoML for Deep Recommender Systems: A Survey
- Automated Machine Learning for Deep Recommender Systems: A Survey
- AutoEmb: Automated Embedding Dimensionality Search in Streaming Recommendations
- AutoDim: Field-aware Embedding Dimension Search in Recommender Systems
- Learnable Embedding Sizes for Recommender Systems
- AMER: Automatic Behavior Modeling and Interaction Exploration in Recommender System
- AIM: Automatic Interaction Machine for Click-Through Rate Prediction
- Towards Automated Neural Interaction Discovery for Click-Through Rate Prediction
- AutoFIS: Automatic Feature Interaction Selection in Factorization Models for Click-Through Rate Prediction
- AutoFeature: Searching for Feature Interactions and Their Architectures for Click-through Rate Prediction
- AutoGroup: Automatic Feature Grouping for Modelling Explicit High-Order Feature Interactions in CTR Prediction
- Neural Input Search for Large Scale Recommendation Models
Model compression
Quantization
Text to speech
Bandits
Reinforcement learning
Quantum computing
Prompt search
Neural Architecture Search benchmark
- NAS-Bench-101: Towards Reproducible Neural Architecture Search - [code](https://github.com/google-research/nasbench)
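A minimal sketch of how the linked nasbench package is typically queried: a cell is described by a DAG adjacency matrix plus per-node operations, and the benchmark returns precomputed training statistics. This assumes the NAS-Bench-101 .tfrecord file has been downloaded; the path below is a placeholder:

```python
from nasbench import api  # package from the linked google-research/nasbench repo

nasbench = api.NASBench('nasbench_only108.tfrecord')  # placeholder path

# A 7-node cell: node 0 is the input, node 6 the output; the upper-triangular
# matrix lists the DAG edges, and ops labels the intermediate nodes.
cell = api.ModelSpec(
    matrix=[[0, 1, 1, 0, 0, 0, 0],
            [0, 0, 0, 1, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0],
            [0, 0, 0, 0, 0, 1, 0],
            [0, 0, 0, 0, 0, 0, 1],
            [0, 0, 0, 0, 0, 0, 1],
            [0, 0, 0, 0, 0, 0, 0]],
    ops=['input', 'conv3x3-bn-relu', 'conv1x1-bn-relu', 'conv3x3-bn-relu',
         'maxpool3x3', 'conv3x3-bn-relu', 'output'])

data = nasbench.query(cell)  # statistics from one precomputed training run
print(data['validation_accuracy'], data['training_time'])
```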
LLM
Commercial products
Tools and projects
- Vegas
- AutoGluon
- H2O AutoML
- Ray Tune
- TransmogrifAI
- AutoKeras
- Ludwig
- AutoWeka
- TPOT - one of the very first AutoML methods and open-source software packages
- Optuna - A define-by-run hyperparameter optimization framework (see the sketch after this list)
- FeatureTools
- Falcon
- MindWare - An open-source AutoML system
- AutoDL
- AutoGL
- MLBox
- FLAML - A fast and lightweight AutoML library ([paper](https://www.microsoft.com/en-us/research/publication/flaml-a-fast-and-lightweight-automl-library/))
- Hypernets
- Cooka
- Model Search
- hyperunity - A toolset for black-box hyperparameter optimisation
- auptimizer
- Keras Tuner
- Torchmeta - A Meta-Learning library for PyTorch
- learn2learn - A PyTorch Meta-learning Framework for Researchers
- Auto-PyTorch
- ATM: Auto Tune Models - A multi-tenant, multi-data system for automated machine learning (model selection and tuning)
- Adanet: Fast and flexible AutoML with learning guarantees
- Microsoft Neural Network Intelligence (NNI) - An open source AutoML toolkit for neural architecture search and hyper-parameter tuning
- Dragonfly
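As noted in the Optuna entry above, "define-by-run" means the search space is declared implicitly by the objective function as it executes, rather than declared up front. A minimal sketch on a toy objective; the quadratic is illustrative, where a real objective would train and score a model:

```python
import optuna

def objective(trial):
    # the search space is defined as the function runs ("define-by-run")
    x = trial.suggest_float("x", -10.0, 10.0)
    y = trial.suggest_int("y", 0, 4)
    return (x - 2.0) ** 2 + y        # value to minimize

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=100)
print(study.best_params, study.best_value)
```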
Benchmarks
Blog posts
- Efficient Multi-Objective Neural Architecture Search with Ax
- Neural Architecture Search
- How we use AutoML, Multi-task learning and Multi-tower models for Pinterest Ads
- A Conversation With Quoc Le: The AI Expert Behind Google AutoML
- fast.ai: An Opinionated Introduction to AutoML and Neural Architecture Search
- Machine Learning Hyperparameter Optimization with Argo
- AutoML Solutions: What I Like and Don’t Like About AutoML as a Data Scientist
- Improved On-Device ML on Pixel 6, with Neural Architecture Search
- Introducing AdaNet: Fast and Flexible AutoML with Learning Guarantees
- Using Evolutionary AutoML to Discover Neural Network Architectures
- Improving Deep Learning Performance with AutoAugment
- AutoML for large scale image classification and object detection
- Using Machine Learning to Discover Neural Network Optimizers
- Using Machine Learning to Explore Neural Network Architecture
Courses
Presentations
Books
- AutoML: Methods, Systems, Challenges
- Hands-On Meta Learning with Python: Meta learning using one-shot learning, MAML, Reptile, and Meta-SGD with TensorFlow - [repo](https://github.com/sudharsan13296/Hands-On-Meta-Learning-With-Python)
- Automated Machine Learning in Action - A book that introduces AutoML with AutoKeras and Keras Tuner
Competitions, workshops and conferences
Other curated resources on AutoML