Awesome-AutoDL
Automated Deep Learning: Neural Architecture Search Is Not the End (a curated list of AutoDL resources and an in-depth analysis)
https://github.com/D-X-Y/Awesome-AutoDL
2020 Venues
- Are Labels Necessary for Neural Architecture Search?
- Neural Predictor for Neural Architecture Search
- Single Path One-Shot Neural Architecture Search with Uniform Sampling (see the sampling sketch after this list)
- BigNAS: Scaling Up Neural Architecture Search with Big Single-Stage Models
- BATS: Binary ArchitecTure Search
- AttentionNAS: Spatiotemporal Attention Cell Search for Video Classification
- Search What You Want: Barrier Panelty NAS for Mixed Precision Quantization
- Angle-based Search Space Shrinking for Neural Architecture Search
- Anti-Bandit Neural Architecture Search for Model Defense
- TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search
- Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search
- Off-Policy Reinforcement Learning for Efficient and Effective GAN Architecture Search
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search
- Optimizing Millions of Hyperparameters by Implicit Differentiation
- Evolving Machine Learning Algorithms From Scratch
- Stabilizing Differentiable Architecture Search via Perturbation-based Regularization
- NADS: Neural Architecture Distribution Search for Uncertainty Awareness
- Generative Teaching Networks: Accelerating Neural Architecture Search by Learning to Generate Synthetic Training Data
- Hit-Detector: Hierarchical Trinity Architecture Search for Object Detection - [GitHub](https://github.com/ggjy/HitDet.pytorch)
- Designing Network Design Spaces - [GitHub](https://github.com/facebookresearch/pycls)
- UNAS: Differentiable Architecture Search Meets Reinforcement Learning
- MiLeNAS: Efficient Neural Architecture Search via Mixed-Level Reformulation
- A Semi-Supervised Assessor of Neural Architectures
- Binarizing MobileNet via Evolution-based Searching
- Rethinking Performance Estimation in Neural Architecture Search - [GitHub](https://github.com/zhengxiawu/rethinking_performance_estimation_in_NAS)
- APQ: Joint Search for Network Architecture, Pruning and Quantization Policy
- SGAS: Sequential Greedy Architecture Search
- Can Weight Sharing Outperform Random Architecture Search? An Investigation With TuNAS
- FBNetV2: Differentiable Neural Architecture Search for Spatial and Channel Dimensions
- AdversarialNAS: Adversarial Neural Architecture Search for GANs
- When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks
- Block-wisely Supervised Neural Architecture Search with Knowledge Distillation
- Overcoming Multi-Model Forgetting in One-Shot NAS with Diversity Maximization
- Densely Connected Search Space for More Flexible Neural Architecture Search
- EfficientDet: Scalable and Efficient Object Detection
- NAS-BENCH-201: Extending the Scope of Reproducible Neural Architecture Search - [GitHub](https://github.com/D-X-Y/AutoDL-Projects)
- Understanding Architectures Learnt by Cell-based Neural Architecture Search
- Evaluating The Search Phase of Neural Architecture Search
- AtomNAS: Fine-Grained End-to-End Neural Architecture Search
- Fast Neural Network Adaptation via Parameter Remapping and Architecture Search - [GitHub](https://github.com/JaminFong/FNA)
- Once for All: Train One Network and Specialize it for Efficient Deployment
- PC-DARTS: Partial Channel Connections for Memory-Efficient Architecture Search
- NAS evaluation is frustratingly hard - [GitHub](https://github.com/antoyang/NAS-Benchmark)
- FasterSeg: Searching for Faster Real-time Semantic Segmentation
- Computation Reallocation for Object Detection
- Towards Fast Adaptation of Neural Architectures with Meta Learning - [GitHub](https://github.com/dongzelian/T-NAS)
- AssembleNet: Searching for Multi-Stream Neural Connectivity in Video Architectures
- Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search
- Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search - [GitHub](https://github.com/microsoft/Cream)
- Does Unsupervised Architecture Representation Learning Help Neural Architecture Search
- RandAugment: Practical Automated Data Augmentation with a Reduced Search Space
- Delta-STN: Efficient Bilevel Optimization for Neural Networks using Structured Response Jacobians
- A Study on Encodings for Neural Architecture Search
- AutoBSS: An Efficient Algorithm for Block Stacking Style Search
- Bridging the Gap between Sample-based and One-shot Neural Architecture Search with BONAS
- Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding
- Revisiting Parameter Sharing for Automatic Neural Channel Number Search
- Learning Search Space Partition for Black-box Optimization using Monte Carlo Tree Search
- Neural Architecture Search using Deep Neural Networks and Monte Carlo Tree Search
- Representation Sharing for Fast Object Detector Search and Beyond
- PyGlove: Symbolic Programming for Automated Machine Learning
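Several one-shot entries in this section, in particular the single-path uniform-sampling paper, share one basic mechanic: a weight-sharing supernet holds every candidate operation, and each training step activates exactly one uniformly sampled operation per layer. Below is a minimal sketch of that sampling loop in PyTorch; the module names, widths, and three-op candidate set are illustrative placeholders, not code from any repository linked above.

```python
import random

import torch.nn as nn
import torch.nn.functional as F


class ChoiceLayer(nn.Module):
    """Holds several candidate operations; forward() runs only the chosen one."""
    def __init__(self, candidates):
        super().__init__()
        self.ops = nn.ModuleList(candidates)

    def forward(self, x, choice):
        return self.ops[choice](x)


class SinglePathSupernet(nn.Module):
    """A tiny weight-sharing supernet: every layer exposes three candidate ops."""
    def __init__(self, width=16, depth=4, num_classes=10):
        super().__init__()

        def candidates():
            return [
                nn.Sequential(nn.Conv2d(width, width, 3, padding=1), nn.ReLU()),
                nn.Sequential(nn.Conv2d(width, width, 5, padding=2), nn.ReLU()),
                nn.Identity(),
            ]

        self.stem = nn.Conv2d(3, width, 3, padding=1)
        self.layers = nn.ModuleList([ChoiceLayer(candidates()) for _ in range(depth)])
        self.head = nn.Linear(width, num_classes)

    def forward(self, x, path):
        x = self.stem(x)
        for layer, choice in zip(self.layers, path):
            x = layer(x, choice)
        return self.head(x.mean(dim=(2, 3)))  # global average pooling


def train_step(model, optimizer, images, labels):
    # Uniformly sample one operation per layer, then update only that path.
    path = [random.randrange(len(layer.ops)) for layer in model.layers]
    optimizer.zero_grad()
    loss = F.cross_entropy(model(images, path), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

After supernet training, the papers above typically rank candidate paths by evaluating them with the inherited shared weights, often under a random or evolutionary search over paths.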
2019 Venues
- Self-Tuning Networks: Bilevel Optimization of Hyperparameters using Structured Best-Response Functions
- DATA: Differentiable ArchiTecture Approximation
- Random Search and Reproducibility for Neural Architecture Search
- Improved Differentiable Architecture Search for Language Modeling and Named Entity Recognition
- Continual and Multi-Task Architecture Search
- Progressive Differentiable Architecture Search: Bridging the Depth Gap Between Search and Evaluation
- Multinomial Distribution Learning for Effective Neural Architecture Search - [GitHub](https://github.com/tanglang96/MDENAS)
- Searching for MobileNetV3
- Fast and Practical Neural Architecture Search
- Teacher Guided Architecture Search
- AutoDispNet: Improving Disparity Estimation With AutoML
- Resource Constrained Neural Network Architecture Search: Will a Submodularity Assumption Help?
- One-Shot Neural Architecture Search via Self-Evaluated Template Network
- Evolving Space-Time Neural Architectures for Videos
- AutoGAN: Neural Architecture Search for Generative Adversarial Networks
- Discovering Neural Wirings
- Towards modular and programmable architecture search - [GitHub](https://github.com/negrinho/deep_architect)
- Network Pruning via Transformable Architecture Search
- Deep Active Learning with a Neural Architecture Search
- DetNAS: Backbone Search for Object Detection
- SpArSe: Sparse Architecture Search for CNNs on Resource-Constrained Microcontrollers
- Efficient Forward Architecture Search
- XNAS: Neural Architecture Search with Expert Advice
- DARTS: Differentiable Architecture Search (see the mixed-operation sketch after this list)
- ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware
- Graph HyperNetworks for Neural Architecture Search
- Learnable Embedding Space for Efficient Neural Architecture Compression
- Efficient Multi-Objective Neural Architecture Search via Lamarckian Evolution
- SNAS: stochastic neural architecture search
- NetTailor: Tuning the Architecture, Not Just the Weights
- Searching for A Robust Neural Architecture in Four GPU Hours
- ChamNet: Towards Efficient Network Design through Platform-Aware Model Adaptation
- Partial Order Pruning: for Best Speed/Accuracy Trade-off in Neural Architecture Search
- FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search
- RENAS: Reinforced Evolutionary Neural Architecture Search
- Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic Image Segmentation
- MnasNet: Platform-Aware Neural Architecture Search for Mobile
- MFAS: Multimodal Fusion Architecture Search
- A Neurobiological Evaluation Metric for Neural Network Model Search
- Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells
- Customizable Architecture Search for Semantic Segmentation
- Regularized Evolution for Image Classifier Architecture Search
- The Evolved Transformer
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
- NAS-Bench-101: Towards Reproducible Neural Architecture Search
- On Network Design Spaces for Visual Recognition
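Many of the differentiable-NAS entries in this section build on the DARTS relaxation referenced above: each edge computes a softmax-weighted mixture of candidate operations, and those mixture weights (architecture parameters) are trained by gradient descent alongside the network weights on a held-out set. A condensed, illustrative sketch of the mixed operation and the two-optimizer setup follows; it is not the official DARTS implementation, and the candidate-op list and learning rates are placeholder choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    """DARTS-style continuous relaxation: a softmax-weighted sum of candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        # One architecture parameter (alpha) per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


def make_optimizers(model, weight_lr=0.025, alpha_lr=3e-4):
    """Approximate the bilevel problem with two optimizers: network weights are
    updated on training data, alphas on validation data."""
    alphas = [p for n, p in model.named_parameters() if n.endswith("alpha")]
    weights = [p for n, p in model.named_parameters() if not n.endswith("alpha")]
    return (torch.optim.SGD(weights, lr=weight_lr, momentum=0.9),
            torch.optim.Adam(alphas, lr=alpha_lr, weight_decay=1e-3))
```

After search, each mixed operation is discretised to its highest-weight candidate, which is the step that later work on architecture selection and search-phase evaluation in this list scrutinises.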
2018 Venues
- Towards Automatically-Tuned Deep Neural Networks - [GitHub](https://github.com/automl/Auto-PyTorch)
- NetAdapt: Platform-Aware Neural Network Adaptation for Mobile Applications - [GitHub](https://github.com/denru01/netadapt)
- Efficient Architecture Search by Network Transformation
- Learning Transferable Architectures for Scalable Image Recognition
- N2N learning: Network to Network Compression via Policy Gradient Reinforcement Learning
- A Flexible Approach to Automated RNN Architecture Generation
- Practical Block-wise Neural Network Architecture Generation
- [Efficient Neural Architecture Search via Parameter Sharing](http://proceedings.mlr.press/v80/pham18a.html) - [GitHub](https://github.com/melodyguan/enas)
- Path-Level Network Transformation for Efficient Architecture Search
- Hierarchical Representations for Efficient Architecture Search
- Understanding and Simplifying One-Shot Architecture Search
- SMASH: One-Shot Model Architecture Search through HyperNetworks
- Neural Architecture Optimization
- Searching for efficient multi-scale architectures for dense image prediction
- Progressive Neural Architecture Search
- Neural Architecture Search with Bayesian Optimisation and Optimal Transport
- Differentiable Neural Network Architecture Search
- Accelerating Neural Architecture Search using Performance Prediction (see the predictor sketch after this list)
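Predictor-based entries such as the performance-prediction paper above (and the surrogate used in Progressive Neural Architecture Search) fit a cheap model on a handful of evaluated architectures and use it to rank unseen candidates. A minimal ridge-regression sketch over one-hot architecture encodings is shown below; the encoding scheme and helper names are illustrative assumptions, not any paper's actual pipeline.

```python
import numpy as np


def encode(arch, num_layers, num_ops):
    """One-hot encode an architecture given as a list of op indices per layer."""
    code = np.zeros(num_layers * num_ops)
    for layer, op in enumerate(arch):
        code[layer * num_ops + op] = 1.0
    return code


def fit_predictor(archs, accuracies, num_layers, num_ops, reg=1e-3):
    """Ridge-regression surrogate mapping architecture encodings to accuracy."""
    X = np.stack([encode(a, num_layers, num_ops) for a in archs])
    y = np.asarray(accuracies)
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)


def rank_candidates(candidates, w, num_layers, num_ops, top_k=5):
    """Score unseen candidates with the surrogate and return the top_k."""
    scores = [encode(a, num_layers, num_ops) @ w for a in candidates]
    order = np.argsort(scores)[::-1][:top_k]
    return [candidates[i] for i in order]
```

Only the top-ranked candidates are then trained for real, which is where the cost saving comes from.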
2017 Venues
- Neural Architecture Search with Reinforcement Learning
- Designing Neural Network Architectures using Reinforcement Learning - [GitHub](https://github.com/bowenbaker/metaqnn)
- Neural Optimizer Search with Reinforcement Learning
- [Large-Scale Evolution of Image Classifiers](https://arxiv.org/pdf/1703.01041.pdf)
- Learning Curve Prediction with Bayesian Neural Networks
- Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization (see the successive-halving sketch after this list)
- Hyperparameter Optimization: A Spectral Approach - [GitHub](https://github.com/callowbird/Harmonica)
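Hyperband, listed above, builds on successive halving: give every configuration a small budget, keep the best fraction, and repeat with a larger budget. A compact sketch of one successive-halving bracket follows; `evaluate(config, budget)` is a placeholder for whatever training-and-validation routine the user supplies, and higher scores are assumed to be better.

```python
def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Run one bracket: evaluate all configs on a small budget, keep the top
    1/eta by score, and multiply the budget by eta each round."""
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1:
        scored = [(evaluate(cfg, budget), cfg) for cfg in survivors]
        scored.sort(key=lambda pair: pair[0], reverse=True)  # higher is better
        keep = max(1, len(survivors) // eta)
        survivors = [cfg for _, cfg in scored[:keep]]
        budget *= eta
    return survivors[0]
```

Hyperband itself runs several such brackets with different trade-offs between the number of configurations and the starting budget, so that neither aggressive nor conservative early stopping dominates.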
Previous Venues
arXiv
- NSGA-NET: A Multi-Objective Genetic Algorithm for Neural Architecture Search
- Population Based Training of Neural Networks (see the exploit-and-explore sketch after this list)
- EmotionNAS: Two-stream Architecture Search for Speech Emotion Recognition
- U-Boost NAS: Utilization-Boosted Differentiable Neural Architecture Search
- A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions
- Automated Machine Learning on Graphs: A Survey - [GitHub](https://github.com/THUMNLab/AutoGL)
- On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice
- AutonoML: Towards an Integrated Framework for Autonomous Machine Learning
- Automated Machine Learning
- Neural architecture search: A survey
- A Survey on Neural Architecture Search
- Taking human out of learning applications: A survey on automated machine learning
- IoT Data Analytics in Dynamic Environments: From An Automated Machine Learning Perspective
- Training Frankenstein’s Creature to Stack: HyperTree Architecture Search
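Population Based Training, listed above, trains a population of workers in parallel and periodically has underperformers copy (exploit) the weights and hyperparameters of better workers, then perturb (explore) those hyperparameters. A toy sketch of that exploit/explore step follows; the `Worker` fields, truncation fraction, and perturbation factors are illustrative assumptions, not the paper's API.

```python
import copy
import random
from dataclasses import dataclass, field


@dataclass
class Worker:
    hyperparams: dict
    weights: dict = field(default_factory=dict)
    score: float = 0.0


def exploit_and_explore(population, truncation=0.2, perturb=(0.8, 1.2)):
    """Bottom-ranked workers copy a top worker, then jitter its hyperparameters."""
    ranked = sorted(population, key=lambda w: w.score, reverse=True)
    cutoff = max(1, int(len(ranked) * truncation))
    top, bottom = ranked[:cutoff], ranked[-cutoff:]
    for worker in bottom:
        donor = random.choice(top)
        worker.weights = copy.deepcopy(donor.weights)  # exploit: copy weights
        worker.hyperparams = {                         # explore: perturb floats
            k: v * random.choice(perturb) if isinstance(v, float) else v
            for k, v in donor.hyperparams.items()
        }
    return population
```

Between these steps each worker simply continues ordinary training with its current weights and hyperparameters.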
2021 Venues
- CATE: Computation-aware Neural Architecture Encoding with Transformers
- Searching by Generating: Flexible and Efficient One-Shot NAS with Architecture Generator
- Zen-NAS: A Zero-Shot NAS for High-Performance Deep Image Recognition
- FBNetV3: Joint Architecture-Recipe Search using Neural Acquisition Function
- AutoFormer: Searching Transformers for Visual Recognition
- LightTrack: Finding Lightweight Neural Networks for Object Tracking via One-Shot Architecture Search
- One-Shot Neural Ensemble Architecture Search by Diversity-Guided Search Space Shrinking
- DARTS-: Robustly Stepping out of Performance Collapse Without Indicators
- Zero-Cost Proxies for Lightweight NAS (see the proxy sketch after this list)
- Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective - [GitHub](https://github.com/VITA-Group/TENAS)
- DrNAS: Dirichlet Neural Architecture Search
- Rethinking Architecture Selection in Differentiable NAS
- Evolving Reinforcement Learning Algorithms
- AutoHAS: Differentiable Hyper-parameter and Architecture Search
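The zero-cost proxy entry above scores untrained candidate networks from a single minibatch instead of training them. Below is a hedged sketch of one simple gradient-norm-style proxy, which is one of several proxies studied in that line of work; this is an illustration under assumed inputs, not the paper's reference implementation.

```python
import torch.nn.functional as F


def gradnorm_score(model, images, labels):
    """Score an untrained model by the total gradient norm from one minibatch."""
    model.zero_grad()
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += p.grad.norm().item()
    return total


def rank_by_proxy(candidates, images, labels):
    """Rank (name, untrained nn.Module) pairs by the proxy, best first."""
    scores = {name: gradnorm_score(model, images, labels) for name, model in candidates}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Because no training is involved, proxies like this can screen thousands of architectures in minutes, at the cost of a noisier correlation with final accuracy.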