awesome-nonconvex-optimization
A collection of papers and readings for non-convex optimization
https://github.com/cshjin/awesome-nonconvex-optimization
Before 2010s
- Smooth minimization of non-smooth functions
- Introductory Lectures on Convex Programming Volume: A Basic Course
- Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems
- Cubic regularization of Newton method and its global performance
2011
- A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging
2014
- Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
- An Accelerated Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
- A differential equation for modeling Nesterov's accelerated gradient method: Theory and insights
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Stochastic proximal gradient descent with acceleration techniques
- Guaranteed Matrix Completion via Non-convex Factorization
2012
2013
- Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
- Lectures on Modern Convex Optimization
- Efficient accelerated coordinate descent methods and faster algorithms for solving linear systems
- Linear convergence with condition number independent access of full gradients
- Mixed optimization for smooth functions
- Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming
- Accelerating stochastic gradient descent using predictive variance reduction
2015
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- An optimal randomized incremental gradient method
- Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization
- A universal catalyst for first-order optimization
- Improved SVRG for Non-Strongly-Convex or Sum-of-Non-Convex Objectives
- A geometric alternative to Nesterov's accelerated gradient descent
- Faster Eigenvector Computation via Shift-and-Invert Preconditioning
- Fast and Simple PCA via Convex Optimization
- Simple, Efficient, and Neural Algorithms for Sparse Coding
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- SDCA without Duality
- Escaping From Saddle Points --- Online Stochastic Gradient for Tensor Decomposition
2016
- Even Faster Accelerated Coordinate Descent Using Non-Uniform Sampling
- Optimal Black-Box Reductions Between Optimization Objectives
- Variance Reduction for Faster Non-Convex Optimization
- A Variational Perspective on Accelerated Methods in Optimization
- Tight Complexity Bounds for Optimizing Composite Objectives
- Katyusha: The First Direct Acceleration of Stochastic Gradient Methods
- Principal Component Projection Without Principal Component Analysis
- LazySVD: Even Faster SVD Decomposition Yet Without Agonizing Pain
- Stochastic variance reduction for nonconvex optimization
- Fast incremental method for nonconvex optimization
- Fast Stochastic Methods for Nonsmooth Nonconvex Optimization
- Accelerated Methods for Non-Convex Optimization
- Finding Approximate Local Minima Faster Than Gradient Descent
2017
- Less than a Single Pass: Stochastically Controlled Stochastic Gradient
- Nonconvex Finite-Sum Optimization Via SCSG Methods
- Faster Principal Component Regression and Stable Matrix Chebyshev Approximation
- Stochastic primal dual coordinate method with nonuniform sampling based on optimality violations
- Doubly accelerated stochastic variance reduced dual averaging method for regularized empirical risk minimization
- Convergence Analysis of Two-layer Neural Networks with ReLU Activation
- Natasha: Faster Non-Convex Stochastic Optimization via Strongly Non-Convex Parameter
- "Convex Until Proven Guilty": Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions
- How to Escape Saddle Points Efficiently
- Mirror descent in non-convex stochastic programming
2018
- Natasha2: Faster Non-Convex Optimization Than SGD
- Katyusha X: Practical Momentum Method for Stochastic Sum-of-Nonconvex Optimization
- Neon2: Finding Local Minima via First-Order Oracles
- Global Optimality Conditions for Deep Neural Networks
- Gradient Primal-Dual Algorithm Converges to Second-Order Stationary Solutions for Nonconvex Distributed Optimization
- DAGs with NO TEARS: Smooth Optimization for Structure Learning
- First-order Stochastic Algorithms for Escaping From Saddle Points in Almost Linear Time