Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Towards stability and optimality in stochastic gradient descent
- Host: GitHub
- URL: https://github.com/airoldilab/ai-sgd
- Owner: airoldilab
- Created: 2014-12-06T23:35:35.000Z (about 10 years ago)
- Default Branch: master
- Last Pushed: 2015-05-21T20:30:25.000Z (over 9 years ago)
- Last Synced: 2024-03-27T12:18:03.817Z (9 months ago)
- Language: R
- Homepage:
- Size: 656 KB
- Stars: 5
- Watchers: 8
- Forks: 2
- Open Issues: 9
Metadata Files:
- Readme: README.md
README
# Stability and optimality in stochastic gradient descent
This repository contains the accompanying code for the methods and algorithms
developed in a paper in progress.
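The references below center on averaged stochastic gradient descent. As a rough
illustration of that idea, here is a minimal R sketch of Polyak-Ruppert averaged
SGD for least-squares regression; the `averaged_sgd` function, its step-size
schedule, and the simulated data are illustrative assumptions, not this
repository's actual code.

```r
# A minimal sketch of Polyak-Ruppert averaged SGD for least-squares
# regression, illustrating the averaging idea studied in the references
# below (Ruppert, 1988; Xu, 2011; Bach and Moulines, 2013).
# NOTE: not this repository's API; the function name, the step-size
# schedule a * (1 + b*t)^(-c), and the simulated data are illustrative.

averaged_sgd <- function(X, y,
                         lr = function(t) 0.1 * (1 + 0.01 * t)^(-0.75)) {
  n <- nrow(X)
  p <- ncol(X)
  theta <- rep(0, p)      # current SGD iterate
  theta_bar <- rep(0, p)  # running average of the iterates
  for (t in seq_len(n)) {
    i <- sample.int(n, 1)                             # draw one observation
    grad <- -(y[i] - sum(X[i, ] * theta)) * X[i, ]    # grad of 0.5*(y - x'theta)^2
    theta <- theta - lr(t) * grad                     # plain SGD step
    theta_bar <- theta_bar + (theta - theta_bar) / t  # online mean of iterates
  }
  theta_bar
}

# Example: recover the coefficients of a simulated linear model.
set.seed(42)
n <- 1e4; p <- 5
X <- matrix(rnorm(n * p), n, p)
y <- drop(X %*% (1:p)) + rnorm(n)
round(averaged_sgd(X, y), 2)  # should be close to 1, 2, 3, 4, 5
```

Returning the average of the iterates, rather than the last iterate, is what
underlies the O(1/n) convergence rates analyzed by Bach and Moulines (2013).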
## Maintainer

* Dustin Tran

## References
* Francis Bach and Eric Moulines. Non-strongly-convex smooth stochastic
approximation with convergence rate O(1/n). *Advances in Neural Information
Processing Systems*, 2013.
* Jerome Friedman, Trevor Hastie, and Robert Tibshirani. Regularization paths
for generalized linear models via coordinate descent. *Journal of Statistical
Software*, 33(1):1-22, 2010.
* Rie Johnson and Tong Zhang. Accelerating stochastic gradient descent using
predictive variance reduction. *Advances in Neural Information Processing
Systems*, 2013.
* David Ruppert. Efficient estimations from a slowly convergent Robbins-Monro
process. Technical report, Cornell University Operations Research and
Industrial Engineering, 1988.
* Wei Xu. Towards optimal one pass large scale learning with averaged stochastic
gradient descent. *arXiv preprint
[arXiv:1107.2490](http://arxiv.org/abs/1107.2490)*, 2011.