Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Stochastic AUC Maximization with Deep Neural Networks
- Host: GitHub
- URL: https://github.com/yzhuoning/deepauc
- Owner: yzhuoning
- Created: 2020-06-17T20:05:44.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2021-04-01T19:03:27.000Z (over 3 years ago)
- Last Synced: 2023-10-20T23:58:35.166Z (about 1 year ago)
- Topics: auc-roc-score, deep-neural-networks, optimization
- Language: Python
- Homepage:
- Size: 24.4 KB
- Stars: 15
- Watchers: 3
- Forks: 2
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
Awesome Lists containing this project
README
# Deep AUC Maximization [![pdf](https://img.shields.io/badge/Arxiv-pdf-orange.svg?style=flat)](https://arxiv.org/abs/1908.10831)
This is the official implementation of the paper "**Stochastic AUC Maximization with Deep Neural Networks**", published at **ICLR 2020**.
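For reference, the AUC-ROC score that the method maximizes can be evaluated offline with scikit-learn (one of the packages listed under Installation below). This is only a toy evaluation sketch with made-up labels and scores, not code from this repository:

```python
# Minimal sketch: evaluating AUC-ROC on held-out predictions with scikit-learn.
# The labels and scores below are made up for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score

labels = np.array([0, 0, 1, 1, 0, 1])                # binary ground-truth labels
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])   # model outputs (higher = more positive)

print("AUC:", roc_auc_score(labels, scores))          # ~0.89 on this toy example
```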
## Installation
```
Python=3.5
Numpy=1.18.5
Scipy=1.2.1
Scikit-Learn=0.20.3
Pillow=5.0.0
Tensorflow>=1.10.0
```

### Run
```
python PPD_SG.py/PPD_AdaGrad.py --dataset=10 --train_batch_size=128 --use_L2=False --split_index=4 --lr=0.01 --keep_index=0.1 --t0=200
```

### Hyperparameter tuning
```
gamma = [500, 1000, 2000, ...]
eta = [0.1, 0.01, ...]
T0 = [1000, 2000, 3000, ...]
```
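A grid search over these ranges can be scripted around the run command above. The sketch below is one possible way to do it, assuming `--lr` corresponds to eta and `--t0` to T0 as in the Run section; gamma is left out because this README does not show which flag it maps to, so adjust to whatever the scripts actually accept:

```python
# Minimal sweep sketch over the suggested ranges, reusing the run command above.
import itertools
import subprocess

etas = [0.1, 0.01]
T0s = [1000, 2000, 3000]

for eta, t0 in itertools.product(etas, T0s):
    cmd = [
        "python", "PPD_SG.py",
        "--dataset=10", "--train_batch_size=128", "--use_L2=False",
        "--split_index=4", f"--lr={eta}", "--keep_index=0.1", f"--t0={t0}",
    ]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)
```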

## BibTeX
If you use this repository in your work, please cite our paper:
```
@inproceedings{
Liu2020Stochastic,
title={Stochastic AUC Maximization with Deep Neural Networks},
author={Mingrui Liu and Zhuoning Yuan and Yiming Ying and Tianbao Yang},
booktitle={International Conference on Learning Representations},
year={2020},
url={https://openreview.net/forum?id=HJepXaVYDr}
}
```