Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/fabianp/hoag
Hyperparameter optimization with approximate gradient
- Host: GitHub
- URL: https://github.com/fabianp/hoag
- Owner: fabianp
- Created: 2016-02-07T19:07:16.000Z (almost 9 years ago)
- Default Branch: master
- Last Pushed: 2021-03-23T18:07:00.000Z (over 3 years ago)
- Last Synced: 2024-10-04T23:09:04.197Z (about 1 month ago)
- Language: Python
- Size: 1010 KB
- Stars: 65
- Watchers: 9
- Forks: 24
- Open Issues: 1
Metadata Files:
- Readme: README.rst
README
.. image:: https://travis-ci.org/fabianp/hoag.svg?branch=master
   :target: https://travis-ci.org/fabianp/hoag

HOAG
====
Hyperparameter optimization with approximate gradient.

.. image:: https://raw.githubusercontent.com/fabianp/hoag/master/doc/comparison_ho_real_sim.png
   :scale: 50 %

Depends
-------

* scikit-learn 0.16
Usage
-----

This package exports a LogisticRegressionCV class that automatically estimates the L2 regularization of logistic regression. Like other scikit-learn estimators, it has .fit and .predict methods. However, unlike scikit-learn estimators, its .fit method takes four arguments: the train set and the test set. For example::

    >>> from hoag import LogisticRegressionCV
    >>> clf = LogisticRegressionCV()
    >>> clf.fit(X_train, y_train, X_test, y_test)

where X_train, y_train, X_test, y_test are numpy arrays representing the train and test sets, respectively.
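For a fully self-contained version of this workflow, here is a minimal sketch on a synthetic dataset (the data generation and split sizes below are illustrative choices, not part of hoag's documentation):

.. code-block:: python

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    from hoag import LogisticRegressionCV

    # Toy binary classification problem; the dataset shape and
    # 50/50 split are arbitrary choices for illustration.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.5, random_state=0)

    clf = LogisticRegressionCV()
    # The held-out set guides the search for the L2 regularization strength.
    clf.fit(X_train, y_train, X_test, y_test)
    print(clf.predict(X_test)[:5])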
For a full usage example, check out `this ipython notebook <https://github.com/fabianp/hoag/blob/master/doc/example_usage.ipynb>`_.
.. image:: https://raw.githubusercontent.com/fabianp/hoag/master/doc/hoag_screenshot.png
   :target: https://github.com/fabianp/hoag/blob/master/doc/example_usage.ipynb

Usage tips
----------

Standardize the features of the input data so that each feature has unit variance. This makes the Hessian better conditioned. It can be done with, e.g., scikit-learn's StandardScaler, as in the sketch below.
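A minimal sketch (X_train and X_test as above; the scaler is fit on the training set only, so that test-set statistics do not leak into preprocessing):

.. code-block:: python

    from sklearn.preprocessing import StandardScaler

    # Learn per-feature mean and variance on the training set only,
    # then apply the same transformation to the test set.
    scaler = StandardScaler().fit(X_train)
    X_train = scaler.transform(X_train)
    X_test = scaler.transform(X_test)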
Citing
------

If you use this, please cite it as:
.. code-block:: bibtex

    @inproceedings{PedregosaHyperparameter16,
      author    = {Fabian Pedregosa},
      title     = {Hyperparameter optimization with approximate gradient},
      booktitle = {Proceedings of the 33rd International Conference on
                   Machine Learning ({ICML})},
      year      = {2016},
      url       = {http://jmlr.org/proceedings/papers/v48/pedregosa16.html},
    }