{"id":13487131,"url":"https://github.com/maxpumperla/hyperas","last_synced_at":"2025-05-14T12:08:34.307Z","repository":{"id":40643749,"uuid":"52094077","full_name":"maxpumperla/hyperas","owner":"maxpumperla","description":"Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization","archived":false,"fork":false,"pushed_at":"2023-01-05T06:02:49.000Z","size":776,"stargazers_count":2180,"open_issues_count":97,"forks_count":318,"subscribers_count":61,"default_branch":"master","last_synced_at":"2025-04-11T04:57:30.142Z","etag":null,"topics":["hyperopt","hyperparameter-optimization","keras"],"latest_commit_sha":null,"homepage":"http://maxpumperla.com/hyperas/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/maxpumperla.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null},"funding":{"github":null,"patreon":null,"open_collective":null,"ko_fi":null,"tidelift":null,"community_bridge":null,"liberapay":null,"issuehunt":null,"otechie":null,"custom":null}},"created_at":"2016-02-19T14:45:10.000Z","updated_at":"2025-04-06T17:14:34.000Z","dependencies_parsed_at":"2023-02-03T14:47:19.079Z","dependency_job_id":null,"html_url":"https://github.com/maxpumperla/hyperas","commit_stats":null,"previous_names":[],"tags_count":9,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/maxpumperla%2Fhyperas","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/maxpumperla%2Fhyperas/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/maxpumperla%2Fhyperas/releases","manifests_url":"https://repos.ecosyste
.ms/api/v1/hosts/GitHub/repositories/maxpumperla%2Fhyperas/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/maxpumperla","download_url":"https://codeload.github.com/maxpumperla/hyperas/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248345273,"owners_count":21088244,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["hyperopt","hyperparameter-optimization","keras"],"created_at":"2024-07-31T18:00:55.692Z","updated_at":"2025-04-11T04:57:36.335Z","avatar_url":"https://github.com/maxpumperla.png","language":"Python","readme":"# Hyperas [![Build Status](https://travis-ci.org/maxpumperla/hyperas.svg?branch=master)](https://travis-ci.org/maxpumperla/hyperas)  [![PyPI version](https://badge.fury.io/py/hyperas.svg)](https://badge.fury.io/py/hyperas)\nHyperas brings fast experimentation with Keras and hyperparameter optimization with Hyperopt together.\nIt lets you use the power of hyperopt without having to learn its syntax.\nInstead, just define your keras model as you are used to, but use a simple template notation to define hyper-parameter ranges to tune.\n\n## Installation\n```bash\npip install hyperas\n```\n\n## Quick start\n\nAssume you have data generated as follows\n\n```python\ndef data():\n    x_train = np.zeros(100)\n    x_test = np.zeros(100)\n    y_train = np.zeros(100)\n    y_test = np.zeros(100)\n    return x_train, y_train, x_test, y_test\n```\n\nand an existing keras model like the following\n\n```python\ndef create_model(x_train, y_train, x_test, y_test):\n    model = 
Sequential()\n    model.add(Dense(512, input_shape=(784,)))\n    model.add(Activation('relu'))\n    model.add(Dropout(0.2))\n    model.add(Dense(512))\n    model.add(Activation('relu'))\n    model.add(Dropout(0.2))\n    model.add(Dense(10))\n    model.add(Activation('softmax'))\n\n    # ... model fitting\n\n    return model\n```\n\n\nTo do hyper-parameter optimization on this model,\njust wrap the parameters you want to optimize into double curly brackets\nand choose a distribution over which to run the algorithm.\n\nIn the above example, let's say we want to optimize\nfor the best dropout probability in both dropout layers.\nChoosing a uniform distribution over the interval ```[0,1]```,\nthis translates into the following definition.\nNote that before returning the model to be optimized,\nwe also have to define which evaluation metric of the model is important to us.\nFor example, in the following, we optimize for accuracy.\n\n**Note**: In the following code we use `'loss': -accuracy`, i.e. the negative of accuracy. That's because under the hood `hyperopt` will always minimize whatever metric you provide. If instead you actually want to minimize a metric, say MSE or another loss function, keep a positive sign (e.g. `'loss': mse`).\n\n\n```python\nfrom hyperas.distributions import uniform\n\ndef create_model(x_train, y_train, x_test, y_test):\n    model = Sequential()\n    model.add(Dense(512, input_shape=(784,)))\n    model.add(Activation('relu'))\n    model.add(Dropout({{uniform(0, 1)}}))\n    model.add(Dense(512))\n    model.add(Activation('relu'))\n    model.add(Dropout({{uniform(0, 1)}}))\n    model.add(Dense(10))\n    model.add(Activation('softmax'))\n\n    # ... 
model fitting\n\n    score = model.evaluate(x_test, y_test, verbose=0)\n    accuracy = score[1]\n    return {'loss': -accuracy, 'status': STATUS_OK, 'model': model}\n```\n\nThe last step is to actually run the optimization, which is done as follows:\n\n```python\nbest_run, best_model = optim.minimize(model=create_model,\n                                      data=data,\n                                      algo=tpe.suggest,\n                                      max_evals=10,\n                                      trials=Trials())\n```\nIn this example we use at most 10 evaluation runs and the TPE algorithm from hyperopt for optimization.\n\nCheck the \"complete example\" below for more details.\n\n\n## Complete example\n**Note:** It is important to wrap your data and model into functions as shown below, and then pass them as parameters to the minimizer. ```data()``` returns the data that ```create_model()``` needs. An extended version of the above example in one script reads as follows. This example shows many potential use cases of hyperas, including:\n- Varying dropout probabilities, sampling from a uniform distribution\n- Different layer output sizes\n- Different optimization algorithms to use\n- Varying choices of activation functions\n- Conditionally adding layers depending on a choice\n- Swapping whole sets of layers\n\n\n```python\nfrom __future__ import print_function\nimport numpy as np\n\nfrom hyperopt import Trials, STATUS_OK, tpe\nfrom keras.datasets import mnist\nfrom keras.layers.core import Dense, Dropout, Activation\nfrom keras.models import Sequential\nfrom keras.utils import np_utils\n\nfrom hyperas import optim\nfrom hyperas.distributions import choice, uniform\n\n\ndef data():\n    \"\"\"\n    Data providing function:\n\n    This function is separated from create_model() so that hyperopt\n    won't reload data for each evaluation run.\n    \"\"\"\n    (x_train, y_train), (x_test, y_test) = mnist.load_data()\n    x_train = x_train.reshape(60000, 784)\n    x_test = x_test.reshape(10000, 784)\n    x_train 
= x_train.astype('float32')\n    x_test = x_test.astype('float32')\n    x_train /= 255\n    x_test /= 255\n    nb_classes = 10\n    y_train = np_utils.to_categorical(y_train, nb_classes)\n    y_test = np_utils.to_categorical(y_test, nb_classes)\n    return x_train, y_train, x_test, y_test\n\n\ndef create_model(x_train, y_train, x_test, y_test):\n    \"\"\"\n    Model providing function:\n\n    Create Keras model with double curly brackets dropped-in as needed.\n    Return value has to be a valid python dictionary with two mandatory keys:\n        - loss: Specify a numeric evaluation metric to be minimized\n        - status: Just use STATUS_OK and see hyperopt documentation if not feasible\n    A third key is optional, though recommended, namely:\n        - model: specify the model just created so that we can later use it again.\n    \"\"\"\n    model = Sequential()\n    model.add(Dense(512, input_shape=(784,)))\n    model.add(Activation('relu'))\n    model.add(Dropout({{uniform(0, 1)}}))\n    model.add(Dense({{choice([256, 512, 1024])}}))\n    model.add(Activation({{choice(['relu', 'sigmoid'])}}))\n    model.add(Dropout({{uniform(0, 1)}}))\n\n    # If we choose 'four', add an additional fourth layer\n    if {{choice(['three', 'four'])}} == 'four':\n        model.add(Dense(100))\n\n        # We can also choose between complete sets of layers\n\n        model.add({{choice([Dropout(0.5), Activation('linear')])}})\n        model.add(Activation('relu'))\n\n    model.add(Dense(10))\n    model.add(Activation('softmax'))\n\n    model.compile(loss='categorical_crossentropy', metrics=['accuracy'],\n                  optimizer={{choice(['rmsprop', 'adam', 'sgd'])}})\n\n    result = model.fit(x_train, y_train,\n              batch_size={{choice([64, 128])}},\n              epochs=2,\n              verbose=2,\n              validation_split=0.1)\n    # get the highest validation accuracy of the training epochs\n    validation_acc = np.amax(result.history['val_acc'])\n    
print('Best validation acc of epoch:', validation_acc)\n    return {'loss': -validation_acc, 'status': STATUS_OK, 'model': model}\n\n\nif __name__ == '__main__':\n    best_run, best_model = optim.minimize(model=create_model,\n                                          data=data,\n                                          algo=tpe.suggest,\n                                          max_evals=5,\n                                          trials=Trials())\n    X_train, Y_train, X_test, Y_test = data()\n    print(\"Evaluation of best performing model:\")\n    print(best_model.evaluate(X_test, Y_test))\n    print(\"Best performing model chosen hyper-parameters:\")\n    print(best_run)\n```\n\n## FAQ\n\nHere is a list of a few common errors:\n\n### `TypeError: require string label`\n\nYou're probably trying to execute the model creation code, with the templates, directly in python.\nThat fails simply because python cannot run the templating in the braces, e.g. `{{uniform..}}`.\nThe `def create_model(...)` function is in fact not a valid python function anymore.\n\nYou need to wrap your code in a `def create_model(...): ...` function,\nand then call it from `optim.minimize(model=create_model,...` like in the example.\n\nThe reason for this is that hyperas works by doing template replacement\nof everything in the `{{...}}` into a separate temporary file,\nand then running the model with the replaced braces (think jinja templating).\n\nThis is the basis of how hyperas simplifies usage of hyperopt by being a \"very simple wrapper\".\n\n\n### `TypeError: 'generator' object is not subscriptable`\n\nThis is currently a [known issue](https://github.com/maxpumperla/hyperas/issues/125).\n\nJust `pip install networkx==1.11`\n\n\n### `NameError: global name 'X_train' is not defined`\n\nMaybe you forgot to return `x_train` from your `def data(): ...` function,\neven though `def create_model(x_train, ...)` expects it as an argument.\n\nYou are not restricted to the same list of arguments as in the 
example.\nAny arguments you return from `data()` will be passed to `create_model()`.\n\n### Notebook adjustment\n\nIf you encounter an error like [\"No such file or directory\"](https://github.com/maxpumperla/hyperas/issues/83) or [OSError, Err22](https://github.com/maxpumperla/hyperas/issues/149), you may need to add `notebook_name='simple_notebook'` (assuming your current notebook's name is `simple_notebook`) to the `optim.minimize` call, like this:\n\n```python\nbest_run, best_model = optim.minimize(model=model,\n                                      data=data,\n                                      algo=tpe.suggest,\n                                      max_evals=5,\n                                      trials=Trials(),\n                                      notebook_name='simple_notebook')\n```\n\n### How does hyperas work?\n\nAll we do is parse the `data` and `model` templates and translate them into proper `hyperopt` code by reconstructing the `space` object that is then passed to `fmin`. Most of the relevant code is found in [optim.py](https://github.com/maxpumperla/hyperas/blob/master/hyperas/optim.py) and [utils.py](https://github.com/maxpumperla/hyperas/blob/master/hyperas/utils.py).\n\n### How to read the output of a hyperas model?\n\nHyperas translates your script into `hyperopt` compliant code, see [here](https://github.com/maxpumperla/hyperas/issues/140) for some guidance on how to interpret the result.\n\n### How to pass arguments to data?\n\nSuppose you want your data function to take an argument; specify it like this, using positional arguments only (not keyword arguments):\n\n```python\nimport pickle\ndef data(fname):\n    with open(fname,'rb') as fh:\n        return pickle.load(fh)\n```\nNote that your arguments must be implemented such that `repr` can show them in their entirety (such as strings and numbers).\nIf you want more complex objects, use the passed arguments to build them inside the `data` function.\n\nAnd when you run your trials, pass a tuple of arguments to 
be substituted in as `data_args`:\n\n```python\nbest_run, best_model = optim.minimize(\n    model=model,\n    data=data,\n    algo=tpe.suggest,\n    max_evals=64,\n    trials=Trials(),\n    data_args=('my_file.pkl',)\n)\n```\n\n### What if I need more flexibility loading data and adapting my model?\n\nHyperas is a convenience wrapper around Hyperopt that has some limitations. If it's not _convenient_ to use in your situation, simply don't use it and choose Hyperopt instead. Everything you can do with Hyperas you can also do with Hyperopt; it's just a different way of defining your model. If you want to squeeze some flexibility out of Hyperas anyway, take a look [here](https://github.com/maxpumperla/hyperas/issues/141).\n\n### Running hyperas in parallel?\n\nYou can use hyperas to run multiple models in parallel with the use of MongoDB (which you'll need to install and set up users for).\nHere's a short example using MNIST:\n\n1. Copy and modify [`examples/mnist_distributed.py`](examples/mnist_distributed.py) (bump up `max_evals` if you like).\n2. Run `python mnist_distributed.py`. It will create a `temp_model.py` file. Copy this file to any machines that will be evaluating models.\n     It will then begin waiting for evaluation results.\n3. On your other machines (make sure they have Python installed with all your dependencies, ideally at the same versions) run:\n    ```bash\n    # PYTHONPATH must point to the directory containing temp_model.py, not to the file itself\n    export PYTHONPATH=/path/to/temp_model_dir\n    hyperopt-mongo-worker --exp-key='mnist_test' --mongo='mongo://username:pass@mongodb.host:27017/jobs'\n    ```\n4. Once `max_evals` runs have completed, you should get an output with your best model. 
You can also look through\n    your MongoDB and examine the results. To get the best model out and run it, do:\n    \n    ```python\n    from pymongo import MongoClient\n    from keras.models import load_model\n    import tempfile\n    c = MongoClient('mongodb://username:pass@mongodb.host:27017/jobs')\n    # sort ascending: the lowest loss (here, the negative validation accuracy) is the best model\n    best_model = c['jobs']['jobs'].find_one({'exp_key': 'mnist_test'}, sort=[('result.loss', 1)])\n    temp_name = tempfile.gettempdir()+'/'+next(tempfile._get_candidate_names()) + '.h5'\n    with open(temp_name, 'wb') as outfile:\n        outfile.write(best_model['result']['model_serial'])\n    model = load_model(temp_name)\n    ```\n","funding_links":[],"categories":["The Data Science Toolbox","Uncategorized","Python","Deep Learning","Deep Learning Framework","超参数优化和AutoML","参数优化","Tensor Flow","Frameworks","Deep Learning Tools","Hyperparameter Tuning"],"sub_categories":["Deep Learning Packages","Uncategorized","TensorFlow","Auto ML \u0026 Hyperparameter Optimization","Automated Machine Learning"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmaxpumperla%2Fhyperas","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmaxpumperla%2Fhyperas","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmaxpumperla%2Fhyperas/lists"}