{"id":19689933,"url":"https://github.com/guybedo/minos","last_synced_at":"2025-04-12T15:12:08.994Z","repository":{"id":57456146,"uuid":"81164816","full_name":"guybedo/minos","owner":"guybedo","description":"Deep learning, architecture and hyper parameters search  with genetic algorithms","archived":false,"fork":false,"pushed_at":"2024-08-02T16:09:07.000Z","size":158,"stargazers_count":52,"open_issues_count":2,"forks_count":10,"subscribers_count":5,"default_branch":"develop","last_synced_at":"2025-04-12T15:12:02.557Z","etag":null,"topics":["deep-learning","genetic-algorithm","hyperparameters","keras-models","python","tensorflow"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/guybedo.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-02-07T04:04:05.000Z","updated_at":"2024-12-30T22:22:54.000Z","dependencies_parsed_at":"2024-12-30T05:22:47.983Z","dependency_job_id":"d8683e8d-652d-4cc0-a4da-01a5774e1fb2","html_url":"https://github.com/guybedo/minos","commit_stats":{"total_commits":150,"total_committers":3,"mean_commits":50.0,"dds":"0.013333333333333308","last_synced_commit":"4e4f0ac328bee6d668ab60db868d6b03db642b82"},"previous_names":[],"tags_count":16,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/guybedo%2Fminos","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/guybedo%2Fminos/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/guybedo%2Fminos
/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/guybedo%2Fminos/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/guybedo","download_url":"https://codeload.github.com/guybedo/minos/tar.gz/refs/heads/develop","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248586244,"owners_count":21128998,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","genetic-algorithm","hyperparameters","keras-models","python","tensorflow"],"created_at":"2024-11-11T19:03:42.830Z","updated_at":"2025-04-12T15:12:08.972Z","avatar_url":"https://github.com/guybedo.png","language":"Python","readme":"# Minos\n\nSearch for neural network architectures \u0026 hyper parameters with genetic algorithms.\nIt is built on top of Keras+Tensorflow to build/train/evaluate the models, and uses DEAP for the genetic algorithms.\n\n## Getting Started\n\nYou need to have tensorflow installed, see [tensorflow linux](requirements-tensorflow-linux.txt) or [tensorflow mac](requirements-tensorflow-mac.txt).\n\nInstall minos:\n```\npip install pyminos==0.5.1\n```\n\nTo run an experiment and search hyper parameters and/or architecture for a model and dataset, you can define a simple layout\nwith the input_size, output_size and output_activation of your model:\n\n```python\nfrom minos.model.model import Layout\nlayout = Layout(\n    input_size=1000,\n    output_size=25,\n    output_activation='softmax')\n```\n\nThen you define the parameters of the training. 
If you specify only the name of the optimizer to use, and no parameters, random parameters will be tested during the experiment, hopefully converging to optimal parameters.\nYou can choose to stop the training after a fixed number of epochs, or when the accuracy of the evaluated model stops increasing.\n\n```python\nfrom minos.model.model import Objective, Optimizer, Metric\nfrom minos.experiment.training import Training, EpochStoppingCondition\ntraining = Training(\n    objective=Objective('categorical_crossentropy'),\n    optimizer=Optimizer(optimizer='Adam'),\n    metric=Metric('categorical_accuracy'),\n    stopping=EpochStoppingCondition(10),\n    batch_size=50)\n```\n\nNow you need to define which parameters will be randomly tested.\nAn ExperimentParameters contains all the parameters that can be tested. It can be initialized with the default values for each parameter so that you only redefine the parameters you want to test, specifying intervals or lists of values, for example.\n\n```python\nfrom minos.experiment.experiment import ExperimentParameters\nexperiment_parameters = ExperimentParameters(use_default_values=True)\n```\n\nYou can then specify the search space for each parameter you want to test.\nFor example, to test architectures with 1 row, 1 block per row, and up to 5 layers per block:\n\n```python\nfrom minos.model.parameter import int_param\nexperiment_parameters.layout_parameter('rows', 1)\nexperiment_parameters.layout_parameter('blocks', 1)\nexperiment_parameters.layout_parameter('layers', int_param(1, 5))\n```\n\nIf you want to test layers with sizes between 10 and 500 units:\n```python\nexperiment_parameters.layer_parameter('Dense.output_dim', int_param(10, 500))\n```\n\nYou can find all the parameters and their default values in [parameters](minos/model/parameters.py).\n\nNow you need to specify the experiment environment.\nYou can choose to run the experiment on CPU or GPU devices, and specify how many jobs are to be run on each device. 
To run on CPU, just use CpuEnvironment instead of GpuEnvironment.\nYou can define the directory where the experiment logs and data are saved. If no directory is defined, a directory named 'minos' is created in the user's home.\n\n```python\nfrom minos.train.utils import GpuEnvironment\nenvironment = GpuEnvironment(\n    ['/gpu:0', '/gpu:1'],\n    n_jobs=[2, 5],\n    data_dir='/data/minos/experiments')\n```\n\nThe Experiment is then created with all the necessary information and the training and validation data.\nTraining and validation data are provided as batch iterators that generate (X, y) tuples.\nYou can use SimpleBatchIterator to create a batch iterator from (X, y) arrays. The iterators need to be able to loop over the data when they reach the end, so you need to set the parameter autoloop=True.\n\n```python\nfrom minos.train.utils import SimpleBatchIterator\nfrom minos.experiment.experiment import Experiment\nbatch_iterator = SimpleBatchIterator(X, y, batch_size=50, autoloop=True)\ntest_batch_iterator = SimpleBatchIterator(test_X, test_y, batch_size=50, autoloop=True)\nexperiment = Experiment(\n    experiment_label='test__reuters_experiment',\n    layout=layout,\n    training=training,\n    batch_iterator=batch_iterator,\n    test_batch_iterator=test_batch_iterator,\n    environment=environment,\n    parameters=experiment_parameters)\n```\n\nThen you specify the population size and number of generations and start the experiment.\n\n```python\nfrom minos.experiment.ga import run_ga_search_experiment\nrun_ga_search_experiment(\n    experiment,\n    population_size=100,\n    generations=100,\n    log_level='DEBUG')\n```\n\nLogs and data will be saved in the specified directory, or ~/minos if no directory is specified.\nThis is what the logs should look like:\n```terminal\n2017-02-21 07:25:26 [INFO] root: Evolving generation 0\n2017-02-21 07:25:26 [DEBUG] root: Training 100 models\n2017-02-21 07:27:29 
[DEBUG] root: Blueprint 0: score 0.438252 after 17 epochs\n2017-02-21 07:27:31 [DEBUG] root: Blueprint 1: score 0.326195 after 20 epochs\n2017-02-21 07:28:13 [DEBUG] root: Blueprint 3: score 0.496040 after 22 epochs\n2017-02-21 07:29:26 [DEBUG] root: Blueprint 4: score 0.835436 after 24 epochs\n2017-02-21 07:30:18 [DEBUG] root: Blueprint 5: score 0.261954 after 21 epochs\n2017-02-21 07:31:02 [DEBUG] root: Blueprint 2: score 0.096509 after 51 epochs\n2017-02-21 07:35:14 [DEBUG] root: Blueprint 7: score 0.370490 after 36 epochs\n2017-02-21 07:38:12 [DEBUG] root: Blueprint 6: score 0.537401 after 104 epochs\n2017-02-21 07:40:25 [DEBUG] root: Blueprint 8: score 0.176298 after 57 epochs\n2017-02-21 07:41:08 [DEBUG] root: Blueprint 11: score 0.063068 after 24 epochs\n2017-02-21 07:45:55 [DEBUG] root: Blueprint 10: score 0.022587 after 65 epoch\n2017-02-21 10:02:29 [INFO] root: [{\"generation\": 0}, {\"average\": 0.36195365556387343}, {\"best_scores\": [0.842769172996606, 0.8392491032735243, 0.8354356464279401]}]\n```\n\nYou can stop the experiment and resume later by setting the 'resume' parameter to True. It will restart at the last epoch saved. 
\n```python\nrun_ga_search_experiment(\n    experiment,\n    population_size=100,\n    generations=100,\n    resume=True)\n```\n\nOnce you are done, you can load the best blueprint produced at a specific step.\n```python\nfrom minos.experiment.experiment import load_experiment_best_blueprint\nfrom minos.train.utils import CpuEnvironment\nblueprint = load_experiment_best_blueprint(\n    experiment_label=experiment.label,\n    step=generations - 1,\n    environment=CpuEnvironment(n_jobs=2, data_dir=tmp_dir))\n```\n\nAnd then build/train/evaluate the model using the Keras API:\n```python\nfrom minos.model.build import ModelBuilder\nfrom minos.train.utils import cpu_device\nmodel = ModelBuilder().build(\n    blueprint,\n    cpu_device())\nmodel.fit_generator(\n    generator=batch_iterator,\n    samples_per_epoch=batch_iterator.samples_per_epoch,\n    nb_epoch=5,\n    validation_data=test_batch_iterator,\n    nb_val_samples=test_batch_iterator.sample_count)\nscore = model.evaluate_generator(\n    test_batch_iterator,\n    val_samples=test_batch_iterator.sample_count)\n```\n\n## Limitations\nThe current version only works with 1D data, so no RNN, LSTM or Convolutions for now...\n\n\n## Concepts\nTo search for hyper parameters and/or layouts, we create an experiment.\nWe define the parameters of the experiment and the dataset, then we run the experiment.\nAn experiment uses a genetic algorithm to search the parameters you define.\nIt consists of generating a population, and evolving the population for a specified number\nof generations.\nIt starts by generating a random population of blueprints from the experiment parameters.\nEach randomly generated blueprint, or individual, is a definition that can be used\nto build a Keras model.\nAt each generation, the blueprints can be mixed and/or mutated, and are then evaluated.\nEvaluating a blueprint consists of building, training and evaluating the Keras model it defines.\nThe best blueprints are selected for the next generation.\n\nTo create an experiment 
you need to define:\n- the layout:\n    input_size, output_size, output_activation of the network.\n      You can also specify the architecture and layers if you want to search parameters\n      for a fixed architecture.\n      If you don't specify any layers, random combinations will be tested.\n- the experiment parameters:\n    these are all the parameters that will be randomly tested.\n      You can decide to test every possible combination, or fix the value of some parameters and\n      let the experiment randomly test others.\n- the training:\n    objective (=loss), metric, stopping condition and optimizer.\n    These training parameters are used to evaluate the randomly generated models.\n      Note that you can either fully specify the optimizer (type+parameters) or specify only\n      a type of optimizer and let the experiment test random parameters.\n\n## Terminology\n\n    Experiment:\n        Defines all the parameters related to the search:\n            - the layout,\n            - the layer parameters,\n            - the training parameters.\n    Layout:\n        A layout defines the architecture of a network. A layout is a vertical stack of rows.\n    Row:\n        A row is a horizontal stack of independent blocks. Each block can be connected to one or more blocks\n        from the row below.\n    Block:\n        A block is a vertical stack of layers. The output of each layer in the block is the input of the\n        layer immediately above.\n    Layer:\n        A Keras layer: Dense, Dropout, ...\n\n    ExperimentParameters:\n        Defines all the parameters that can be tested. This can be layer parameters such as the dropout value,\n        the regularization type and value, etc... 
this can be the parameters of the optimizer...\n        You can:\n            - initialize the ExperimentParameters with the default values for each parameter.\n              In that case you then need to override the parameters you want to search and specify\n              the intervals or collections of values to be randomly tested.\n            - initialize the ExperimentParameters without default values.\n              In that case all the parameters will be randomly tested.\n        The reference parameter intervals and default values can be found in minos.model.parameters.\n\n    Training:\n        Defines the training parameters used to evaluate the randomly generated models.\n        You specify the objective (loss), the metric, the stopping condition and the optimizer.\n        The hyper parameters for the optimizers can also be randomly tested.\n\n\n    Blueprint:\n        Blueprints are generated randomly from the experiment parameters you specify.\n        A blueprint is the definition that is used to build and train/evaluate a Keras model.\n        During the experiment, random blueprints are generated, mixed, mutated and evaluated\n        by training and evaluating the Keras model they define.\n\n    Model:\n        A Keras model built using a blueprint.\n\n## Documentation\nFor now there is no documentation. The best thing to do is to have a look at the examples in https://github.com/guybedo/minos/tree/develop/examples.\nMinos is quite straightforward to use; the examples should be enough to start trying things and running experiments.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fguybedo%2Fminos","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fguybedo%2Fminos","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fguybedo%2Fminos/lists"}