{"id":13801686,"url":"https://github.com/openmole/mgo","last_synced_at":"2025-04-07T11:07:35.755Z","repository":{"id":4461861,"uuid":"5600578","full_name":"openmole/mgo","owner":"openmole","description":"Purely functional genetic algorithms for multi-objective optimisation","archived":false,"fork":false,"pushed_at":"2025-03-08T08:02:07.000Z","size":3645,"stargazers_count":72,"open_issues_count":9,"forks_count":5,"subscribers_count":13,"default_branch":"master","last_synced_at":"2025-03-31T09:08:56.663Z","etag":null,"topics":["functional-programming","genetic-algorithm","hyperparameter-optimization","hyperparameter-tuning","hyperparameters","optimisation","parameter-tuning","scala"],"latest_commit_sha":null,"homepage":"","language":"Scala","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/openmole.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2012-08-29T13:04:39.000Z","updated_at":"2025-03-08T08:02:08.000Z","dependencies_parsed_at":"2024-01-30T22:35:49.885Z","dependency_job_id":"e4b2ebad-c1bf-4df1-ac7c-c6240de19ac9","html_url":"https://github.com/openmole/mgo","commit_stats":{"total_commits":1167,"total_committers":10,"mean_commits":116.7,"dds":"0.23479005998286206","last_synced_commit":"6832c733815318cf21aa6d1787e398f72b1639fb"},"previous_names":[],"tags_count":141,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openmole%2Fmgo","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openmole%2Fmgo/tags","releases_url":"https://repos.ecosyste.ms/api/v1/
hosts/GitHub/repositories/openmole%2Fmgo/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openmole%2Fmgo/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/openmole","download_url":"https://codeload.github.com/openmole/mgo/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247640462,"owners_count":20971557,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["functional-programming","genetic-algorithm","hyperparameter-optimization","hyperparameter-tuning","hyperparameters","optimisation","parameter-tuning","scala"],"created_at":"2024-08-04T00:01:25.907Z","updated_at":"2025-04-07T11:07:35.739Z","avatar_url":"https://github.com/openmole.png","language":"Scala","readme":"MGO\n===\n\nMGO is a purely functional Scala library for evolutionary / genetic algorithms that:\n* enforces immutability,\n* exposes a modular and extensible architecture,\n* implements state-of-the-art algorithms,\n* handles noisy (stochastic) fitness functions,\n* implements auto-adaptative algorithms,\n* implements algorithms with distributed computing in mind for integration with [OpenMOLE](http://openmole.org).\n\nMGO implements NSGAII, NSGA3, CP (Calibration Profile), PSE (Pattern Search Experiment), OSE (Antecedent research), Niched Evolution, and ABC (Bayesian Calibration).\n\nLicence\n-------\n\nMGO is licensed under the GNU GPLv3 software licence. 
\n\nExample\n-------\n\nDefine a problem, for instance the multi-modal multi-objective ZDT4 benchmark:\n\n```scala\n\n  object zdt4 {\n\n    def continuous(size: Int) = Vector.fill(size)(C(0.0, 5.0))\n    \n    def compute(genome: Vector[Double], d: Vector[Int]): Vector[Double] = {\n      val genomeSize = genome.size\n\n      def g(x: Seq[Double]) = 1 + 10 * (genomeSize - 1) + x.map { i =\u003e pow(i, 2) - 10 * cos(4 * Pi * i) }.sum\n\n      def f(x: Seq[Double]) = {\n        val gx = g(x)\n        gx * (1 - sqrt(genome(0) / gx))\n      }\n\n      Vector(genome(0), f(genome.tail))\n    }\n\n }\n\n```\n\nDefine the optimisation algorithm, for instance NSGAII:\n\n```scala\n\n  import mgo.evolution._\n  import mgo.evolution.algorithm._\n  \n  // For zdt4\n  import mgo.test._\n\n  val nsga2 =\n    NSGA2(\n      mu = 100,\n      lambda = 100,\n      fitness = zdt4.compute,\n      continuous = zdt4.continuous(10))\n\n```\n\nRun the optimisation:\n\n```scala\n\n  def evolution =\n    nsga2.\n      until(afterGeneration(1000)).\n      trace((s, is) =\u003e println(s.generation))\n\n  val (finalState, finalPopulation) = evolution.eval(new util.Random(42))\n\n  println(NSGA2.result(nsga2, finalPopulation).mkString(\"\\n\"))\n  \n```\n\nNoisy fitness functions\n-----------------------\n\nAll algorithms in MGO have a version that works with noisy fitness functions. MGO handles noisy fitness functions by resampling\nonly the most promising individuals. 
It uses an aggregation function to aggregate the multiple samples when needed.\n\nFor instance, a version of NSGA2 for noisy fitness functions may be used as follows:\n\n```scala\n  import mgo._\n  import algorithm.noisynsga2._\n  import context.implicits._\n\n  object sphere {\n    def scale(s: Vector[Double]): Vector[Double] = s.map(_.scale(-2, 2))\n    def compute(i: Vector[Double]): Double = i.map(x =\u003e x * x).sum\n  }\n\n  object noisySphere {\n    def scale(s: Vector[Double]): Vector[Double] = sphere.scale(s)\n    def compute(rng: util.Random, v: Vector[Double]) =\n      sphere.compute(v) + rng.nextGaussian() * 0.5 * math.sqrt(sphere.compute(v))\n  }\n\n  def aggregation(history: Vector[Vector[Double]]) = history.transpose.map { o =\u003e o.sum / o.size }\n\n  val nsga2 =\n    NoisyNSGA2(\n      mu = 100,\n      lambda = 100,\n      fitness = (rng, v) =\u003e Vector(noisySphere.compute(rng, v)),\n      aggregation = aggregation,\n      genomeSize = 2)\n\n  val (finalState, finalPopulation) =\n    run(nsga2).\n      until(afterGeneration(1000)).\n      trace((s, is) =\u003e println(s.generation)).\n      eval(new util.Random(42))\n\n  println(result(finalPopulation, aggregation, noisySphere.scale).mkString(\"\\n\"))\n```\n\nDiversity only\n--------------\n\nMGO proposes the PSE algorithm, which aims at creating diverse solutions instead of optimising a function. 
The paper about this\nalgorithm can be found [here](http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0138212).\n\n```scala\n  import mgo._\n  import algorithm.pse._\n  import context.implicits._\n\n  val pse = PSE(\n    lambda = 10,\n    phenotype = zdt4.compute,\n    pattern =\n      boundedGrid(\n        lowBound = Vector(0.0, 0.0),\n        highBound = Vector(1.0, 200.0),\n        definition = Vector(10, 10)),\n    genomeSize = 10)\n\n  val (finalState, finalPopulation) =\n    run(pse).\n      until(afterGeneration(1000)).\n      trace((s, is) =\u003e println(s.generation)).\n      eval(new util.Random(42))\n\n  println(result(finalPopulation, zdt4.scale).mkString(\"\\n\"))\n```\n\nThis program explores all the different combinations of values that can be produced by the multi-objective function of ZDT4.\n\nFor more examples, have a look at the main/scala/fr/iscpif/mgo/test directory in the repository.\n\nMixed optimisation and diversity\n--------------------------------\n\nThe calibration profile algorithm computes the best fitness for a set of niches. This algorithm is explained [here](http://jasss.soc.surrey.ac.uk/18/1/12.html).\n\nIn MGO you can compute profiles of a 10-dimensional hyper-sphere function using the following:\n\n```scala\n\n  import algorithm.profile._\n  import context.implicits._\n\n  //Profile the first dimension of the genome\n  val algo = Profile(\n    lambda = 100,\n    fitness = sphere.compute,\n    niche = genomeProfile(x = 0, nX = 10),\n    genomeSize = 10)\n\n  val (finalState, finalPopulation) =\n    run(algo).\n      until(afterGeneration(1000)).\n      trace((s, is) =\u003e println(s.generation)).\n      eval(new util.Random(42))\n\n  println(result(finalPopulation, sphere.scale).mkString(\"\\n\"))\n```\n\nNoisy profiles\n--------------\n\nAll algorithms in MGO have a counterpart for noisy fitness functions. 
Here is an example of a profile computation for a sphere\nfunction with noise.\n\n```scala\n  import algorithm.noisyprofile._\n  import context.implicits._\n\n  def aggregation(history: Vector[Double]) = history.sum / history.size\n  def niche = genomeProfile(x = 0, nX = 10)\n\n  val algo = NoisyProfile(\n    muByNiche = 20,\n    lambda = 100,\n    fitness = noisySphere.compute,\n    aggregation = aggregation,\n    niche = niche,\n    genomeSize = 5)\n\n  val (finalState, finalPopulation) =\n    run(algo).\n      until(afterGeneration(1000)).\n      trace((s, is) =\u003e println(s.generation)).\n      eval(new util.Random(42))\n\n  println(result(finalPopulation, aggregation, noisySphere.scale, niche).mkString(\"\\n\"))\n\n```\n\nDistributed computing\n---------------------\n\nAlgorithms implemented in MGO are also available in the workflow platform for distributed computing [OpenMOLE](http://openmole.org).\n  \nSBT dependency\n----------------\n```scala\n  libraryDependencies += \"fr.iscpif\" %% \"mgo\" % \"2.45\"  \n```\n","funding_links":[],"categories":["Table of Contents","Science and Data Analysis","Artificial Intelligence"],"sub_categories":["Science and Data Analysis","Machine Learning"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fopenmole%2Fmgo","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fopenmole%2Fmgo","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fopenmole%2Fmgo/lists"}