{"id":15559754,"url":"https://github.com/100/solid","last_synced_at":"2025-04-04T14:03:42.200Z","repository":{"id":45039912,"uuid":"94057044","full_name":"100/Solid","owner":"100","description":"🎯 A comprehensive gradient-free optimization framework written in Python","archived":false,"fork":false,"pushed_at":"2019-07-19T15:31:43.000Z","size":274,"stargazers_count":576,"open_issues_count":7,"forks_count":64,"subscribers_count":12,"default_branch":"master","last_synced_at":"2024-10-13T09:06:42.261Z","etag":null,"topics":["algorithm","artificial-intelligence","continuous-optimization","discrete-optimization","evolutionary-algorithm","genetic-algorithm","genetic-algorithm-framework","harmony-search","hill-climbing","library","machine-learning","machine-learning-algorithms","metaheuristics","optimization","optimization-algorithms","particle-swarm-optimization","python","simulated-annealing","stochastic-optimizers","tabu-search"],"latest_commit_sha":null,"homepage":"https://100.github.io/Solid/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/100.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-06-12T05:02:08.000Z","updated_at":"2024-07-15T20:26:02.000Z","dependencies_parsed_at":"2022-09-19T14:10:33.431Z","dependency_job_id":null,"html_url":"https://github.com/100/Solid","commit_stats":null,"previous_names":[],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/100%2FSolid","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/100%2FSolid/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/
repositories/100%2FSolid/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/100%2FSolid/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/100","download_url":"https://codeload.github.com/100/Solid/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247190236,"owners_count":20898700,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["algorithm","artificial-intelligence","continuous-optimization","discrete-optimization","evolutionary-algorithm","genetic-algorithm","genetic-algorithm-framework","harmony-search","hill-climbing","library","machine-learning","machine-learning-algorithms","metaheuristics","optimization","optimization-algorithms","particle-swarm-optimization","python","simulated-annealing","stochastic-optimizers","tabu-search"],"created_at":"2024-10-02T15:56:49.451Z","updated_at":"2025-04-04T14:03:42.183Z","avatar_url":"https://github.com/100.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n    \u003cimg src=\"logo.png\"\u003e\n\u003c/div\u003e\n\n\u003cbr\u003e\n\n[![Build Status](https://travis-ci.org/100/Solid.svg?branch=master)](https://travis-ci.org/100/Solid)\n[![MIT License](https://img.shields.io/dub/l/vibe-d.svg)](https://github.com/100/Solid/blob/master/LICENSE)\n\n## *Solid* is a Python framework for gradient-free optimization.\n\n#### It contains basic versions of many of the most common [optimization algorithms that do not require the calculation of 
gradients](https://en.wikipedia.org/wiki/Derivative-free_optimization), and allows for very rapid development using them.\n\n#### It's a very versatile library that's great for learning, modifying, and of course, using out-of-the-box.\n\n## See the detailed documentation [here](https://100.github.io/Solid/).\n\n\u003chr\u003e\n\n## Current Features:\n* [Genetic Algorithm](https://github.com/100/Solid/blob/master/Solid/GeneticAlgorithm.py)\n* [Evolutionary Algorithm](https://github.com/100/Solid/blob/master/Solid/EvolutionaryAlgorithm.py)\n* [Simulated Annealing](https://github.com/100/Solid/blob/master/Solid/SimulatedAnnealing.py)\n* [Particle Swarm Optimization](https://github.com/100/Solid/blob/master/Solid/ParticleSwarm.py)\n* [Tabu Search](https://github.com/100/Solid/blob/master/Solid/TabuSearch.py)\n* [Harmony Search](https://github.com/100/Solid/blob/master/Solid/HarmonySearch.py)\n* [Stochastic Hill Climb](https://github.com/100/Solid/blob/master/Solid/StochasticHillClimb.py)\n\n\u003chr\u003e\n\n## Usage:\n* ```pip install solidpy```\n* Import the relevant algorithm\n* Create a class that inherits from that algorithm and implements the necessary abstract methods\n* Call its ```.run()``` method, which always returns the best solution and its objective function value\n\n\u003chr\u003e\n\n## Example:\n\n```python\nfrom random import choice, randint, random\nfrom string import ascii_lowercase\nfrom Solid.EvolutionaryAlgorithm import EvolutionaryAlgorithm\n\n\nclass Algorithm(EvolutionaryAlgorithm):\n    \"\"\"\n    Tries to evolve a randomly generated string into the string \"clout\"\n    \"\"\"\n    def _initial_population(self):\n        # 50 random 5-letter strings\n        return list(''.join([choice(ascii_lowercase) for _ in range(5)]) for _ in range(50))\n\n    def _fitness(self, member):\n        # Number of positions that already match \"clout\"\n        return float(sum(member[i] == \"clout\"[i] for i in range(5)))\n\n    def _crossover(self, parent1, parent2):\n        # Single-point crossover at a random partition index\n        partition = randint(0, len(self.population[0]) - 1)\n        return parent1[0:partition] + parent2[partition:]\n\n    def _mutate(self, member):\n        # With probability mutation_rate, replace one random character\n        if self.mutation_rate \u003e= random():\n            member = list(member)\n            member[randint(0, 4)] = choice(ascii_lowercase)\n            member = ''.join(member)\n        return member\n\n\ndef test_algorithm():\n    algorithm = Algorithm(0.5, 0.7, 500, max_fitness=None)\n    best_solution, best_objective_value = algorithm.run()\n\n```\n\n\u003chr\u003e\n\n## Testing\n\nTests live in the ```tests``` folder.\n\nRun them with [pytest](https://docs.pytest.org/en/latest/); it discovers the test files automatically.\n\n\u003chr\u003e\n\n## Contributing\n\nFeel free to send a pull request if you want to add any features or if you find a bug.\n\nCheck the issues tab for some potential things to do.\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2F100%2Fsolid","html_url":"https://awesome.ecosyste.ms/projects/github.com%2F100%2Fsolid","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2F100%2Fsolid/lists"}