{"id":19180018,"url":"https://github.com/pierreablin/ksddescent","last_synced_at":"2025-05-07T22:05:58.369Z","repository":{"id":135723273,"uuid":"356933514","full_name":"pierreablin/ksddescent","owner":"pierreablin","description":"Kernel Stein Discrepancy Descent : a method to sample from unnormalized densities","archived":false,"fork":false,"pushed_at":"2024-04-13T19:27:50.000Z","size":7548,"stargazers_count":21,"open_issues_count":0,"forks_count":4,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-03-31T14:36:33.885Z","etag":null,"topics":["sampling"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pierreablin.png","metadata":{"files":{"readme":"README.rst","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-04-11T17:27:07.000Z","updated_at":"2024-08-09T19:01:57.000Z","dependencies_parsed_at":"2024-11-09T10:47:28.581Z","dependency_job_id":"1209526c-fc9c-4611-933a-0b099c5e3058","html_url":"https://github.com/pierreablin/ksddescent","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pierreablin%2Fksddescent","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pierreablin%2Fksddescent/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pierreablin%2Fksddescent/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pierreablin%2Fksddescent/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/G
itHub/owners/pierreablin","download_url":"https://codeload.github.com/pierreablin/ksddescent/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":249845518,"owners_count":21333728,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["sampling"],"created_at":"2024-11-09T10:47:22.475Z","updated_at":"2025-04-20T03:33:41.546Z","avatar_url":"https://github.com/pierreablin.png","language":"Python","readme":"Kernel Stein Discrepancy Descent\n================================\n\n|GHActions|_ |PyPI|_ \n\n.. |GHActions| image:: https://github.com/pierreablin/ksddescent/workflows/unittests/badge.svg?branch=main\u0026event=push\n.. _GHActions: https://github.com/pierreablin/ksddescent/actions\n\n\n.. |PyPI| image:: https://badge.fury.io/py/ksddescent.svg\n.. _PyPI: https://badge.fury.io/py/ksddescent\n\nSampling by optimization of the Kernel Stein Discrepancy\n\nThe paper is available at `arxiv.org/abs/2105.09994 \u003chttps://arxiv.org/abs/2105.09994\u003e`_.\n\nThe code uses PyTorch, and a NumPy backend is available for svgd.\n\n\n.. 
image:: https://pierreablin.github.io/figures/ksd_descent_small.png\n    :width: 100\n    :alt: ksd_picture\n\n\nInstall\n-------\n\nThe code is available on pip::\n\n\t$ pip install ksddescent\n\n\nDocumentation\n-------------\n\nThe documentation is at `pierreablin.github.io/ksddescent/ \u003chttps://pierreablin.github.io/ksddescent/\u003e`_.\n\nExample\n-------\n\nThe main function is `ksdd_lbfgs`, which uses the fast L-BFGS algorithm to converge quickly.\nIt takes as input the initial positions of the particles and the score function.\nFor instance, to sample from a standard Gaussian (whose score function is x -> -x), you can use these simple lines of code:\n\n.. code:: python\n\n   \u003e\u003e\u003e import torch\n   \u003e\u003e\u003e from ksddescent import ksdd_lbfgs\n   \u003e\u003e\u003e n, p = 50, 2\n   \u003e\u003e\u003e x0 = torch.rand(n, p)  # start from uniform distribution\n   \u003e\u003e\u003e score = lambda x: -x  # score of a standard Gaussian\n   \u003e\u003e\u003e x = ksdd_lbfgs(x0, score)  # run the algorithm\n\n\nReference\n---------\n\nIf you use this code in your project, please cite::\n\n    Anna Korba, Pierre-Cyril Aubin-Frankowski, Simon Majewski, Pierre Ablin\n    Kernel Stein Discrepancy Descent\n    International Conference on Machine Learning, 2021\n\n\n\n\n\nBug reports\n-----------\n\nUse the `GitHub issue tracker \u003chttps://github.com/pierreablin/ksddescent/issues\u003e`_ to report bugs.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpierreablin%2Fksddescent","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpierreablin%2Fksddescent","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpierreablin%2Fksddescent/lists"}