{"id":13689447,"url":"https://github.com/probtorch/probtorch","last_synced_at":"2025-05-01T23:34:40.477Z","repository":{"id":43650555,"uuid":"106852380","full_name":"probtorch/probtorch","owner":"probtorch","description":"Probabilistic Torch is library for deep generative models that extends PyTorch","archived":false,"fork":false,"pushed_at":"2024-05-12T16:54:53.000Z","size":17681,"stargazers_count":884,"open_issues_count":13,"forks_count":66,"subscribers_count":32,"default_branch":"master","last_synced_at":"2024-08-03T15:17:51.341Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/probtorch.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-10-13T17:11:32.000Z","updated_at":"2024-07-16T08:50:33.000Z","dependencies_parsed_at":"2022-08-29T11:00:20.935Z","dependency_job_id":null,"html_url":"https://github.com/probtorch/probtorch","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/probtorch%2Fprobtorch","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/probtorch%2Fprobtorch/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/probtorch%2Fprobtorch/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/probtorch%2Fprobtorch/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/probtorch","download_url":"https://codeload.github.com/probtorch/probtorch/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https:
//github.com","kind":"github","repositories_count":224282284,"owners_count":17285798,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-02T15:01:48.262Z","updated_at":"2024-11-12T13:31:48.058Z","avatar_url":"https://github.com/probtorch.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n  \u003cimg height=\"150px\" src=\"docs/source/_static/img/probtorch-logo.png\"\u003e\u003c/a\u003e\n\u003c/div\u003e\n\nProbabilistic Torch is library for deep generative models that extends [PyTorch](http://pytorch.org). It is similar in spirit and design goals to [Edward](http://edwardlib.org) and [Pyro](https://github.com/uber/pyro), sharing many design characteristics with the latter.\n\nThe design of Probabilistic Torch is intended to be as PyTorch-like as possible. Probabilistic Torch models are written just like you would write any PyTorch model, but make use of three additional constructs:\n\n1. A library of *reparameterized* [distributions](https://pytorch.org/docs/stable/distributions.html) that implement methods for sampling and evaluation of the log probability mass and density functions (now available in PyTorch)\n\n2. A [Trace](https://github.com/probtorch/probtorch/blob/master/probtorch/stochastic.py#L119) data structure, which is both used to instantiate and store random variables.\n\n3. 
3. Objective functions that approximate a lower bound on the log marginal likelihood using [Monte Carlo](https://github.com/probtorch/probtorch/blob/master/probtorch/objectives/montecarlo.py) and [importance-weighted](https://github.com/probtorch/probtorch/blob/master/probtorch/objectives/importance.py) estimators.

This repository accompanies the NIPS 2017 paper:

```latex
@inproceedings{siddharth2017learning,
    title = {Learning Disentangled Representations with Semi-Supervised Deep Generative Models},
    author = {Siddharth, N. and Paige, Brooks and van de Meent, Jan-Willem and Desmaison, Alban and Goodman, Noah D. and Kohli, Pushmeet and Wood, Frank and Torr, Philip},
    booktitle = {Advances in Neural Information Processing Systems 30},
    editor = {I. Guyon and U. V. Luxburg and S. Bengio and H. Wallach and R. Fergus and S. Vishwanathan and R. Garnett},
    pages = {5927--5937},
    year = {2017},
    publisher = {Curran Associates, Inc.},
    url = {http://papers.nips.cc/paper/7174-learning-disentangled-representations-with-semi-supervised-deep-generative-models.pdf}
}
```


# Contributors

(in order of joining)

- Jan-Willem van de Meent
- Siddharth Narayanaswamy
- Brooks Paige
- Alban Desmaison
- Alican Bozkurt
- Amirsina Torfi
- Babak Esmaeili
- Eli Sennesh


# Installation

1. Install PyTorch [[instructions](https://github.com/pytorch/pytorch)]

2. Install this repository from source
```
pip install git+https://github.com/probtorch/probtorch
```

3. Refer to the `examples/` subdirectory for [Jupyter](http://jupyter.org) notebooks that illustrate usage.

4. To build and read the API documentation, do the following
```
git clone https://github.com/probtorch/probtorch
cd probtorch/docs
pip install -r requirements.txt
make html
open build/html/index.html
```


# Mini-Tutorial: Semi-supervised MNIST

Models in Probabilistic Torch define variational autoencoders.
Both the encoder and the decoder model can be implemented as standard PyTorch models that subclass `nn.Module`.

In the `__init__` method we initialize network layers, just as we would in a PyTorch model. In the `forward` method, we additionally initialize a `Trace` variable, which is a write-once dictionary-like object. The `Trace` data structure implements methods for instantiating named random variables, whose values and log probabilities are stored under the specified key.

Here is an implementation of the encoder of a standard semi-supervised VAE, as introduced by Kingma and colleagues [1]:

```python
import torch
import torch.nn as nn
import probtorch

class Encoder(nn.Module):
    def __init__(self, num_pixels=784, num_hidden=50, num_digits=10, num_style=2):
        super().__init__()
        self.h = nn.Sequential(
                    nn.Linear(num_pixels, num_hidden),
                    nn.ReLU())
        self.y_log_weights = nn.Linear(num_hidden, num_digits)
        self.z_mean = nn.Linear(num_hidden + num_digits, num_style)
        self.z_log_std = nn.Linear(num_hidden + num_digits, num_style)

    def forward(self, x, y_values=None, num_samples=10):
        q = probtorch.Trace()
        x = x.expand(num_samples, *x.size())
        if y_values is not None:
            y_values = y_values.expand(num_samples, *y_values.size())
        h = self.h(x)
        y = q.concrete(logits=self.y_log_weights(h), temperature=0.66,
                       value=y_values, name='y')
        h2 = torch.cat([y, h], -1)
        z = q.normal(loc=self.z_mean(h2),
                     scale=torch.exp(self.z_log_std(h2)),
                     name='z')
        return q
```

In the code above, the method `q.concrete` samples or observes from a Concrete/Gumbel-Softmax relaxation of the discrete distribution, depending on whether supervision values `y_values` are provided.
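For intuition, the Concrete/Gumbel-Softmax relaxation can be sketched without any library support: add standard Gumbel noise to the logits, then apply a temperature-scaled softmax. Below is a minimal dependency-free sketch with a hypothetical `gumbel_softmax_sample` helper; it illustrates the idea only and is not the actual `q.concrete` implementation:

```python
import math
import random

def gumbel_softmax_sample(logits, temperature):
    # Standard Gumbel noise: g = -log(-log(u)), u ~ Uniform(0, 1);
    # max(..., 1e-12) guards against log(0).
    noisy = [l - math.log(-math.log(max(random.random(), 1e-12)))
             for l in logits]
    # Temperature-scaled softmax over the perturbed logits.
    exps = [math.exp(n / temperature) for n in noisy]
    total = sum(exps)
    return [e / total for e in exps]

# One relaxed sample over three categories.
sample = gumbel_softmax_sample([0.0, 1.0, 2.0], temperature=0.66)
```

As the temperature approaches zero, the samples approach one-hot vectors; the temperature 0.66 here matches the value used in the encoder above.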
The method `q.normal` samples from a univariate normal.

The resulting trace `q` now contains two entries, `q['y']` and `q['z']`, which are instances of a `RandomVariable` class that stores both the value and the log probability associated with the variable. The stored values are now used to condition execution of the decoder model:

```python
def binary_cross_entropy(x_mean, x, EPS=1e-9):
    return - (torch.log(x_mean + EPS) * x +
              torch.log(1 - x_mean + EPS) * (1 - x)).sum(-1)

class Decoder(nn.Module):
    def __init__(self, num_pixels=784, num_hidden=50, num_digits=10, num_style=2):
        super().__init__()
        self.num_digits = num_digits
        self.h = nn.Sequential(
                   nn.Linear(num_style + num_digits, num_hidden),
                   nn.ReLU())
        self.x_mean = nn.Sequential(
                        nn.Linear(num_hidden, num_pixels),
                        nn.Sigmoid())

    def forward(self, x, q=None):
        if q is None:
            q = probtorch.Trace()
        p = probtorch.Trace()
        y = p.concrete(logits=torch.zeros(x.size(0), self.num_digits),
                       temperature=0.66,
                       value=q['y'], name='y')
        z = p.normal(loc=0.0, scale=1.0, value=q['z'], name='z')
        h = self.h(torch.cat([y, z], -1))
        p.loss(binary_cross_entropy, self.x_mean(h), x, name='x')
        return p
```
The model above can be used both for conditioned forward execution and for generation.
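This conditioning behavior relies on the trace acting as a write-once mapping. As a rough illustration only (a toy stand-in, not the actual `probtorch.Trace` implementation), the relevant semantics look like this:

```python
class ToyTrace:
    """Toy write-once, dict-like store mimicking two Trace behaviors:
    a name may be bound only once, and missing names return None."""

    def __init__(self):
        self._nodes = {}

    def __setitem__(self, name, value):
        if name in self._nodes:
            raise ValueError("trace already contains variable %r" % name)
        self._nodes[name] = value

    def __getitem__(self, name):
        # Returning None instead of raising KeyError is what lets a
        # decoder either condition on a traced value or fall back to
        # sampling from the prior.
        return self._nodes.get(name)

q = ToyTrace()
q['y'] = 'observed-value'  # bound once; rebinding 'y' would raise
```

Here `q['z']` returns `None`, so a call like `p.normal(..., value=q['z'], name='z')` would sample from the prior rather than condition on a stored value.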
The reason for this is that `q[k]` returns `None` for variable names `k` that have not been instantiated.

To train the model components above, Probabilistic Torch provides objectives that compute an estimate of a lower bound on the log marginal likelihood, which can then be maximized with standard PyTorch optimizers:
```python
from probtorch.objectives.montecarlo import elbo
from random import random

# initialize model and optimizer
enc = Encoder()
dec = Decoder()
optimizer = torch.optim.Adam(list(enc.parameters())
                             + list(dec.parameters()))
# define the subset of batches that will be supervised
supervise = [random() < 0.01 for _ in data]
# train model for 10 epochs
for epoch in range(10):
    for b, (x, y) in enumerate(data):
        optimizer.zero_grad()
        if supervise[b]:
            q = enc(x, y)
        else:
            q = enc(x)
        p = dec(x, q)
        loss = -elbo(q, p, sample_dim=0, batch_dim=1)
        loss.backward()
        optimizer.step()
```

For more details, see the Jupyter notebooks in the `examples/` subdirectory.

# References

[1] Kingma, Diederik P., Danilo J. Rezende, Shakir Mohamed, and Max Welling. 2014. "Semi-Supervised Learning with Deep Generative Models." http://arxiv.org/abs/1406.5298.