{"id":33052240,"url":"https://github.com/PGM-Lab/InferPy","last_synced_at":"2025-11-16T16:02:04.260Z","repository":{"id":80659073,"uuid":"107267243","full_name":"PGM-Lab/InferPy","owner":"PGM-Lab","description":"InferPy: Deep Probabilistic Modeling with Tensorflow Made Easy","archived":false,"fork":false,"pushed_at":"2024-08-02T16:21:02.000Z","size":32554,"stargazers_count":148,"open_issues_count":65,"forks_count":14,"subscribers_count":9,"default_branch":"master","last_synced_at":"2025-09-03T21:24:34.213Z","etag":null,"topics":["keras-tensorflow","neural-network","probabilistic-modeling","probabilistic-programming","tensorflow","user-friendly"],"latest_commit_sha":null,"homepage":"https://inferpy-docs.readthedocs.io/en/stable/index.html ","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/PGM-Lab.png","metadata":{"files":{"readme":"README.rst","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2017-10-17T12:44:12.000Z","updated_at":"2024-12-19T11:38:04.000Z","dependencies_parsed_at":null,"dependency_job_id":"c59cd02e-620f-47d2-aef9-a0648dbdc4d9","html_url":"https://github.com/PGM-Lab/InferPy","commit_stats":null,"previous_names":["pgmlabspain/inferpy"],"tags_count":22,"template":false,"template_full_name":null,"purl":"pkg:github/PGM-Lab/InferPy","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PGM-Lab%2FInferPy","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PGM-Lab%2FInferPy/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PGM-Lab%2FInferPy/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositori
es/PGM-Lab%2FInferPy/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/PGM-Lab","download_url":"https://codeload.github.com/PGM-Lab/InferPy/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PGM-Lab%2FInferPy/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":284734138,"owners_count":27054622,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-11-16T02:00:05.974Z","response_time":65,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["keras-tensorflow","neural-network","probabilistic-modeling","probabilistic-programming","tensorflow","user-friendly"],"created_at":"2025-11-14T03:00:31.653Z","updated_at":"2025-11-16T16:02:04.255Z","avatar_url":"https://github.com/PGM-Lab.png","language":"Jupyter Notebook","readme":"\n.. image:: https://badge.fury.io/gh/PGM-Lab%2Finferpy.svg\n    :target: https://badge.fury.io/gh/PGM-Lab%2Finferpy\n\n.. image:: https://travis-ci.org/PGM-Lab/InferPy.svg?branch=master\n    :target: https://travis-ci.org/PGM-Lab/InferPy\n\n.. image:: https://img.shields.io/badge/License-Apache%202.0-blue.svg\n    :target: https://opensource.org/licenses/Apache-2.0\n\n\n\n\n\n\n.. 
image:: docs/_static/img/logo.png\n   \t:scale: 90 %\n   \t:align: center\n\nDocumentation --\u003e (https://inferpy-docs.readthedocs.io/en/stable/index.html)\n\nInferPy: Deep Probabilistic Modeling Made Easy\n===============================================\n\n\nInferPy is a high-level API for probabilistic modeling written in Python and \ncapable of running on top of Edward and Tensorflow. InferPy's API is \nstrongly inspired by Keras and focuses on enabling flexible data processing, \neasy-to-code probabilistic modeling, scalable inference and robust model validation.\n\nUse InferPy if you need a probabilistic programming language that:\n\n* Allows easy and fast prototyping of hierarchical probabilistic models with a simple and user-friendly API inspired by Keras. \n* Defines probabilistic models with complex probabilistic constructs containing deep neural networks.   \n* Automatically creates computationally efficient batched models without the need to deal with complex tensor operations.\n* Runs seamlessly on CPU and GPU by relying on Tensorflow. \n\n.. * Process seamlessly small data sets stored on a Panda's data-frame, or large distributed data sets by relying on Apache Spark.\n\nInferPy is to Edward what Keras is to Tensorflow\n-------------------------------------------------\nInferPy aims to be to Edward what Keras is to Tensorflow. Edward is a general-purpose\nprobabilistic programming language, just as Tensorflow is a general computational engine. \nBut this generality comes at a price. Edward's API is\nverbose and is based on distributions over Tensor objects, which are n-dimensional arrays with \ncomplex semantics and operations. Probability distributions over Tensors are powerful abstractions, \nbut they are not easy to operate with. InferPy's API is not as general as Edward's, \nbut it still covers a wide range of powerful and widely used probabilistic models, which may include\ncomplex probabilistic constructs containing deep neural networks.  
\n\n\n\n\nGetting Started:\n================\n\nInstallation\n-----------------\n\nInstall InferPy from PyPI:\n\n.. code:: bash\n\n   $ python -m pip install inferpy\n\n\n\n\n\n30 seconds to InferPy\n--------------------------\n\nThe core data structure of InferPy is a **probabilistic model**,\ndefined as a set of **random variables** with a conditional dependency\nstructure. A **random variable** is an object\nparameterized by a set of tensors.\n\nLet's look at a simple non-linear **probabilistic principal component analysis** model (NLPCA). Graphically, the model can\nbe defined as follows:\n\n.. figure:: docs/_static/img/nlpca.png\n   :alt: Non-linear PCA\n   :scale: 60 %\n   :align: center\n\n   Non-linear PCA\n\nWe start by importing the required packages and defining the constant parameters of the model.\n\n\n.. code-block:: python\n\n    import inferpy as inf\n    import tensorflow as tf\n\n    # number of components\n    k = 1\n    # size of the hidden layer in the NN\n    d0 = 100\n    # dimensionality of the data\n    dx = 2\n    # number of observations (dataset size)\n    N = 1000\n\nA model can be defined by decorating any function with ``@inf.probmodel``. The model is fully specified by\nthe variables defined inside this function:\n\n\n.. code-block:: python\n\n    @inf.probmodel\n    def nlpca(k, d0, dx, decoder):\n\n        with inf.datamodel():\n            z = inf.Normal(tf.ones([k])*0.5, 1, name=\"z\")    # shape = [N,k]\n            output = decoder(z, d0, dx)\n            x_loc = output[:, :dx]\n            x_scale = tf.nn.softmax(output[:, dx:])\n            x = inf.Normal(x_loc, x_scale, name=\"x\")   # shape = [N,dx]\n\nThe construct ``with inf.datamodel()``, which resembles the **plate notation**, replicates\nthe enclosed variables N times, where N is the size of our data.\n\n\nIn the previous model, the input argument ``decoder`` must be a function implementing a neural network.\nIt might be defined outside the model as follows.\n\n\n\n\n.. 
code-block:: python\n\n    def decoder(z, d0, dx):\n        h0 = tf.layers.dense(z, d0, tf.nn.relu)\n        return tf.layers.dense(h0, 2 * dx)\n\nNow we can instantiate our model and obtain samples (from the prior distributions).\n\n\n\n.. code-block:: python\n\n\n    # create an instance of the model\n    m = nlpca(k, d0, dx, decoder)\n\n    # sample from the priors\n    samples = m.prior().sample()\n\nFor variational inference, we must define a Q-model as follows.\n\n\n\n\n.. code-block:: python\n\n    @inf.probmodel\n    def qmodel(k):\n        with inf.datamodel():\n            qz_loc = inf.Parameter(tf.ones([k])*0.5, name=\"qz_loc\")\n            qz_scale = tf.math.softplus(inf.Parameter(tf.ones([k]), name=\"qz_scale\"))\n\n            qz = inf.Normal(qz_loc, qz_scale, name=\"z\")\n\nAfterwards, we configure the inference algorithm and fit the model to the data.\n\n\n\n\n.. code-block:: python\n\n    # set the inference algorithm\n    VI = inf.inference.VI(qmodel(k), epochs=5000)\n\n    # learn the parameters\n    m.fit({\"x\": x_train}, VI)\n\nThe inference method can be further configured. But, as in Keras, a core\nprinciple is to try to keep things reasonably simple, while allowing the\nuser full control if needed.\n\n\n\nFinally, we can extract the posterior of ``z``, which is essentially the hidden representation\nof our data.\n\n\n\n.. code-block:: python\n\n   # extract the hidden representation\n   hidden_encoding = m.posterior(\"z\", data={\"x\": x_train})\n   print(hidden_encoding.sample())\n","funding_links":[],"categories":["Probabilistic Methods"],"sub_categories":["NLP","Others"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPGM-Lab%2FInferPy","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FPGM-Lab%2FInferPy","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPGM-Lab%2FInferPy/lists"}