## Bayesian machine learning notebooks

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.4318528.svg)](https://doi.org/10.5281/zenodo.4318528)

This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display
some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure proper rendering of formulas.
Dependencies are specified in `requirements.txt` files in the subdirectories.

- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
  Introduction to Bayesian linear regression. Implementation with plain NumPy and scikit-learn.
  See also the
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb).

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
  [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb?flush_cache=true).
  Introduction to Gaussian processes for regression. Implementation with plain NumPy/SciPy as well as with scikit-learn and GPy.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb)
  [Gaussian processes for classification](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb).
  Introduction to Gaussian processes for classification. Implementation with plain NumPy/SciPy as well as with scikit-learn.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_sparse.ipynb)
  [Sparse Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_sparse.ipynb).
  Introduction to sparse Gaussian processes using a variational approach. Example implementation with JAX.
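To give a flavor of the Gaussian process notebooks above, the core of GP regression with plain NumPy fits in a few lines: compute an RBF kernel over training and test inputs, then form the posterior mean and covariance. This is a minimal sketch with illustrative function names, not code taken verbatim from the notebooks.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, sigma_f=1.0):
    # Squared-exponential kernel: k(x, x') = sigma_f^2 * exp(-||x - x'||^2 / (2 l^2))
    sqdist = np.sum(X1 ** 2, axis=1).reshape(-1, 1) + np.sum(X2 ** 2, axis=1) - 2 * X1 @ X2.T
    return sigma_f ** 2 * np.exp(-0.5 / length_scale ** 2 * sqdist)

def gp_posterior(X_s, X_train, y_train, sigma_y=1e-8):
    # Posterior mean and covariance of a GP at test inputs X_s,
    # given noisy observations y_train at X_train.
    K = rbf_kernel(X_train, X_train) + sigma_y ** 2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_s)
    K_ss = rbf_kernel(X_s, X_s)
    mu = K_s.T @ np.linalg.solve(K, y_train)          # posterior mean
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)      # posterior covariance
    return mu, cov
```

With near-zero observation noise, the posterior mean interpolates the training targets and the posterior variance collapses at the training inputs, which is the behavior the regression notebook visualizes.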
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
  [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
  Introduction to Bayesian optimization. Implementation with plain NumPy/SciPy as well as with the scikit-optimize
  and GPyOpt libraries. Hyperparameter tuning serves as an application example.

- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
  Demonstrates how to implement a Bayesian neural network and variational inference over its weights. Example implementation
  with Keras.

- [Reliable uncertainty estimates for neural network predictions](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/noise-contrastive-priors/ncp.ipynb).
  Uses noise contrastive priors for Bayesian neural networks to obtain more reliable uncertainty estimates for
  out-of-distribution (OOD) data. Implemented with TensorFlow 2 and TensorFlow Probability.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
  [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
  Introduction to the expectation-maximization (EM) algorithm and its application to Gaussian mixture models.
  Implementation with plain NumPy/SciPy and scikit-learn.
  See also the
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb).

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
  [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
  Introduction to stochastic variational inference, with a variational autoencoder as an application example. Implementation
  with TensorFlow 2.x.

- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).
  Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example
  implementation with Keras.

- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt.ipynb).
  Describes an approach to conditionally generating outputs with desired properties by running Bayesian optimization in
  the latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.
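The Bayesian optimization notebooks above (including the latent-space application) revolve around an acquisition function that picks the next point to evaluate. As a generic NumPy/SciPy sketch, not code from the notebooks, expected improvement (EI) for maximization under a Gaussian surrogate posterior looks like this:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best, xi=0.01):
    """EI for maximization: E[max(0, f(x) - y_best - xi)] under a Gaussian
    posterior with mean mu and standard deviation sigma at each candidate."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    imp = mu - y_best - xi                      # improvement margin over the incumbent
    with np.errstate(divide="ignore", invalid="ignore"):
        z = imp / sigma
        ei = imp * norm.cdf(z) + sigma * norm.pdf(z)
    ei[sigma == 0.0] = 0.0                      # no posterior uncertainty, no exploration value
    return ei
```

The `xi` parameter trades off exploration and exploitation; the candidate maximizing EI is evaluated next, the surrogate is refit, and the loop repeats.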