{"id":13408173,"url":"https://github.com/js05212/BayesianDeepLearning-Survey","last_synced_at":"2025-03-14T12:32:14.767Z","repository":{"id":37779074,"uuid":"197287995","full_name":"js05212/BayesianDeepLearning-Survey","owner":"js05212","description":"Bayesian Deep Learning: A Survey","archived":false,"fork":false,"pushed_at":"2024-11-11T07:16:54.000Z","size":465,"stargazers_count":510,"open_issues_count":0,"forks_count":62,"subscribers_count":28,"default_branch":"master","last_synced_at":"2025-03-05T18:51:38.880Z","etag":null,"topics":["arxiv","bayesian","bayesian-deep-learning","bdl","deep-learning","machine-learning","neural-networks","survey","variational-autoencoders"],"latest_commit_sha":null,"homepage":"","language":null,"has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/js05212.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-07-17T00:51:13.000Z","updated_at":"2025-02-25T11:01:55.000Z","dependencies_parsed_at":"2023-01-29T23:46:01.104Z","dependency_job_id":"d30e9f1b-ce62-4b06-92e9-67216137de00","html_url":"https://github.com/js05212/BayesianDeepLearning-Survey","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/js05212%2FBayesianDeepLearning-Survey","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/js05212%2FBayesianDeepLearning-Survey/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/js05212%2FBayesianDeepLearning-Survey/releases","manifests_url":"https:/
/repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/js05212%2FBayesianDeepLearning-Survey/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/js05212","download_url":"https://codeload.github.com/js05212/BayesianDeepLearning-Survey/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243578550,"owners_count":20313848,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["arxiv","bayesian","bayesian-deep-learning","bdl","deep-learning","machine-learning","neural-networks","survey","variational-autoencoders"],"created_at":"2024-07-30T20:00:51.211Z","updated_at":"2025-03-14T12:32:14.748Z","avatar_url":"https://github.com/js05212.png","language":null,"readme":"# An Updating Survey for Bayesian Deep Learning (BDL)\n\nThis is an updating survey for Bayesian Deep Learning (BDL), a constantly updated and extended version of the manuscript, '[A Survey on Bayesian Deep Learning](http://wanghao.in/paper/CSUR20_BDL.pdf)', published in [**ACM Computing Surveys**](https://dl.acm.org/doi/10.1145/3409383) 2020.\u003cbr\u003e\n\nBayesian deep learning is a powerful framework for designing models across a wide range of applications. See our [**Nature Medicine** paper](https://www.nature.com/articles/s41591-021-01273-1.pdf) for a possible application in healthcare. 
\n\n## Contents\n\n* [Survey](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#survey)\n* [BDL and Recommender Systems](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-recommender-systems)\n* [BDL and Domain Adaptation (and Domain Generalization, Meta Learning, etc.)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-domain-adaptation-and-domain-generalization-meta-learning-etc)\n* [BDL and Healthcare](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-healthcare)\n* [BDL and Natural Language Processing (NLP)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-nlp)\n* [BDL and Computer Vision (CV)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-computer-vision)\n* [BDL and Control/Planning](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-controlplanning)\n* [BDL and Graphs (Link Prediction, Graph Neural Networks, Knowledge Graphs, etc.)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-graphs-link-prediction-graph-neural-networks-knowledge-graphs-etc)\n* [BDL and Topic Modeling](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-topic-modeling)\n* [BDL and Speech Recognition/Synthesis](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-speech-recognitionsynthesis)\n* [BDL and Forecasting (Time Series Analysis)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-forecasting-time-series-analysis)\n* [BDL and Distributed/Federated Learning](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-distributedfederated-learning)\n* [BDL and Continual/Life-Long 
Learning](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-continuallife-long-learning)\n* [BDL and AI4Science](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-ai4science)\n* [BDL as a Framework (Miscellaneous)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-as-a-framework-miscellaneous)\n* [Bayesian/Probabilistic Neural Networks as Building Blocks of BDL](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bayesianprobabilistic-neural-networks-as-building-blocks-of-bdl)\n\n\n## Survey\n\nA Survey on Bayesian Deep Learning\u003cbr\u003e\nby Wang et al., ACM Computing Surveys (CSUR) 2020\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/CSUR20_BDL.pdf) [[Blog]](http://wanghao.in/BDL.html) [[BDL Framework in 2016]](http://wanghao.in/paper/TKDE16_BDL.pdf)\n\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"./BDL_Table.png\" alt=\"\" data-canonical-src=\"./BDL_Table.png\" width=\"930\" height=\"580\"/\u003e\n\u003c/p\u003e\n\n## BDL and Recommender Systems\n\nCollaborative Deep Learning for Recommender Systems\u003cbr\u003e\nby Wang et al., KDD 2015\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/KDD15_CDL.pdf) [[Project Page]](http://wanghao.in/CDL.htm) [[2014 Arxiv Version]](https://arxiv.org/abs/1409.2944) [[Code]](https://github.com/js05212/CDL) [[MXNet Code]](https://github.com/js05212/MXNet-for-CDL) [[TensorFlow Code]](https://github.com/js05212/CollaborativeDeepLearning-TensorFlow) [[Dataset A]](https://github.com/js05212/citeulike-a) [[Dataset B]](https://github.com/js05212/citeulike-t) [[Jupyter Notebook]](https://github.com/js05212/MXNet-for-CDL/blob/master/collaborative-dl.ipynb) [[Slides]](http://wanghao.in/slides/CDL_slides.pdf) [[Slides (Long)]](http://wanghao.in/slides/CDL_slides_long.pdf)\n\nCollaborative Recurrent Autoencoder: Recommend while Learning to Fill in the Blanks\u003cbr\u003e\nby Wang et al., NIPS 
2016\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1611.00454)\n\nCollaborative Knowledge Base Embedding for Recommender Systems\u003cbr\u003e\nby Zhang et al., KDD 2016\u003cbr\u003e\n[[PDF]](https://dl.acm.org/citation.cfm?id=2939673)\n\nCollaborative Deep Ranking: A Hybrid Pair-Wise Recommendation Algorithm with Implicit Feedback\u003cbr\u003e\nby Ying et al., PAKDD 2016\u003cbr\u003e\n[[PDF]](https://link.springer.com/chapter/10.1007/978-3-319-31750-2_44)\n\nCollaborative Variational Autoencoder for Recommender Systems\u003cbr\u003e\nby Li et al., KDD 2017\u003cbr\u003e\n[[PDF]](https://www.kdd.org/kdd2017/papers/view/collaborative-variational-autoencoder-for-recommender-systems)\n\nVariational Autoencoders for Collaborative Filtering\u003cbr\u003e\nby Liang et al., WWW 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1802.05814)\n\nProbabilistic Metric Learning with Adaptive Margin for Top-K Recommendation\u003cbr\u003e\nby Ma et al., KDD 2020\u003cbr\u003e\n[[PDF]](https://dl.acm.org/doi/pdf/10.1145/3394486.3403147)\n\n## BDL and Domain Adaptation (and Domain Generalization, Meta Learning, etc.)\nProbabilistic Model-Agnostic Meta-Learning\u003cbr\u003e\nby Finn et al., NIPS 2018\u003cbr\u003e\n[[PDF]](https://papers.nips.cc/paper/2018/file/8e2c381d4dd04f1c55093f22c59c3a08-Paper.pdf)\n\nBayesian Model-Agnostic Meta-Learning\u003cbr\u003e\nby Yoon et al., NIPS 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1806.03836.pdf)\n\nRecasting Gradient-Based Meta-Learning as Hierarchical Bayes\u003cbr\u003e\nby Grant et al., ICLR 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1801.08930)\n\nReconciling Meta-Learning and Continual Learning with Online Mixtures of Tasks\u003cbr\u003e\nby Jerfel et al., NIPS 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1812.06080)\n\nMeta-Learning Probabilistic Inference For Prediction\u003cbr\u003e\nby Gordon et al., ICLR 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1805.09921)\n\nLearning to Learn with Variational 
Information Bottleneck for Domain Generalization\u003cbr\u003e\nby Du et al., ECCV 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2007.07645.pdf)\n\nBayesian Meta-Learning for the Few-Shot Setting via Deep Kernels\u003cbr\u003e\nby Patacchiola et al., NIPS 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1910.05199.pdf)\n\nContinuously Indexed Domain Adaptation\u003cbr\u003e\nby Wang et al., ICML 2020\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICML20_CIDA.pdf) \n\nA Bit More Bayesian: Domain-Invariant Learning with Uncertainty\u003cbr\u003e\nby Xiao et al., ICML 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2105.04030.pdf)\n\nDomain-Indexing Variational Bayes: Interpretable Domain Index for Domain Adaptation\u003cbr\u003e\nby Xu et al., ICLR 2023\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICLR23_VDI.pdf)\n\n\n\n\n## BDL and Healthcare\n\nElectronic Health Record Analysis via Deep Poisson Factor Models\u003cbr\u003e\nby Henao et al., JMLR 2016\u003cbr\u003e\n[[PDF]](http://www.jmlr.org/papers/volume17/15-429/15-429.pdf)\n\nStructured Inference Networks for Nonlinear State Space Models\u003cbr\u003e\nby Krishnan et al., AAAI 2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1609.09869.pdf)\n\nCausal Effect Inference with Deep Latent-Variable Models\u003cbr\u003e\nby Louizos et al., NIPS 2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1705.08821.pdf)\n\nBlack Box FDR\u003cbr\u003e\nby Tansey et al., ICML 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1806.03143)\n\nBidirectional Inference Networks: A Class of Deep Bayesian Networks for Health Profiling\u003cbr\u003e\nby Wang et al., AAAI 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1902.02037)\n\nSampling-free Uncertainty Estimation in Gated Recurrent Units with Applications to Normative Modeling in Neuroimaging\u003cbr\u003e\nby Hwang et al., UAI 2019\u003cbr\u003e\n[[PDF]](http://auai.org/uai2019/proceedings/papers/296.pdf)\n\nNeural Jump Stochastic Differential Equations\u003cbr\u003e\nby 
Jia et al., NIPS 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1905.10403.pdf)\n\nTowards Interpretable Clinical Diagnosis with Bayesian Network Ensembles Stacked on Entity-Aware CNNs\u003cbr\u003e\nby Chen et al., ACL 2020\u003cbr\u003e\n[[PDF]](https://www.aclweb.org/anthology/2020.acl-main.286.pdf)\n\nContinuously Indexed Domain Adaptation\u003cbr\u003e\nby Wang et al., ICML 2020\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICML20_CIDA.pdf) [Cross Referenced in [BDL and Domain Adaptation](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-domain-adaptation-and-domain-generalization-meta-learning-etc)]\n\nAssessment of medication self-administration using artificial intelligence\u003cbr\u003e\nby Zhao et al., Nature Medicine 2021\u003cbr\u003e\n[[PDF]](https://www.nature.com/articles/s41591-021-01273-1.pdf)\n\nNeural Pharmacodynamic State Space Modeling\u003cbr\u003e\nby Hussain et al., ICML 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2102.11218.pdf)\n\nSelf-Interpretable Time Series Prediction with Counterfactual Explanations\u003cbr\u003e\nby Yan et al., ICML 2023\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICML23_CounTS.pdf) [Cross Referenced in [BDL and Forecasting (Time Series Analysis)](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-forecasting-time-series-analysis)]\n\n## BDL and NLP\n\nSequence to Better Sequence: Continuous Revision of Combinatorial Structures\u003cbr\u003e\nby Mueller et al., ICML 2017\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v70/mueller17a.html)\n\nQuaSE: Sequence Editing under Quantifiable Guidance\u003cbr\u003e\nby Liao et al., EMNLP 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1804.07007.pdf)\n\nDispersed Exponential Family Mixture VAEs for Interpretable Text Generation\u003cbr\u003e\nby Shi et al., ICML 2020\u003cbr\u003e\n[[PDF]](https://proceedings.icml.cc/static/paper_files/icml/2020/3242-Paper.pdf)\n\nTowards Interpretable 
Clinical Diagnosis with Bayesian Network Ensembles Stacked on Entity-Aware CNNs\u003cbr\u003e\nby Chen et al., ACL 2020\u003cbr\u003e\n[[PDF]](https://www.aclweb.org/anthology/2020.acl-main.286.pdf) [Cross Referenced in [BDL and Healthcare](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-healthcare)]\n\nWhat You Say and How You Say it: Joint Modeling of Topics and Discourse in Microblog Conversations\u003cbr\u003e\nby Zeng et al., ACL 2020\u003cbr\u003e\n[[PDF]](https://aclanthology.org/Q19-1017.pdf)\n\nLatent Diffusion Energy-Based Model for Interpretable Text Modeling\u003cbr\u003e\nby Yu et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/2206.05895)\n\nDiffusion-LM Improves Controllable Text Generation\u003cbr\u003e\nby Li et al., NeurIPS 2022\u003cbr\u003e\n[[PDF]](https://proceedings.neurips.cc/paper_files/paper/2022/file/1be5bc25d50895ee656b8c2d9eb89d6a-Paper-Conference.pdf)\n\nTractable Control for Autoregressive Language Generation\u003cbr\u003e\nby Zhang et al., ICML 2023\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2304.07438.pdf)\n\nVariational Language Concepts for Interpreting Foundation Language Models\u003cbr\u003e\nby Wang et al., EMNLP 2024\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/EMNLP24_VALC.pdf)\n\n## BDL and Computer Vision\nAttend, Infer, Repeat: Fast Scene Understanding with Generative Models\u003cbr\u003e\nby Eslami et al., NIPS 2016\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1603.08575)\n\nEfficient Inference in Occlusion-aware Generative Models of Images\u003cbr\u003e\nby Huang et al., ICLR 2016\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1511.06362)\n\nSequential Attend, Infer, Repeat: Generative Modelling of Moving Objects\u003cbr\u003e\nby Kosiorek et al., NIPS 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1806.01794)\n\nGaussian Process Prior Variational Autoencoders\u003cbr\u003e\nby Casale et al., NIPS 
2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1810.11738.pdf)\n\nSpatially Invariant Unsupervised Object Detection with Convolutional Neural Networks\u003cbr\u003e\nby Crawford et al., AAAI 2019\u003cbr\u003e\n[[PDF]](https://www.aaai.org/ojs/index.php/AAAI/article/view/4216)\n\nFaster Attend-Infer-Repeat with Tractable Probabilistic Models\u003cbr\u003e\nby Stelzner et al., ICML 2019\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v97/stelzner19a.html)\n\nAsynchronous Temporal Fields for Action Recognition\u003cbr\u003e\nby Sigurdsson et al., CVPR 2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1612.06371.pdf)\n\nGeneralizing Eye Tracking with Bayesian Adversarial Learning\u003cbr\u003e\nby Wang et al., CVPR 2019\u003cbr\u003e\n[[PDF]](http://openaccess.thecvf.com/content_CVPR_2019/papers/Wang_Generalizing_Eye_Tracking_With_Bayesian_Adversarial_Learning_CVPR_2019_paper.pdf)\n\nSequential Neural Processes\u003cbr\u003e\nby Singh et al., NIPS 2019\u003cbr\u003e\n[[PDF]](http://papers.nips.cc/paper/9214-sequential-neural-processes.pdf)\n\nSPACE: Unsupervised Object-Oriented Scene Representation via Spatial Attention and Decomposition\u003cbr\u003e\nby Lin et al., ICLR 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2001.02407.pdf)\n\n Being Bayesian about Categorical Probability\u003cbr\u003e\n by Joo et al., ICML 2020\u003cbr\u003e\n [[PDF]](https://proceedings.icml.cc/static/paper_files/icml/2020/3560-Paper.pdf)\n\n NVAE: A Deep Hierarchical Variational Autoencoder\u003cbr\u003e\n by Vahdat et al., NIPS 2020\u003cbr\u003e\n [[PDF]](https://arxiv.org/abs/2007.03898)\n \n Learning Latent Space Energy-Based Prior Model\u003cbr\u003e\n by Pang et al., NIPS 2020\u003cbr\u003e\n [[PDF]](https://arxiv.org/pdf/2006.08205.pdf)\n \n Generative Neurosymbolic Machines\u003cbr\u003e\n by Jiang et al., NIPS 2020\u003cbr\u003e\n [[PDF]](https://arxiv.org/pdf/2010.12152.pdf)\n \n Denoising Diffusion Probabilistic Models\u003cbr\u003e\n by Ho et al., NIPS 
2020\u003cbr\u003e\n [[PDF]](https://arxiv.org/pdf/2006.11239.pdf)\n \nA Causal View of Compositional Zero-shot Recognition\u003cbr\u003e\nby Atzmon et al., NIPS 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2006.14610.pdf)\n\nCounterfactuals Uncover the Modular Structure of Deep Generative Models\u003cbr\u003e\nby Besserve et al., ICLR 2020\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=SJxDDpEKvH)\n\nROOTS: Object-Centric Representation and Rendering of 3D Scenes\u003cbr\u003e\nby Chen et al., JMLR 2021\u003cbr\u003e\n[[PDF]](https://jmlr.csail.mit.edu/papers/volume22/20-1176/20-1176.pdf)\n \nImproved Denoising Diffusion Probabilistic Models\u003cbr\u003e\nby Nichol et al., ICML 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2102.09672.pdf)\n \nGenerative Interventions for Causal Learning.\u003cbr\u003e\nby Mao et al., CVPR 2021\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/CVPR21_GenInt.pdf)\n\nAdversarial Attacks are Reversible with Natural Supervision\u003cbr\u003e\nby Mao et al., ICCV 2021\u003cbr\u003e\n[[PDF]](http://www.wanghao.in/paper/ICCV21_ReverseAttack.pdf)\n\nCounterfactual Zero-Shot and Open-Set Visual Recognition\u003cbr\u003e\nby Yue et al., CVPR 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2103.00887.pdf)\n\nILVR: Conditioning Method for Denoising Diffusion Probabilistic Models\u003cbr\u003e\nby Choi et al., ICCV 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2108.02938.pdf)\n\nDiffusion Models Beat GANs on Image Synthesis\u003cbr\u003e\nby Dhariwal et al., NIPS 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2105.05233.pdf)\n\nDiffusion Visual Counterfactual Explanations\u003cbr\u003e\nby Augustin et al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2210.11841)\n\nDiffuseVAE: Efficient, Controllable and High-Fidelity Generation from Low-Dimensional Latents\u003cbr\u003e\nby Pandey et al., TMLR 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2201.00308)\n\nDiffusion Causal Models for Counterfactual 
Estimation\u003cbr\u003e\nby Sanchez et al., CleaR 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/2202.10166)\n\nRelational Learning with Variational Bayes\u003cbr\u003e\nby Liu, ICLR 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=Az-7gJc6lpr)\n\nHigh-Resolution Image Synthesis with Latent Diffusion Models\u003cbr\u003e\nby Rombach et al., CVPR 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/2112.10752)\n\nGLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models\u003cbr\u003e\nby Nichol et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/nichol22a.html)\n\nDiffusion Models for Adversarial Purification\u003cbr\u003e\nby Nie et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/nie22a.html)\n\nA Conditional Point Diffusion-Refinement Paradigm for 3D Point Cloud Completion\u003cbr\u003e\nby Lyu et al., ICLR 2022\u003cbr\u003e\n[[PDF]](https://iclr.cc/virtual/2022/poster/7026)\n\nLabel-Efficient Semantic Segmentation with Diffusion Models\u003cbr\u003e\nby Baranchuk et al., ICLR 2022\u003cbr\u003e\n[[PDF]](https://iclr.cc/virtual/2022/poster/6569)\n\nLearning Fast Samplers for Diffusion Models by Differentiating Through Sample Quality\u003cbr\u003e\nby Watson et al., ICLR 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=VFBjuF8HEp)\n\nFlexible Diffusion Modeling of Long Videos\u003cbr\u003e\nby Harvey et al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2205.11495.pdf)\n\nProtoVAE: A Trustworthy Self-Explainable Prototypical Variational Model\u003cbr\u003e\nby Gautam et al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://proceedings.neurips.cc/paper_files/paper/2022/file/722f3f9298a961d2639eadd3f14a2816-Paper-Conference.pdf)\n\nCausal Transportability for Visual Recognition\u003cbr\u003e\nby Mao et al., CVPR 2022\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/CVPR22_CausalTrans.pdf)\n\nPosterior Matching for Arbitrary Conditioning\u003cbr\u003e\nby Strauss et 
al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=EFnI8Qc--jE)\n\nOn the Relationship between Variational Inference and Auto-Associative Memory\u003cbr\u003e\nby Annabi et al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=uCBx_6Hc7cu)\n\nRobust Perception through Equivariance\u003cbr\u003e\nby Mao et al., ICML 2023\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICML23_RobustEquivariance.pdf)\n\nObject-Centric Slot Diffusion\u003cbr\u003e\nby Jiang et al., NeurIPS 2023\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/2303.10834)\n\nPreDiff: Precipitation Nowcasting with Latent Diffusion Models\u003cbr\u003e\nby Gao et al., NeurIPS 2023\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/2307.10422)\n\nDiffusion Posterior Sampling for Linear Inverse Problem Solving: A Filtering Perspective\u003cbr\u003e\nby Dou et al., ICLR 2024\u003cbr\u003e\n[[PDF]](https://openreview.net/forum?id=tplXNcHZs1)\n\nDirectly Denoising Diffusion Models\u003cbr\u003e\nby Zhang et al., ICML 2024\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v235/zhang24bl.html)\n\nCausal Representation Learning Made Identifiable by Grouping of Observational Variables\u003cbr\u003e\nby Morioka et al., ICML 2024\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v235/morioka24a.html)\n\nCounterfactual Image Editing\u003cbr\u003e\nby Pan et al., ICML 2024\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v235/pan24a.html)\n\nProbabilistic Conceptual Explainers: Towards Trustworthy Conceptual Explanations for Vision Foundation Models\u003cbr\u003e\nby Wang et al., ICML 2024\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICML24_PACE.pdf)\n\n\n\n\n## BDL and Control/Planning\n\nEmbed to Control: A Locally Linear Latent Dynamics Model for Control from Raw Images\u003cbr\u003e\nby Watter et al., NIPS 2015\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/1506.07365)\n\nDeep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data\u003cbr\u003e\nby Karl et al., ICLR 
2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1605.06432.pdf)\n\nProbabilistic Recurrent State-Space Models\u003cbr\u003e\nby Doerr et al., ICML 2018\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v80/doerr18a/doerr18a.pdf)\n\nDeep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models\u003cbr\u003e\nby Chua et al., NIPS 2018\u003cbr\u003e\n[[PDF]](https://proceedings.neurips.cc/paper/2018/file/3de568f8597b94bda53149c7d7f5958c-Paper.pdf)\n\nRobust Locally-Linear Controllable Embedding\u003cbr\u003e\nby Banijamali et al., AISTATS 2018\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v84/banijamali18a/banijamali18a.pdf)\n\nLearning Latent Dynamics for Planning from Pixels\u003cbr\u003e\nby Hafner et al., ICML 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1811.04551.pdf)\n\nPlanning with Diffusion for Flexible Behavior Synthesis\u003cbr\u003e\nby Janner et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/janner22a.html)\n\nA Hierarchical Bayesian Approach to Inverse Reinforcement Learning with Symbolic Reward Machines\u003cbr\u003e\nby Zhou et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/zhou22b/zhou22b.pdf)\n\n## BDL and Graphs (Link Prediction, Graph Neural Networks, Knowledge Graphs, etc.)\n\nRelational Deep Learning: A Deep Latent Variable Model for Link Prediction\u003cbr\u003e\nby Wang et al., AAAI 2017\u003cbr\u003e\n[[PDF]](https://www.aaai.org/ocs/index.php/AAAI/AAAI17/paper/download/14346/14463)\n\nKnow-Evolve: Deep Temporal Reasoning for Dynamic Knowledge Graphs\u003cbr\u003e\nby Trivedi et al., ICML 2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1705.05742.pdf)\n\nGraphite: Iterative Generative Modeling of Graphs\u003cbr\u003e\nby Grover et al., ICML 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1803.10459.pdf)\n\nRelational Variational Autoencoder for Link Prediction with Multimedia Data\u003cbr\u003e\nby Li et al., ACM MM 
2017\u003cbr\u003e\n[[PDF]](https://dl.acm.org/citation.cfm?id=3126774)\n\nStochastic Blockmodels meet Graph Neural Networks\u003cbr\u003e\nby Mehta et al., ICML 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1905.05738.pdf)\n\nScalable Deep Generative Modeling for Sparse Graphs\u003cbr\u003e\nby Dai et al., ICML 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2006.15502.pdf)\n\nPGM-Explainer: Probabilistic Graphical Model Explanations for Graph Neural Networks\u003cbr\u003e\nby Vu et al., NIPS 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2010.05788.pdf)\n\nDirichlet Graph Variational Autoencoder\u003cbr\u003e\nby Li et al., NIPS 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2010.04408.pdf)\n\nBeta Embeddings for Multi-Hop Logical Reasoning in Knowledge Graphs\u003cbr\u003e\nby Ren et al., NIPS 2020\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2010.11465.pdf)\n\nGeoDiff: a Geometric Diffusion Model for Molecular Conformation Generation\u003cbr\u003e\nby Xu et al., ICLR 2022\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2203.02923.pdf)\n\nScore-based Generative Modeling of Graphs via the System of Stochastic Differential Equations\u003cbr\u003e\nby Jo et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/jo22a.html)\n\nEquivariant Diffusion for Molecule Generation in 3D\u003cbr\u003e\nby Hoogeboom et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/hoogeboom22a.html)\n\nLIMO: Latent Inceptionism for Targeted Molecule Generation\u003cbr\u003e\nby Eckmann et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/eckmann22a/eckmann22a.pdf)\n\n3DLinker: An E(3) Equivariant Variational Autoencoder for Molecular Linker Design\u003cbr\u003e\nby Huang et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/huang22g/huang22g.pdf)\n\nCrystal Diffusion Variational Autoencoder for Periodic Material Generation\u003cbr\u003e\nby Xie et al., ICLR 
2022\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=03RLpj-tc_)\n\nOrphicX: A Causality-Inspired Latent Variable Model for Interpreting Graph Neural Networks\u003cbr\u003e\nby Lin et al., CVPR 2022\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/CVPR22_OrphicX.pdf)\n\n## BDL and Topic Modeling\n\nRelational Stacked Denoising Autoencoder for Tag Recommendation\u003cbr\u003e\nby Wang et al., AAAI 2015\u003cbr\u003e\n[[PDF]](https://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/download/9350/9980)\n\nScalable Deep Poisson Factor Analysis for Topic Modeling\u003cbr\u003e\nby Gan et al., ICML 2015\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v37/gan15.html)\n\nDeep Latent Dirichlet Allocation with Topic-layer-adaptive Stochastic Gradient Riemannian MCMC\u003cbr\u003e\nby Cong et al., ICML 2017\u003cbr\u003e\n[[PDF]](https://dl.acm.org/citation.cfm?id=3305471)\n\nDeep Unfolding for Topic Models\u003cbr\u003e\nby Chien et al., TPAMI 2017\u003cbr\u003e\n[[PDF]](https://ieeexplore.ieee.org/abstract/document/7869412/)\n\nNeural Relational Topic Models for Scientific Article Analysis\u003cbr\u003e\nby Bai et al., CIKM 2018\u003cbr\u003e\n[[PDF]](https://dl.acm.org/citation.cfm?id=3271696)\n\nDirichlet Belief Networks for Topic Structure Learning\u003cbr\u003e\nby Zhao et al., NIPS 2018\u003cbr\u003e\n[[PDF]](http://papers.nips.cc/paper/8020-dirichlet-belief-networks-for-topic-structure-learning)\n\nDeep Relational Topic Modeling via Graph Poisson Gamma Belief Network\u003cbr\u003e\nby Wang et al., NIPS 2020\u003cbr\u003e\n[[PDF]](https://proceedings.neurips.cc//paper/2020/hash/05ee45de8d877c3949760a94fa691533-Abstract.html)\n\nSawtooth Factorial Topic Embeddings Guided Gamma Belief Network\u003cbr\u003e\nby Duan et al., ICML 2021\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v139/duan21b/duan21b.pdf)\n\nPoisson-Randomised DirBN: Large Mutation is Needed in Dirichlet Belief Networks\u003cbr\u003e\nby Fan et al., ICML 
2021\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v139/fan21a/fan21a.pdf)\n\nTorsional Diffusion for Molecular Conformer Generation\u003cbr\u003e\nby Jing et al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/pdf?id=w6fj2r62r_H)\n\nKnowledge-Aware Bayesian Deep Topic Model\u003cbr\u003e\nby Wang et al., NIPS 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/forum?id=N2AGw9s-wvX)\n\n## BDL and Speech Recognition/Synthesis\n\nUnsupervised Learning of Disentangled and Interpretable Representations from Sequential Data\u003cbr\u003e\nby Hsu et al., NIPS 2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1709.07902.pdf)\n\nScalable Factorized Hierarchical Variational Autoencoder Training\u003cbr\u003e\nby Hsu et al., Interspeech 2018\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1804.03201.pdf)\n\nHierarchical Generative Modeling for Controllable Speech Synthesis\u003cbr\u003e\nby Hsu et al., ICLR 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1810.07217.pdf)\n\nRecurrent Poisson Process Unit for Speech Recognition\u003cbr\u003e\nby Huang et al., AAAI 2019\u003cbr\u003e\n[[PDF]](https://pdfs.semanticscholar.org/4970/fa3189cd9a9c817ba72082e2f3d5fc9a7df1.pdf)\n\nDeep Graph Random Process for Relational-thinking-based Speech Recognition\u003cbr\u003e\nby Huang et al., ICML 2020\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/ICML20_DGP.pdf)\n\nDiffWave: A Versatile Diffusion Model for Audio Synthesis\u003cbr\u003e\nby Kong et al., ICLR 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/abs/2009.09761)\n\nWaveGrad: Estimating Gradients for Waveform Generation\u003cbr\u003e\nby Chen et al., ICLR 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2009.00713.pdf)\n\nGrad-TTS: A Diffusion Probabilistic Model for Text-to-Speech\u003cbr\u003e\nby Popov et al., ICML 2021\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/2105.06337.pdf)\n\nSTRODE: Stochastic Boundary Ordinary Differential Equation\u003cbr\u003e\nby Huang et al., ICML 
2021\u003cbr\u003e\n[[PDF]](http://www.wanghao.in/paper/ICML21_STRODE.pdf)\n\nGuided-TTS: A Diffusion Model for Text-to-Speech via Classifier Guidance\u003cbr\u003e\nby Kim et al., ICML 2022\u003cbr\u003e\n[[PDF]](https://proceedings.mlr.press/v162/kim22d.html)\n\nDiffusion-Based Voice Conversion with Fast Maximum Likelihood Sampling Scheme\u003cbr\u003e\nby Popov et al., ICLR 2022\u003cbr\u003e\n[[PDF]](https://openreview.net/forum?id=8c50f-DoWAu)\n\nBDDM: Bilateral Denoising Diffusion Models for Fast and High-Quality Speech Synthesis\u003cbr\u003e\nby Lam et al., ICLR 2022\u003cbr\u003e\n[[PDF]](https://iclr.cc/virtual/2022/poster/6010)\n\nUnsupervised Mismatch Localization in Cross-Modal Sequential Data with Application to Mispronunciations Localization\u003cbr\u003e\nby Wei et al., TMLR 2022\u003cbr\u003e\n[[PDF]](http://wanghao.in/paper/TMLR22_ML-VAE.pdf)\n\n\n## BDL and Forecasting (Time Series Analysis)\n\nDeepAR: Probabilistic Forecasting with Autoregressive Recurrent Networks\u003cbr\u003e\nby Salinas et al., 2017\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1704.04110.pdf)\n\nDeep State Space Models for Time Series Forecasting\u003cbr\u003e\nby Rangapuram et al., NIPS 2018\u003cbr\u003e\n[[PDF]](https://papers.nips.cc/paper/8004-deep-state-space-models-for-time-series-forecasting.pdf)\n\nDeep Factors for Forecasting\u003cbr\u003e\nby Wang et al., ICML 2019\u003cbr\u003e\n[[PDF]](https://arxiv.org/pdf/1905.12417.pdf)\n\nProbabilistic Forecasting with Spline Quantile Function RNNs\u003cbr\u003e\nby Gasthaus et al., AISTATS 2019\u003cbr\u003e\n[[PDF]](http://proceedings.mlr.press/v89/gasthaus19a/gasthaus19a.pdf)\n\nAdversarial Attacks on Probabilistic Autoregressive Forecasting Models\u003cbr\u003e\nby Dang-Nhu et al., ICML 2020\u003cbr\u003e\n[[PDF]](https://proceedings.icml.cc/static/paper_files/icml/2020/526-Paper.pdf)\n\nNeural Jump Stochastic Differential Equations\u003cbr\u003e\nby Jia et al., NIPS 
2019<br>
[[PDF]](https://arxiv.org/pdf/1905.10403.pdf)

Segmenting Hybrid Trajectories using Latent ODEs<br>
by Shi et al., ICML 2021<br>
[[PDF]](https://arxiv.org/pdf/2105.03835.pdf)

RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting<br>
by Pal et al., ICML 2021<br>
[[PDF]](http://proceedings.mlr.press/v139/pal21b/pal21b.pdf)

End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series<br>
by Rangapuram et al., ICML 2021<br>
[[PDF]](http://proceedings.mlr.press/v139/rangapuram21a/rangapuram21a.pdf)

Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting<br>
by Rasul et al., ICML 2021<br>
[[PDF]](http://proceedings.mlr.press/v139/rasul21a/rasul21a.pdf)

Deep Explicit Duration Switching Models for Time Series<br>
by Ansari et al., NIPS 2021<br>
[[PDF]](https://arxiv.org/pdf/2110.13878.pdf)

CSDI: Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation<br>
by Tashiro et al., NIPS 2021<br>
[[PDF]](https://arxiv.org/pdf/2107.03502.pdf)

TACTiS: Transformer-Attentional Copulas for Time Series<br>
by Drouin et al., ICML 2022<br>
[[PDF]](https://proceedings.mlr.press/v162/drouin22a/drouin22a.pdf)

Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series<br>
by Kramer et al., ICML 2022<br>
[[PDF]](https://proceedings.mlr.press/v162/kramer22a/kramer22a.pdf)

Deep Variational Graph Convolutional Recurrent Network for Multivariate Time Series Anomaly Detection<br>
by Chen et al., ICML
2022<br>
[[PDF]](https://proceedings.mlr.press/v162/chen22x.html)

Vector Quantized Time Series Generation with a Bidirectional Prior Model<br>
by Lee et al., AISTATS 2023<br>
[[PDF]](https://arxiv.org/pdf/2303.04743.pdf)

Self-Interpretable Time Series Prediction with Counterfactual Explanations<br>
by Yan et al., ICML 2023<br>
[[PDF]](http://wanghao.in/paper/ICML23_CounTS.pdf) [Cross Referenced in [BDL and Healthcare](https://github.com/js05212/BayesianDeepLearning-Survey/blob/master/README.md#bdl-and-healthcare)]

CauDiTS: Causal Disentangled Domain Adaptation of Multivariate Time Series<br>
by Lu et al., ICML 2024<br>
[[PDF]](https://proceedings.mlr.press/v235/lu24i.html)

## BDL and Distributed/Federated Learning

Stochastic Expectation Propagation<br>
by Li et al., NIPS 2015<br>
[[PDF]](https://papers.nips.cc/paper/2015/file/f3bd5ad57c8389a8a1a541a76be463bf-Paper.pdf)

## BDL and AI4Science

Dirichlet Flow Matching with Applications to DNA Sequence Design<br>
by Stark et al., ICML 2024<br>
[[PDF]](https://arxiv.org/pdf/2402.05841)

Particle Guidance: non-I.I.D.
Diverse Sampling with Diffusion Models<br>
by Corso et al., ICLR 2024<br>
[[PDF]](https://openreview.net/pdf?id=KqbCvIFBY7)

## BDL and Continual/Life-Long Learning

Continual Learning with Deep Generative Replay<br>
by Shin et al., NIPS 2017<br>
[[PDF]](https://proceedings.neurips.cc/paper/2017/file/0efbe98067c6c73dba1250d2beaa81f9-Paper.pdf)

Life-Long Disentangled Representation Learning with Cross-Domain Latent Homologies<br>
by Achille et al., NIPS 2018<br>
[[PDF]](https://arxiv.org/pdf/1808.06508.pdf)

Continual Unsupervised Representation Learning<br>
by Rao et al., NIPS 2019<br>
[[PDF]](https://arxiv.org/pdf/1910.14481.pdf)

Learning Latent Representations Across Multiple Data Domains Using Lifelong VAEGAN<br>
by Ye et al., ECCV 2020<br>
[[PDF]](https://dl.acm.org/doi/abs/10.1007/978-3-030-58565-5_46)

A Neural Dirichlet Process Mixture Model for Task-Free Continual Learning<br>
by Lee et al., ICLR 2020<br>
[[PDF]](https://arxiv.org/abs/2001.00689)

## BDL as a Framework (Miscellaneous)

Towards Bayesian Deep Learning: A Framework and Some Existing Methods<br>
by Wang et al., TKDE 2016<br>
[[PDF]](https://arxiv.org/abs/1608.06884)

Composing Graphical Models with Neural Networks for Structured Representations and Fast Inference<br>
by Johnson et al., NIPS 2016<br>
[[PDF]](https://arxiv.org/abs/1603.06277)

Energy-Based Concept Bottleneck Models: Unifying Prediction, Concept Intervention, and Probabilistic Interpretations<br>
by Xu et al., ICLR 2024<br>
[[PDF]](http://wanghao.in/paper/ICLR24_ECBM.pdf)

## Bayesian/Probabilistic Neural Networks as Building Blocks of BDL

Learning Stochastic Feedforward Networks<br>
by Neal et al., Technical Report 1990<br>
[[PDF]](https://www.cs.toronto.edu/~hinton/absps/sff.pdf)

A
Practical Bayesian Framework for Backprop Networks<br>
by MacKay et al., Neural Computation 1992<br>
[[PDF]](https://pdfs.semanticscholar.org/b0f2/433c088591d265891231f1c22424047f1bc1.pdf)

Keeping Neural Networks Simple by Minimizing the Description Length of the Weights<br>
by Hinton et al., COLT 1993<br>
[[PDF]](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.44.3435)

Bayesian Learning via Stochastic Gradient Langevin Dynamics<br>
by Welling et al., ICML 2011<br>
[[PDF]](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.441.3813&rep=rep1&type=pdf)

Practical Variational Inference for Neural Networks<br>
by Alex Graves, NIPS 2011<br>
[[PDF]](https://papers.nips.cc/paper/4329-practical-variational-inference-for-neural-networks)

Auto-Encoding Variational Bayes<br>
by Kingma et al., ArXiv 2014<br>
[[PDF]](https://arxiv.org/pdf/1312.6114.pdf) [[Code]](https://github.com/AntixK/PyTorch-VAE)

Deep Exponential Families<br>
by Ranganath et al., AISTATS 2015<br>
[[PDF]](https://arxiv.org/abs/1411.2581)

Weight Uncertainty in Neural Networks<br>
by Blundell et al., ICML 2015<br>
[[PDF]](https://arxiv.org/abs/1505.05424)

Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks<br>
by Hernandez-Lobato et al., ICML 2015<br>
[[PDF]](http://proceedings.mlr.press/v37/hernandez-lobatoc15.pdf)

Variational Dropout and the Local Reparameterization Trick<br>
by Kingma et al., NIPS 2015<br>
[[PDF]](https://arxiv.org/pdf/1506.02557.pdf)

The Poisson Gamma Belief Network<br>
by Zhou et al., NIPS 2015<br>
[[PDF]](http://papers.nips.cc/paper/5645-the-poisson-gamma-belief-network)

Deep Poisson Factor Modeling<br>
by Henao et al., NIPS
2015<br>
[[PDF]](http://papers.nips.cc/paper/5786-deep-poisson-factor-modeling)

Natural-Parameter Networks: A Class of Probabilistic Neural Networks<br>
by Wang et al., NIPS 2016<br>
[[PDF]](http://wanghao.in/paper/NIPS16_NPN.pdf) [[Project Page]](https://github.com/js05212/NPN) [[Code]](https://github.com/js05212/NPN)

Adversarial Variational Bayes: Unifying Variational Autoencoders and Generative Adversarial Networks<br>
by Mescheder et al., ICML 2017<br>
[[PDF]](https://arxiv.org/pdf/1701.04722.pdf)

Stick-Breaking Variational Autoencoders<br>
by Nalisnick et al., ICLR 2017<br>
[[PDF]](https://openreview.net/forum?id=S1jmAotxg)

Bayesian GAN<br>
by Saatchi et al., NIPS 2017<br>
[[PDF]](https://arxiv.org/abs/1705.09558)

Neural Expectation Maximization<br>
by Greff et al., NIPS 2017<br>
[[PDF]](https://papers.nips.cc/paper/7246-neural-expectation-maximization.pdf)

Lightweight Probabilistic Deep Networks<br>
by Gast et al., CVPR 2018<br>
[[PDF]](http://openaccess.thecvf.com/content_cvpr_2018/html/Gast_Lightweight_Probabilistic_Deep_CVPR_2018_paper.html)

Feed-forward Propagation in Probabilistic Neural Networks with Categorical and Max Layers<br>
by Shekhovtsov et al., ICLR 2018<br>
[[PDF]](https://openreview.net/forum?id=SkMuPjRcKQ)

Glow: Generative Flow with Invertible 1x1 Convolutions<br>
by Kingma et al., NIPS 2018<br>
[[PDF]](https://papers.nips.cc/paper/8224-glow-generative-flow-with-invertible-1x1-convolutions.pdf)

Evidential Deep Learning to Quantify Classification Uncertainty<br>
by Sensoy et al., NIPS 2018<br>
[[PDF]](https://papers.nips.cc/paper_files/paper/2018/file/a981f2b708044d6fb4a71a1463242520-Paper.pdf)

ProbGAN: Towards Probabilistic GAN with Theoretical Guarantees<br>
by He et al., ICLR
2019<br>
[[PDF]](http://wanghao.in/paper/ICLR19_ProbGAN.pdf) [[Project Page]](https://github.com/hehaodele/ProbGAN)

Sampling-free Epistemic Uncertainty Estimation Using Approximated Variance Propagation<br>
by Postels et al., ICCV 2019<br>
[[PDF]](https://arxiv.org/abs/1908.00598)

Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors<br>
by Dusenberry et al., ICML 2020<br>
[[PDF]](https://proceedings.icml.cc/static/paper_files/icml/2020/5657-Paper.pdf)

Neural Clustering Processes<br>
by Pakman et al., ICML 2020<br>
[[PDF]](https://proceedings.icml.cc/static/paper_files/icml/2020/3997-Paper.pdf)

Being Bayesian, Even Just a Bit, Fixes Overconfidence in ReLU Networks<br>
by Kristiadi et al., ICML 2020<br>
[[PDF]](http://proceedings.mlr.press/v119/kristiadi20a/kristiadi20a.pdf)

Activation-level Uncertainty in Deep Neural Networks<br>
by Morales-Alvarez et al., ICLR 2021<br>
[[PDF]](https://openreview.net/pdf/6d7935927e30fe5bf2be87f8e871229560145392.pdf)

Bayesian Deep Learning via Subnetwork Inference<br>
by Daxberger et al., ICML 2021<br>
[[PDF]](http://proceedings.mlr.press/v139/daxberger21a/daxberger21a.pdf)

On the Pitfalls of Heteroscedastic Uncertainty Estimation with Probabilistic Neural Networks<br>
by Seitzer et al., ICLR 2022<br>
[[PDF]](https://openreview.net/pdf?id=aPOpXlnV1T)

Evidential Turing Processes<br>
by Kandemir et al., ICLR 2022<br>
[[PDF]](https://openreview.net/pdf?id=84NMXTHYe-)

How Tempering Fixes Data Augmentation in Bayesian Neural Networks<br>
by Bachmann et al., ICML 2022<br>
[[PDF]](https://proceedings.mlr.press/v162/bachmann22a/bachmann22a.pdf)

SIMPLE: A Gradient Estimator for k-Subset Sampling<br>
by Ahmed et al., ICLR
2023<br>
[[PDF]](https://openreview.net/forum?id=GPJVuyX4p_h)

Collapsed Inference for Bayesian Deep Learning<br>
by Zeng et al., NeurIPS 2023<br>
[[PDF]](https://arxiv.org/pdf/2306.09686.pdf)

Variational Imbalanced Regression: Fair Uncertainty Quantification via Probabilistic Smoothing<br>
by Wang et al., NeurIPS 2023<br>
[[PDF]](http://www.wanghao.in/paper/NIPS23_VIR.pdf)

BLoB: Bayesian Low-Rank Adaptation by Backpropagation for Large Language Models<br>
by Wang et al., NeurIPS 2024<br>
[[PDF]](https://arxiv.org/pdf/2406.11675)
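As a quick taste of one of the building blocks listed in this section, below is a minimal sketch of the stochastic gradient Langevin dynamics (SGLD) update from "Bayesian Learning via Stochastic Gradient Langevin Dynamics" (Welling et al., ICML 2011), applied to a toy Gaussian-mean model rather than a deep network. The toy model, step size, and all variable names are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: x_i ~ N(mu, 1) with a weak prior mu ~ N(0, 10); infer mu.
N = 1000
data = rng.normal(2.0, 1.0, size=N)

def grad_log_prior(mu, prior_var=10.0):
    # d/dmu of log N(mu; 0, prior_var)
    return -mu / prior_var

def grad_log_lik(mu, batch):
    # d/dmu of sum_i log N(x_i; mu, 1) over a minibatch
    return np.sum(batch - mu)

# SGLD: mu += (eps/2) * (prior grad + (N/n) * minibatch likelihood grad) + N(0, eps)
eps = 1e-4   # step size (also the injected-noise variance)
n = 100      # minibatch size
mu = 0.0
samples = []
for t in range(5000):
    batch = rng.choice(data, size=n, replace=False)
    grad = grad_log_prior(mu) + (N / n) * grad_log_lik(mu, batch)
    mu += 0.5 * eps * grad + rng.normal(0.0, np.sqrt(eps))
    if t >= 1000:           # discard burn-in iterations
        samples.append(mu)

posterior_mean = np.mean(samples)
```

With this weak prior, the retained `samples` approximate the posterior over `mu`, so `posterior_mean` lands close to the data mean; the same update rule, applied per-parameter, is what turns ordinary SGD into approximate posterior sampling for neural network weights.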