# deep-Bayesian-nonparametrics-papers
*The collection of papers about combining deep learning with Bayesian nonparametric approaches*

We coined the name "deep Bayesian nonparametrics" (DBNP) for the line of work bringing the fields of deep learning and Bayesian nonparametrics together. DBNP covers not only combining neural networks with stochastic processes in Bayesian modelling, but also leveraging effective structures from deep learning, such as convolution, recurrence, and deep hierarchies, in Bayesian nonparametric models; introducing nonparametric methods into the structural design of neural networks; and reinterpreting neural networks as Bayesian nonparametric models. Training methods designed for these models, especially approximate inference, are also of interest.

### Deep Gaussian Processes, Inference Algorithms and Applications
1. [Deep Gaussian Processes](https://arxiv.org/abs/1211.0358)
2. [Nested Variational Compression in Deep Gaussian Processes](https://arxiv.org/abs/1412.1370)
3. [Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation](https://arxiv.org/abs/1511.03405)
4. [Variational Auto-encoded Deep Gaussian Processes](https://arxiv.org/abs/1511.06455)
5. [Deep Gaussian Processes for Regression using Approximate Expectation Propagation](https://arxiv.org/abs/1602.04133)
6. [Random Feature Expansions for Deep Gaussian Processes](https://arxiv.org/abs/1610.04386)
7. [Doubly Stochastic Variational Inference for Deep Gaussian Processes](https://arxiv.org/abs/1705.08933)
8. [Deep Gaussian Processes with Decoupled Inducing Inputs](https://arxiv.org/abs/1801.02939)
9. [Deep Gaussian Processes with Convolutional Kernels](https://arxiv.org/abs/1806.01655)
10. [Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo](https://arxiv.org/abs/1806.05490)
11. [Efficient Global Optimization using Deep Gaussian Processes](http://arxiv.org/abs/1809.04632)
12. [Deep Convolutional Gaussian Processes](https://arxiv.org/abs/1810.03052)
13. [Deep Gaussian Processes for Multi-fidelity Modeling](http://www.eurecom.fr/en/publication/5755/download/comsys-publi-5755.pdf)
14. [Deep Gaussian Processes with Importance-Weighted Variational Inference](https://arxiv.org/abs/1905.05435)
15. [Compositional Uncertainty in Deep Gaussian Processes](https://arxiv.org/abs/1909.07698)
16. [Implicit Posterior Variational Inference for Deep Gaussian Processes](http://arxiv.org/abs/1910.11998)
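For orientation, here is a minimal NumPy sketch (all names are ours, not code from any paper above) of the construction these works build on: a deep GP composes layers of GP-distributed functions, so a two-layer prior sample is one GP draw evaluated at the outputs of another.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0):
    """Squared-exponential kernel between two sets of inputs [n, d]."""
    sq = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

def sample_gp(x, rng, jitter=1e-6):
    """Draw one function sample f ~ GP(0, k) at the inputs x."""
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 100)[:, None]   # original inputs
h = sample_gp(x, rng)[:, None]             # layer 1: h = f1(x)
y = sample_gp(h, rng)                      # layer 2: y = f2(f1(x))
```

The composition typically yields non-stationary, sharply varying functions that a single GP prior cannot express, which is precisely what makes posterior inference hard and motivates the algorithms in the papers above.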

### Reinterpretation of Neural Networks as Bayesian Nonparametric Models
1. [Deep Bayesian Neural Nets as Deep Matrix Gaussian Processes](https://drive.google.com/file/d/0Bx3kAuASMMrnTmIzV255S3laM1k/view)
2. [Deep Neural Networks as Gaussian Processes](http://arxiv.org/abs/1711.00165)
3. [Gaussian Process Behaviour in Wide Deep Neural Networks](http://arxiv.org/abs/1804.11271)
4. [Deep Convolutional Networks as Shallow Gaussian Processes](http://arxiv.org/abs/1808.05587)
5. [Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes](https://openreview.net/pdf?id=B1g30j0qF7)
6. [On the Connection between Neural Processes and Gaussian Processes with Deep Kernels](http://bayesiandeeplearning.org/2018/papers/128.pdf)
7. [Approximate Inference Turns Deep Networks into Gaussian Processes](https://arxiv.org/abs/1906.01930)
8. [Non-Gaussian Processes and Neural Networks at Finite Widths](https://openreview.net/forum?id=HygP3TVFvS&noteId=HygP3TVFvS)
9. [Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes](https://arxiv.org/abs/1910.12478)
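The correspondence these papers establish can be checked numerically. The sketch below (our own illustration, not code from any of the papers) compares the empirical output covariance of a wide random one-hidden-layer ReLU network against the closed-form first-order arc-cosine kernel, which is its infinite-width NNGP limit:

```python
import numpy as np

def empirical_nngp(x1, x2, width=100000, seed=0):
    """Monte Carlo NNGP estimate: output covariance of a random
    one-hidden-layer ReLU net with N(0, 1/fan_in) weights."""
    rng = np.random.default_rng(seed)
    d = len(x1)
    W = rng.normal(0.0, 1.0 / np.sqrt(d), size=(width, d))
    h1, h2 = np.maximum(W @ x1, 0.0), np.maximum(W @ x2, 0.0)
    # Readout weights have variance 1/width, so the output covariance
    # is the average product of the hidden activations.
    return (h1 * h2).mean()

def arccos_kernel(x1, x2):
    """Closed-form ReLU NNGP kernel (first-order arc-cosine kernel),
    with the same 1/fan_in input-weight scaling."""
    n1, n2 = np.linalg.norm(x1), np.linalg.norm(x2)
    theta = np.arccos(np.clip(x1 @ x2 / (n1 * n2), -1.0, 1.0))
    return n1 * n2 / (2.0 * np.pi * len(x1)) * (
        np.sin(theta) + (np.pi - theta) * np.cos(theta))

x1, x2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
mc, exact = empirical_nngp(x1, x2), arccos_kernel(x1, x2)
# mc and exact agree up to Monte Carlo error of order 1/sqrt(width)
```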

### Gaussian Processes with Neural-network-inspired Structures and Inference Algorithms
1. [Recurrent Gaussian Processes](https://arxiv.org/abs/1511.06644)
2. [Deep Recurrent Gaussian Process with Variational Sparse Spectrum Approximation](https://arxiv.org/abs/1711.00799)
3. [Convolutional Gaussian Processes](https://arxiv.org/abs/1709.01894)
4. [Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models](https://arxiv.org/abs/1906.05828)
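As one concrete example of a neural-network-inspired GP structure, a convolutional GP kernel applies a base kernel to sliding patches of the inputs and averages the patch responses. A minimal sketch for 1-D signals (all names ours, under the additive patch-response construction):

```python
import numpy as np

def patches(x, w=3):
    """All length-w sliding windows of a 1-D signal x."""
    return np.stack([x[i:i + w] for i in range(len(x) - w + 1)])

def rbf(a, b, lengthscale=1.0):
    """Base RBF kernel between two sets of patches [n, w]."""
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

def conv_kernel(x1, x2, w=3):
    """Additive convolutional kernel: the average of the base RBF
    kernel over all pairs of patches from the two signals."""
    return rbf(patches(x1, w), patches(x2, w)).mean()

x1 = np.sin(np.linspace(0.0, 2.0, 12))
x2 = np.cos(np.linspace(0.0, 2.0, 12))
```

Like convolution in a CNN, the patch structure gives translation-aware weight sharing, here at the level of the covariance function rather than learned filters.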

### Gaussian Processes Inputs Transformed by Deep Architecture
1. [Deep Kernel Learning](http://arxiv.org/abs/1511.02222)
2. [Learning Scalable Deep Kernels with Recurrent Structure](http://arxiv.org/abs/1610.08936)
3. [Stochastic Variational Deep Kernel Learning](http://arxiv.org/abs/1611.00336)
4. [Semi-supervised Deep Kernel Learning: Regression with Unlabeled Data by Minimizing Predictive Variance](http://arxiv.org/abs/1805.10407)
5. [Calibrating Deep Convolutional Gaussian Processes](https://arxiv.org/abs/1805.10522)
6. [Differentiable Compositional Kernel Learning for Gaussian Processes](http://arxiv.org/abs/1806.04326)
7. [Deep Learning with Differential Gaussian Process Flows](http://arxiv.org/abs/1810.04066)
8. [Finite Rank Deep Kernel Learning](http://bayesiandeeplearning.org/2018/papers/98.pdf)
9. [Adaptive Deep Kernel Learning](https://arxiv.org/abs/1905.12131)
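The common recipe in this section is a kernel of the form k_deep(x, x') = k_base(g(x), g(x')), where g is a neural network trained jointly with the GP hyperparameters. A minimal sketch with a fixed, untrained warping (names and layer sizes are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
# Tiny feature extractor g(x): one hidden tanh layer with random weights.
# In deep kernel learning these weights would be trained jointly with the
# GP hyperparameters by maximising the marginal likelihood; here they are
# fixed, which still yields a valid covariance function.
W1 = rng.normal(size=(1, 16))
W2 = rng.normal(size=(16, 2))

def g(x):
    return np.tanh(x @ W1) @ W2

def deep_kernel(x, y, lengthscale=1.0):
    """RBF base kernel applied to network-warped inputs: k(g(x), g(y))."""
    gx, gy = g(x), g(y)
    sq = ((gx[:, None, :] - gy[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

x = np.linspace(-2.0, 2.0, 50)[:, None]
K = deep_kernel(x, x)   # a valid covariance matrix for any fixed g
```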

### Bayesian Nonparametric Neural Latent Variable Models / Amortised Inference with Nonparametric Priors
1. [Stick-breaking Variational Autoencoders](http://arxiv.org/abs/1605.06197)
2. [Indian Buffet Process Deep Generative Models](http://vixra.org/pdf/1607.0073v2.pdf)
3. [Nonparametric Variational Autoencoders for Hierarchical Representation Learning](http://arxiv.org/abs/1703.07027)
4. [Nonparametric Bayesian Deep Networks with Local Competition](http://arxiv.org/abs/1805.07624)
5. [A Bayesian Nonparametric Topic Model with Variational Auto-encoders](https://openreview.net/pdf?id=SkxqZngC-)
6. [Deep Bayesian Nonparametric Tracking](http://www.columbia.edu/~jwp2128/Papers/ZhangPaisley2018.pdf)
7. [Gaussian Process Prior Variational Autoencoders](http://www.mit.edu/~adalca/files/papers/gppvae-arxiv-draft.pdf)
8. [Deep Generative Model with Beta Bernoulli Process for Modeling and Learning Confounding Factors](https://arxiv.org/pdf/1811.00073.pdf)
9. [Stick-breaking Neural Latent Variable Models](https://drive.google.com/file/d/1nJsEcTZ9bsAh2rdRWWu1QLLyMpXLBw4c/view)
10. [Deep Bayesian Nonparametric Factor Analysis](http://bayesiandeeplearning.org/2018/papers/131.pdf)
11. [Deep Factors with Gaussian Processes for Forecasting](http://bayesiandeeplearning.org/2018/papers/112.pdf)
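Several entries above (e.g. the stick-breaking VAE) place a truncated stick-breaking prior over latent mixture weights: v_k ~ Beta(1, α) and π_k = v_k ∏_{j<k}(1 − v_j), truncated at K components. A minimal sketch of sampling those weights (function name ours):

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights: v_k ~ Beta(1, alpha) and
    pi_k = v_k * prod_{j<k} (1 - v_j); setting v_K = 1 assigns the
    leftover stick to the last component so the weights sum to 1."""
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

rng = np.random.default_rng(0)
pi = stick_breaking(alpha=5.0, K=20, rng=rng)  # larger alpha -> flatter weights
```

In the VAE setting the Beta draws are replaced by a differentiable approximation (e.g. Kumaraswamy samples) so the construction can sit inside the reparameterised encoder.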

### Bayesian Nonparametric Neural Networks / Approximate Inference with Implicit Stochastic Processes as Priors
1. [Variational Implicit Processes](http://arxiv.org/abs/1806.02390)
2. [Functional Variational Bayesian Neural Networks](https://openreview.net/pdf?id=rkxacs0qY7)
3. [Functional Bayesian Neural Networks for Model Uncertainty Quantification](https://openreview.net/pdf?id=SJxFN3RcFX)
4. [Functional Space Particle Optimization for Bayesian Neural Networks](https://openreview.net/pdf?id=BkgtDsCcKQ)
5. [Characterizing and Warping the Function space of Bayesian Neural Networks](https://danielflamshep.github.io/158.pdf)

### Neural Networks Meta-Learning / Hyperparameter-tuning via Bayesian Nonparametric Approaches
1. [Mapping Gaussian Process Priors to Bayesian Neural Networks](http://bayesiandeeplearning.org/2017/papers/65.pdf)
2. [Nonparametric Bayesian Deep Networks with Local Competition](http://arxiv.org/abs/1805.07624)
3. [Neural Architecture Search with Bayesian Optimisation and Optimal Transport](https://arxiv.org/abs/1802.07191)
4. [Gaussian Process Neurons](https://openreview.net/pdf?id=By-IifZRW)
5. [Characterizing and Warping the Function space of Bayesian Neural Networks](https://danielflamshep.github.io/158.pdf)
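Bayesian optimisation with a GP surrogate, as used for architecture search above, alternates between fitting a GP to the evaluations collected so far and maximising an acquisition function such as expected improvement. A self-contained 1-D sketch on a toy objective (all names ours; real architecture search replaces the toy function and grid with a structured search space):

```python
import math
import numpy as np

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Exact GP posterior mean/std at test points Xs given data (X, y)."""
    L = np.linalg.cholesky(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    var = 1.0 - (v**2).sum(0)          # rbf(x, x) == 1
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    """EI for minimisation: E[max(best - f, 0)] under the posterior."""
    z = (best - mu) / sd
    cdf = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    pdf = np.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    return (best - mu) * cdf + sd * pdf

f = lambda x: np.sin(3.0 * x) + x**2   # toy objective to minimise
X = np.array([-1.0, 0.0, 1.0])         # initial design
y = f(X)
Xs = np.linspace(-2.0, 2.0, 200)       # candidate grid
for _ in range(10):                    # BO loop: fit, acquire, evaluate
    mu, sd = gp_posterior(X, y, Xs)
    x_next = Xs[np.argmax(expected_improvement(mu, sd, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
```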