2016 DropoutAsaBayesianApproximation


Subject Headings: Dropout_(neural_networks).

Notes

Cited By

Quotes

Abstract

Deep learning tools have gained tremendous attention in applied machine learning. However, such tools for regression and classification do not capture model uncertainty. In comparison, Bayesian models offer a mathematically grounded framework to reason about model uncertainty, but usually come with a prohibitive computational cost. In this paper we develop a new theoretical framework casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes. A direct result of this theory gives us tools to model uncertainty with dropout NNs, extracting information from existing models that has so far been thrown away. This mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy. We perform an extensive study of the properties of dropout's uncertainty. Various network architectures and nonlinearities are assessed on tasks of regression and classification, using MNIST as an example. We show a considerable improvement in predictive log-likelihood and RMSE compared to existing state-of-the-art methods, and finish by using dropout's uncertainty in deep reinforcement learning.
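
The practical recipe implied by the abstract, often called Monte Carlo (MC) dropout, is to keep dropout active at test time and average over several stochastic forward passes; the sample mean approximates the predictive mean and the sample variance gives a model-uncertainty estimate. Below is a minimal sketch in PyTorch, not the authors' reference implementation; the model architecture, the dropout rate, and the names MCDropoutNet and mc_dropout_predict are illustrative assumptions.

```python
# A minimal sketch of Monte Carlo dropout for uncertainty estimation.
# Hypothetical model and data, assumed for illustration only.
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    """Small regression net whose dropout layers we keep active at test time."""
    def __init__(self, in_dim=1, hidden=50, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def mc_dropout_predict(model, x, n_samples=100):
    """Approximate the predictive mean and variance from n_samples
    stochastic forward passes with dropout left on."""
    model.train()  # keep dropout sampling active at prediction time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.var(dim=0)

model = MCDropoutNet()
x = torch.linspace(-3, 3, 20).unsqueeze(1)
mean, var = mc_dropout_predict(model, x)
print(mean.shape, var.shape)  # torch.Size([20, 1]) torch.Size([20, 1])
```

Note the design choice in mc_dropout_predict: calling model.train() before prediction is what keeps the dropout masks stochastic, which is exactly the sampling the paper's theory interprets as approximate posterior inference; an ordinary model.eval() call would disable dropout and collapse the samples to a single deterministic output.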

References

(Gal & Ghahramani, 2016) ⇒ Yarin Gal, and Zoubin Ghahramani. (2016). "Dropout As a Bayesian Approximation: Representing Model Uncertainty in Deep Learning." In: Proceedings of the 33rd International Conference on Machine Learning (ICML 2016).