Bayesian Augmentation of Deep Learning to Improve Video Classification
Document Type
Conference Proceeding
Publication Date
1-2022
Abstract
Traditional automated video classification methods lack measures of uncertainty, meaning the network cannot identify cases in which its predictions are made with significant uncertainty. This leads to misclassification, as a traditional network classifies every observation with the same degree of certainty, regardless of the observation. Bayesian neural networks remedy this issue by leveraging Bayesian inference to construct an uncertainty measure for each prediction. Because exact Bayesian inference is typically intractable due to the large number of parameters in a neural network, it is approximated by utilizing dropout in a convolutional neural network. This research compared a traditional video classification neural network to its Bayesian equivalent in terms of performance and capabilities. The Bayesian network achieves higher accuracy than a comparable non-Bayesian video network and further provides an uncertainty measure for each classification.
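The dropout-based approximation the abstract refers to is commonly realized as Monte Carlo dropout: dropout layers are kept active at inference time and the network is sampled repeatedly for each input, with the spread of the sampled predictions serving as the uncertainty measure. The sketch below is purely illustrative and is not the authors' implementation; the toy 3D-CNN architecture, the number of samples `T`, and the use of predictive entropy as the uncertainty measure are all assumptions for demonstration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallVideoCNN(nn.Module):
    """Toy 3D-CNN video classifier with dropout (hypothetical stand-in
    for the paper's network, for illustration only)."""
    def __init__(self, num_classes=10, p_drop=0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.dropout = nn.Dropout(p_drop)  # kept active at test time for MC dropout
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):  # x: (batch, channels=3, frames, height, width)
        h = self.features(x).flatten(1)
        return self.fc(self.dropout(h))

@torch.no_grad()
def mc_dropout_predict(model, x, T=30):
    """Run T stochastic forward passes with dropout enabled; return the
    mean class probabilities and predictive entropy per example."""
    model.train()  # keeps dropout sampling on (in real use, freeze batch-norm layers)
    probs = torch.stack(
        [F.softmax(model(x), dim=-1) for _ in range(T)]
    )  # shape: (T, batch, num_classes)
    mean_probs = probs.mean(dim=0)
    # Predictive entropy: high values flag classifications made with
    # significant uncertainty, which a traditional network cannot report.
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy

# Usage: flag low-confidence predictions instead of trusting every output.
model = SmallVideoCNN()
clips = torch.randn(4, 3, 8, 32, 32)  # 4 dummy clips of 8 frames each
mean_probs, uncertainty = mc_dropout_predict(model, clips)
print(mean_probs.argmax(dim=-1), uncertainty)
```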
Permanent Link
http://hdl.handle.net/10125/79596
Source Publication
Proceedings of the 55th Hawaii International Conference on System Sciences
Recommended Citation
Swize, E., Champagne, L., Cox, B., & Bihl, T. (2022, January). Bayesian augmentation of deep learning to improve video classification. Proceedings of the 55th Hawaii International Conference on System Sciences. http://hdl.handle.net/10125/79596
Comments
The "Link to Full Text" on this page will open or save the PDF of the conference paper, hosted at the conference website.
This is an open access conference paper published and distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited and is not altered, transformed, or built upon in any way.
Please fully cite the work and Creative Commons license in any reuse.