Just reading this, worth thinking about. Technical.
"Neural networks are fundamentally Bayesian", Towards Data Science, by Chris Mingard
Stochastic Gradient Descent approximates Bayesian sampling
Deep neural networks (DNNs) have been extraordinarily successful in many different situations — from image recognition and playing chess to driving cars and making medical diagnoses. However, in spite of this success, a good theoretical understanding of why they generalise (learn) so well is still lacking.
In this post, we summarise results from three papers, which provide a candidate for a theory of generalisation in DNNs [1,2,3]. …
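The claim in the article's subtitle can be illustrated on a toy problem. The sketch below (my own construction, not code from the papers; the architecture, hyperparameters, and target function are all assumptions) compares two ways of producing a network that fits a small training set: rejection-sampling random parameters from a Gaussian prior (a crude Bayesian posterior sample) and running gradient descent from random initialisations. In both cases a trained network is identified by its thresholded outputs on all inputs, so one can compare how often each distinct function turns up under each procedure.

```python
# Toy comparison of "Bayesian" rejection sampling vs gradient descent
# on a 3-bit Boolean problem. Everything here (net size, learning rate,
# target function) is an illustrative assumption, not the papers' setup.
import numpy as np

rng = np.random.default_rng(0)

# All 8 three-bit inputs; train on the first 5, generalise to all 8.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], float)
y = (X.sum(axis=1) >= 2).astype(float)   # target: majority function
N_TRAIN, H = 5, 8                        # training points, hidden units

def predict(W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))

def function_id(params):
    # A network's "function" = its thresholded outputs on all 8 inputs.
    return tuple((predict(*params) > 0.5).astype(int))

def sample_params():
    return [rng.normal(size=(3, H)), rng.normal(size=H),
            rng.normal(size=H), rng.normal()]

def fits_train(fid):
    return all(fid[i] == int(y[i]) for i in range(N_TRAIN))

# Bayesian side: sample parameters from the prior, keep only those
# whose network fits the training data exactly (rejection sampling).
bayes_counts = {}
for _ in range(20000):
    fid = function_id(sample_params())
    if fits_train(fid):
        bayes_counts[fid] = bayes_counts.get(fid, 0) + 1

# Gradient-descent side: full-batch descent on MSE from a random init.
def run_gd(steps=2000, lr=0.5):
    W1, b1, W2, b2 = sample_params()
    Xt, yt = X[:N_TRAIN], y[:N_TRAIN]
    for _ in range(steps):
        h = np.tanh(Xt @ W1 + b1)
        p = 1 / (1 + np.exp(-(h @ W2 + b2)))
        g = (p - yt) * p * (1 - p)              # dMSE/dlogit, up to a constant
        gh = np.outer(g, W2) * (1 - h ** 2)     # backprop through tanh layer
        W2 -= lr * (h.T @ g); b2 -= lr * g.sum()
        W1 -= lr * (Xt.T @ gh); b1 -= lr * gh.sum(axis=0)
    return function_id([W1, b1, W2, b2])

gd_counts = {}
for _ in range(200):
    fid = run_gd()
    if fits_train(fid):                          # keep runs that fit the data
        gd_counts[fid] = gd_counts.get(fid, 0) + 1

print("distinct zero-error functions found by sampling:", len(bayes_counts))
print("distinct zero-error functions found by GD:      ", len(gd_counts))
```

Under the papers' thesis, the frequency with which gradient descent finds each zero-training-error function should roughly track its frequency under the prior sample; with this toy setup one can inspect `bayes_counts` and `gd_counts` side by side to see how closely the two distributions agree.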