02-18-2013, 11:57 PM
yaser
Re: On Bayesian learning

Thank you for opening this discussion.

Whether a prior can legitimately model an unknown quantity is a key question. Not all situations that involve an unknown quantity are probabilistic. While that claim can be debated either way in a practical situation, there are instances where it is self-evident. Take Chaitin's number \Omega: it provably exists and is unique (for a given universal Turing machine), yet it also provably cannot be computed. It provides a case where the prior suggested in the lecture (and articulated by you in terms of a hyperparameter) is patently the only correct one.

Some may view the hyperparameter approach as a legitimate way of fitting the situation into a probabilistic setup, while others may view it as "passing the buck": the notion of unknown is merely shifted to the hyperparameter, rendering the prior itself effectively meaningless. Regardless of one's view on this matter, what is clear is that equating being unknown with having a uniform prior, which seems to be common practice in the Bayesian world, is fundamentally flawed.
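A standard illustration of that flaw (my own sketch, not from the post) is that a uniform prior is not a parameterization-free statement of ignorance: if total ignorance about a quantity p is encoded as a uniform prior on p, then the induced prior on the equally legitimate reparameterization q = p^2 is far from uniform, so "unknown" cannot simply mean "uniform".

```python
import random

# Sketch: put a uniform prior on p over (0, 1), claiming ignorance about p.
# The same ignorance should apply to q = p**2, but the induced prior on q
# is not uniform, so "uniform" is a substantive assumption, not ignorance.

random.seed(0)
samples_p = [random.random() for _ in range(100_000)]  # p ~ Uniform(0, 1)
samples_q = [p ** 2 for p in samples_p]                # induced distribution of q

# Under a uniform prior on q, P(q < 0.25) would be 0.25.
# Under the prior induced from uniform p, P(q < 0.25) = P(p < 0.5) = 0.5.
frac = sum(q < 0.25 for q in samples_q) / len(samples_q)
print(f"P(q < 0.25) under the induced prior is about {frac:.3f}")
```

The printed fraction comes out near 0.5 rather than 0.25, showing that the two "ignorance" priors contradict each other.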
Where everyone thinks alike, no one thinks very much