Old 08-09-2012, 03:03 AM
yaser
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,478
Default Re: Question on Bias Variance Tradeoff

Originally Posted by hashable View Post
Thanks for the quick reply. I have some follow-up questions regarding variance.

How does increasing bias artificially in this way (by choosing a pathological hypothesis space) affect the variance?

Variance appears to depend only on g^{(D)} and \bar{g}, and to be independent of f. Perhaps it could be considered to depend on f indirectly, to the extent that each g^{(D)} tries to approximate f. So is the variance affected by whether \bar{g} is close to f or not?

Is it possible to increase complexity/hypothesis-set size without increasing the variance? It is not obvious that this is impossible, although the intuitive explanation is that a larger hypothesis set will result in a larger variance.
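For reference, the decomposition being discussed (in the course's usual notation, where \bar{g}(x) = \mathbb{E}_D[g^{(D)}(x)] is the average hypothesis) is:

```latex
\mathbb{E}_D\!\left[E_{\text{out}}\!\left(g^{(D)}\right)\right]
  = \underbrace{\mathbb{E}_x\!\left[\left(\bar{g}(x) - f(x)\right)^2\right]}_{\text{bias}}
  \;+\;
  \underbrace{\mathbb{E}_x\!\left[\mathbb{E}_D\!\left[\left(g^{(D)}(x) - \bar{g}(x)\right)^2\right]\right]}_{\text{var}}
```

The variance term contains no explicit f, but it does depend on f indirectly, since f (together with the hypothesis set and the learning algorithm) determines the distribution of the fitted g^{(D)}.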
The dependency of the variance on the target is, as you point out, more complicated. For instance, if you try to learn a constant target function, most models will converge with little variance, whereas a more complex target will result in bigger variance with the same models. The intuition behind bias and variance is valid in a lot of situations, but may not be valid in some.
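As a quick sanity check of this point, here is a small simulation sketch (my own illustration, not from the lectures): the same constant-hypothesis model h(x) = b is fit by least squares to a constant target and to a sinusoidal target, and the variance term is estimated empirically. The helper name and settings are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_variance(target, n_datasets=2000, n_points=2):
    """Estimate the variance term for the constant model h(x) = b,
    where the least-squares fit is b = mean of the observed y-values.
    (Hypothetical helper, for illustration only.)"""
    # Each g^(D) is a constant, so one number per dataset suffices.
    fits = np.empty(n_datasets)
    for i in range(n_datasets):
        x = rng.uniform(-1.0, 1.0, n_points)   # dataset D of n_points samples
        fits[i] = target(x).mean()             # least-squares constant fit
    g_bar = fits.mean()                        # average hypothesis (a constant)
    # var = E_x[ E_D[(g^(D)(x) - g_bar(x))^2] ]; constants, so no x-dependence
    return ((fits - g_bar) ** 2).mean()

var_simple = estimate_variance(lambda x: np.zeros_like(x))    # constant target f(x) = 0
var_complex = estimate_variance(lambda x: np.sin(np.pi * x))  # complex target
print(var_simple, var_complex)
```

With the constant target every dataset produces the same fit, so the estimated variance is exactly zero; with the sinusoidal target the fits scatter around zero and the variance comes out near 0.25, even though the model is identical in both runs.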
Where everyone thinks alike, no one thinks very much