02-09-2013, 02:01 PM
yaser (Caltech)
Re: lecture 8: understanding bias

Quote:
Originally Posted by ilya239
The VC dimension is a single number that is a property of the hypothesis set.
But, what is "bias of a hypothesis set"? Bias seems to depend also on dataset size and the learning algorithm, since it depends on \bar{g}(x) = \mathbb{E}_\mathcal{D}[g^{(\mathcal{D})}(x)]; g^{(\mathcal{D})}(x) depends on the learning algorithm, and the set of datasets over which the expectation is taken depends on dataset size.
Your observation is correct that the bias-variance analysis is not as general as the VC analysis. The bias does depend on the learning algorithm. It also depends on the number of examples, though usually only slightly.
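
For concreteness, these dependencies enter through the average hypothesis used in the decomposition (this is just the lecture's definition restated):

\text{bias} = \mathbb{E}_x\left[\left(\bar{g}(x) - f(x)\right)^2\right], \qquad \bar{g}(x) = \mathbb{E}_{\mathcal{D}}\left[g^{(\mathcal{D})}(x)\right],

where each data set \mathcal{D} has N points and g^{(\mathcal{D})} is the hypothesis the learning algorithm produces when trained on \mathcal{D}.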

Quote:
Slide 4 says that bias measures "how well \mathcal{H} can approximate f". Does this mean "with a sufficiently large dataset and a perfect learning algorithm"?
Is the bias of a (hypothesis set, learning algorithm) combination a single value -- the asymptote of the learning curve? Or is there some notion of bias that is a property of a hypothesis set by itself? If the hypothesis set contains the target function, that does not mean the bias is zero, does it? The beginning of the lecture seems to imply otherwise, but if there is no restriction on the learning algorithm, what guarantees that the average function will in fact be close to the target function for large enough dataset size?
Or is it assumed that the learning algorithm always picks a hypothesis which minimizes E_{in}?
Good questions. What you are saying would hold if we were using the best approximation of f in {\cal H} as the vehicle for measuring the bias. We are not. We are using a "limited resource" version of it that is based on averaging hypotheses obtained by training on finite sets of data points. This version is often close to the best approximation, which is why we can take that liberty.
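
If it helps to see this averaging concretely, here is a minimal simulation sketch (an illustration only, not taken from the lecture materials; it assumes the sinusoidal target f(x) = \sin(\pi x) on [-1,1], data sets of N = 2 points, and least-squares line fits for {\cal H}). It estimates \bar{g} by averaging the hypotheses obtained from many independent data sets, then estimates the bias and variance on test points.

Code:

import numpy as np

# Minimal sketch of estimating g_bar(x) and the bias by simulation.
# Assumed setup (for illustration only): target f(x) = sin(pi*x) on [-1, 1],
# data sets of N = 2 points, hypothesis set H = lines a*x + b fit by least squares.

rng = np.random.default_rng(0)
f = lambda x: np.sin(np.pi * x)

N = 2                 # points per data set
num_datasets = 10000  # number of data sets D to average over
x_test = rng.uniform(-1, 1, 1000)   # points used to estimate E_x[...]

# g^(D) for each data set D, stored as (slope, intercept).
coeffs = np.empty((num_datasets, 2))
for d in range(num_datasets):
    x = rng.uniform(-1, 1, N)
    coeffs[d] = np.polyfit(x, f(x), 1)   # least-squares line through the N points

# Predictions of every g^(D) on the test points: shape (num_datasets, num_test).
preds = coeffs[:, [0]] * x_test + coeffs[:, [1]]

g_bar = preds.mean(axis=0)                 # average hypothesis g_bar(x)
bias = np.mean((g_bar - f(x_test)) ** 2)   # E_x[(g_bar(x) - f(x))^2]
var = np.mean((preds - g_bar) ** 2)        # E_x[E_D[(g^(D)(x) - g_bar(x))^2]]

print(f"bias ~ {bias:.3f}   variance ~ {var:.3f}")

Increasing num_datasets only reduces the noise in the estimate; the bias itself remains a property of {\cal H} together with the algorithm and N, as discussed above.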
__________________
Where everyone thinks alike, no one thinks very much