#1
The VC dimension is a single number that is a property of the hypothesis set.

But what is the "bias of a hypothesis set"? Bias seems to depend also on the dataset size and the learning algorithm, since it depends on the average hypothesis $\bar{g}(x)$. Slide 4 says that bias measures "how well $\mathcal{H}$ can approximate $f$".

Is the bias of a (hypothesis set, learning algorithm) combination a single value -- the asymptote of the learning curve? Or is there some notion of bias that is a property of a hypothesis set by itself? If the hypothesis set contains the target function, that does not mean the bias is zero, does it? The beginning of the lecture seems to imply otherwise, but if there is no restriction on the learning algorithm, what guarantees that the average function will in fact be close to the target function for large enough dataset size? Or is it assumed that the learning algorithm always picks a hypothesis that minimizes $E_{\text{in}}$?
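For reference, the quantities in question, as defined in the lecture (writing $g^{(\mathcal{D})}$ for the hypothesis the learning algorithm produces from a dataset $\mathcal{D}$ of size $N$):

$$\bar{g}(x) = \mathbb{E}_{\mathcal{D}}\!\left[g^{(\mathcal{D})}(x)\right], \qquad \text{bias} = \mathbb{E}_{x}\!\left[\left(\bar{g}(x) - f(x)\right)^2\right], \qquad \text{var} = \mathbb{E}_{x}\,\mathbb{E}_{\mathcal{D}}\!\left[\left(g^{(\mathcal{D})}(x) - \bar{g}(x)\right)^2\right]$$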
#2
__________________
Where everyone thinks alike, no one thinks very much
#3
In HW4 #4 the average hypothesis is measurably shifted from the hypothesis set member giving the lowest mean squared error. Probably because a two-point dataset is too small, i.e. this is not representative of realistic cases?
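For concreteness, here is a rough Monte Carlo sketch of that comparison, assuming the setup of that problem is $f(x) = \sin(\pi x)$ with $x$ uniform on $[-1, 1]$, two-point datasets, and hypotheses $h(x) = ax$ fit by least squares (the variable names and sample sizes below are illustrative, not from the homework solution):

```python
import numpy as np

rng = np.random.default_rng(0)
N_DATASETS = 200_000

# Draw many 2-point datasets from f(x) = sin(pi*x), x ~ U[-1, 1],
# and fit h(x) = a*x to each by least squares (slope through the origin).
x = rng.uniform(-1.0, 1.0, size=(N_DATASETS, 2))
y = np.sin(np.pi * x)
a = (x * y).sum(axis=1) / (x * x).sum(axis=1)

a_bar = a.mean()        # slope of the average hypothesis g_bar(x) = a_bar * x
a_star = 3.0 / np.pi    # slope minimizing E_x[(a*x - sin(pi*x))^2]: a* = E[x f(x)] / E[x^2]

# bias = E_x[(g_bar(x) - f(x))^2]; var = E[x^2] * Var_D(a), since g_D(x) - g_bar(x) = (a_D - a_bar) * x
x_test = rng.uniform(-1.0, 1.0, size=500_000)
bias = np.mean((a_bar * x_test - np.sin(np.pi * x_test)) ** 2)
var = np.mean(x_test ** 2) * a.var()

print(f"a_bar = {a_bar:.3f}, a* = {a_star:.3f}, bias = {bias:.3f}, var = {var:.3f}")
```

Comparing the estimated $\bar{a}$ to $a^* = 3/\pi$ shows the shift directly.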
#4
Indeed, the fewer the number of points, the more likely that the average hypothesis will differ from the best approximation. The difference tends to be small, though.
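A quick way to see this numerically, continuing the same $h(x) = ax$, $f(x) = \sin(\pi x)$ sketch as above (the particular dataset sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A_STAR = 3.0 / np.pi   # best single slope for approximating sin(pi*x) on [-1, 1]

for n_points in (2, 5, 20, 100):
    # 50,000 datasets of n_points points each; least-squares slope per dataset
    x = rng.uniform(-1.0, 1.0, size=(50_000, n_points))
    y = np.sin(np.pi * x)
    a = (x * y).sum(axis=1) / (x * x).sum(axis=1)
    print(f"N = {n_points:3d}: a_bar = {a.mean():.3f}   (best approximation a* = {A_STAR:.3f})")
```

As N grows, each dataset's least-squares fit approaches the best approximation, so the average hypothesis does as well.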
__________________
Where everyone thinks alike, no one thinks very much
#5
Well, it is also that the two-point data set is small relative to the two-parameter hypotheses. If you have 100 points and 99th-degree polynomials, you would also have large variance. I would guess that minimizing bias plus variance happens when the number of fit parameters is near the square root of the number of points per data set.
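One way to probe that guess empirically is to sweep the polynomial degree for a fixed number of points per dataset and estimate bias and variance for each degree. A rough sketch (the target $\sin(\pi x)$, the noise-free setting, and the particular N are my own illustrative choices, not from the course):

```python
import numpy as np

rng = np.random.default_rng(2)

N = 20            # points per dataset (illustrative)
N_DATASETS = 2000
x_test = np.linspace(-1.0, 1.0, 1001)
f_test = np.sin(np.pi * x_test)

for degree in range(10):
    preds = np.empty((N_DATASETS, x_test.size))
    for i in range(N_DATASETS):
        x = rng.uniform(-1.0, 1.0, size=N)
        y = np.sin(np.pi * x)                 # noise-free target, as in the examples above
        coeffs = np.polyfit(x, y, degree)     # least-squares polynomial fit
        preds[i] = np.polyval(coeffs, x_test)
    g_bar = preds.mean(axis=0)                # average hypothesis on the test grid
    bias = np.mean((g_bar - f_test) ** 2)
    var = np.mean(preds.var(axis=0))
    print(f"degree {degree}: bias = {bias:.4f}  var = {var:.4f}  bias+var = {bias + var:.4f}")
```

Where the minimum of bias + variance lands will depend on the target and on noise, so this is a way to test the square-root guess rather than a confirmation of it.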
#6
On the other hand, I'm not sure how to prove that it won't be far from the best approximation.
#7
__________________
Where everyone thinks alike, no one thinks very much
Tags: bias, lecture 8