LFD Book Forum

LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Chapter 4 - Overfitting (http://book.caltech.edu/bookforum/forumdisplay.php?f=111)
-   -   Variance of Eval (http://book.caltech.edu/bookforum/showthread.php?t=2377)

axelrv 10-20-2012 10:32 AM

Variance of Eval
 
I'm confused about how to simplify expressions involving Var\left[E_{val}(g^-)\right].

I know that Var\left[E_{val}(g^-)\right]=E\left[\left(E_{val}(g^-)-E\left[E_{val}(g^-)\right]\right)^2\right]=E\left[\left(E_{val}(g^-)-E_{out}(g^-)\right)^2\right], and that for classification E_{out}(g^-)=P[g^-(x)\neq y]. I'm not sure how to bring K into any of these expressions.

Any help would be greatly appreciated.

magdon 10-21-2012 07:51 AM

Re: Variance of Eval
 
Here are two useful facts from probability:

The variance of a sum of independent terms is the sum of the variances:
Var\left(\sum_{k=1}^K X_k\right)=\sum_{k=1}^KVar(X_k)

When you scale a random variable by a constant, its variance scales quadratically:
Var(aX)=a^2Var(X)

[Hint: so, if you scale something by \frac{1}{K} its variance scales by \frac{1}{K^2}; the validation error is the average of K independent things (What things? Why are they independent?)]
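To see the two facts at work together, here is a small simulation sketch (not from the thread; the function name, p, and K are made-up illustration values). Each point error on the validation set is a Bernoulli variable with mean E_{out}(g^-), and E_{val} is the average of K independent such errors, so the two facts give Var(E_{val}) = \frac{1}{K^2}\cdot K\sigma^2 = \frac{\sigma^2}{K}:

```python
import random

def simulate_var_of_eval(p=0.2, K=50, trials=100_000, seed=0):
    """Estimate Var(E_val) when E_val is the average of K independent
    Bernoulli(p) point errors. (p, K are illustrative values.)"""
    rng = random.Random(seed)
    evals = []
    for _ in range(trials):
        # K independent point errors: 1 if g^-(x_k) != y_k, else 0
        errors = [1 if rng.random() < p else 0 for _ in range(K)]
        evals.append(sum(errors) / K)  # E_val = average point error
    mean = sum(evals) / trials
    return sum((e - mean) ** 2 for e in evals) / trials

p, K = 0.2, 50
empirical = simulate_var_of_eval(p, K)
theoretical = p * (1 - p) / K  # sigma^2 / K, with sigma^2 = p(1-p) for Bernoulli(p)
print(empirical, theoretical)  # the two should be close
```

The simulation only illustrates the algebra: the 1/K factor comes entirely from averaging K independent terms, which is exactly what the hint asks you to identify.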


The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.