06-19-2015, 12:38 AM
yaser
Re: Bias-Variance Analysis

Originally Posted by prithagupta.nsit

I have a few questions about the following model:
Suppose instances x are distributed uniformly in X = [0, 10] and outputs are given by
y = f(x) + e = x + e,
where e is an error term with a standard normal distribution.

We now analyse the decomposition of the generalization error into bias + variance + noise by generating random samples of size N = 10, fitting the models g_i, and determining the predictions and prediction errors for x = 0, 1/100, ..., 10.

1. When calculating gbar, the bias, and the variance, wouldn't it be wrong not to include the error term when generating the data sets? If not, why?

2. How can we calculate the noise separately for a polynomial hypothesis?

3. My understanding of how to calculate the predictions and prediction errors:
The prediction would be the value of gbar at x, and the prediction error would be the difference between that value and f(x). Am I correct?

Looking forward to a reply
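As an illustration of the experiment described above, here is a minimal sketch of the simulation. Note that the quoted post does not specify the learning model, so the choice of a linear hypothesis set h(x) = ax + b (fit by least squares), the number of datasets, and the random seed are all assumptions made for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 10             # sample size per dataset (from the quoted post)
n_datasets = 2000  # assumed number of datasets used to estimate gbar
x_grid = np.linspace(0, 10, 1001)  # x = 0, 1/100, ..., 10

def target(x):
    return x  # f(x) = x; the noise e ~ N(0, 1) is added separately

# Fit an (assumed) linear hypothesis h(x) = a*x + b on each noisy dataset
preds = np.empty((n_datasets, x_grid.size))
for i in range(n_datasets):
    x = rng.uniform(0, 10, N)
    y = target(x) + rng.standard_normal(N)  # noisy training labels
    a, b = np.polyfit(x, y, 1)
    preds[i] = a * x_grid + b

gbar = preds.mean(axis=0)                      # average hypothesis gbar(x)
bias2 = ((gbar - target(x_grid)) ** 2).mean()  # bias^2 vs the noiseless f
variance = preds.var(axis=0).mean()            # spread of the g_i around gbar
noise = 1.0                                    # Var(e) = 1 for a standard normal

print(f"bias^2 ~ {bias2:.3f}, variance ~ {variance:.3f}, noise = {noise}")
```

With this setup the bias and variance are measured against the noiseless f(x), while the noise term is simply Var(e) = 1 by construction; the noise enters the simulation only through the training labels y.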
Would you clarify some points, as I didn't quite understand the questions? First, I take it that what you refer to as the model is the target function (a target distribution, in this noisy case). If so, what is the learning model (hypothesis set) you are using? Perhaps you can rephrase your three questions after you define the model.
Where everyone thinks alike, no one thinks very much