#7  06-14-2015, 02:03 PM
prithagupta.nsit
Junior Member
Join Date: Jun 2015
Posts: 7
Re: Bias-Variance Analysis

Hello,

I have a few questions about the following model:
Suppose the instances x are distributed uniformly on X = [0, 10] and the outputs are given by
y = f(x) + e = x + e,
where e is an error term with a standard normal distribution.

Now I want to analyse the decomposition of the generalization error into bias + variance + noise by generating random data sets of size N = 10, fitting the models g_i, and determining the predictions and prediction errors for x = 0, 1/100, ..., 10.
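For reference, here is roughly how I am setting the experiment up in code. It is only a sketch of my understanding using numpy; the first-order polynomial fit via numpy.polyfit and the 1000 data sets used to estimate gbar are my own assumptions, not part of the problem statement.

[CODE]
import numpy as np

rng = np.random.default_rng(0)
N = 10                                   # sample size of each data set
num_datasets = 1000                      # my choice for estimating gbar
x_test = np.linspace(0, 10, 1001)        # test points x = 0, 1/100, ..., 10
f_test = x_test                          # noiseless target f(x) = x

# Fit one hypothesis g_i per data set and store its predictions on x_test.
predictions = np.empty((num_datasets, x_test.size))
for i in range(num_datasets):
    x = rng.uniform(0, 10, N)            # instances uniform on [0, 10]
    y = x + rng.standard_normal(N)       # y = f(x) + e with e ~ N(0, 1)
    coeffs = np.polyfit(x, y, deg=1)     # first-order polynomial fit (my assumption)
    predictions[i] = np.polyval(coeffs, x_test)

gbar = predictions.mean(axis=0)                   # average hypothesis gbar(x)
pred_error = gbar - f_test                        # prediction error vs f(x)
bias = np.mean((gbar - f_test) ** 2)              # (gbar - f)^2 averaged over x
variance = np.mean(np.var(predictions, axis=0))   # var of g_i(x), averaged over x
noise = 1.0                                       # Var(e) = 1 for a standard normal

print(f"bias = {bias:.3f}, variance = {variance:.3f}, noise = {noise:.3f}")
[/CODE]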

1. When calculating gbar, bias, and variance, wouldn't it be wrong not to include the error term e when generating the data sets? If not, why?

2. How can we calculate noise separately for the polynomial hypothesis?

3. My understanding of how to calculate the predictions and prediction errors:
The prediction would be the value given by gbar at x, and the prediction error would be the difference between that value and the value generated by f(x). Am I correct?
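If it helps to make that concrete, here is a tiny self-contained illustration of that reading; the gbar values below are made up for the example only, not computed from the experiment:

[CODE]
import numpy as np

x_test = np.linspace(0, 10, 1001)          # x = 0, 1/100, ..., 10
f_test = x_test                            # noiseless target f(x) = x
gbar_test = 0.1 + 0.98 * x_test            # made-up stand-in for gbar(x)

prediction = gbar_test                     # prediction: value of gbar at x
prediction_error = prediction - f_test     # error: difference from f(x)
print(prediction_error[:3])
[/CODE]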

Looking forward to a reply.