LFD Book Forum Why would variance be non-zero?

#1
08-01-2012, 05:41 PM
 samirbajaj Member Join Date: Jul 2012 Location: Silicon Valley Posts: 48
Why would variance be non-zero?

In question 6 on the homework, we are asked to compute the variance across all data sets.

If we are sampling x uniformly from the interval [-1, 1] both for the calculation of g_bar and for each data set (giving the hypotheses g^(D)), why would the variance be anything but a very small quantity? In the general case, when the data sets are not drawn from a uniform distribution, a non-zero variance makes sense; but if there is sufficient overlap among the data sets, it seems intuitive that the variance should be close to zero.

I ask this because my simulation results support the above (potentially flawed) theory.
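For concreteness, here is a minimal sketch of the kind of simulation I mean. It assumes a hypothetical setup similar to the book's example (this is my illustration, not necessarily the homework's exact setup): target f(x) = sin(pi*x), data sets of N = 2 points drawn uniformly from [-1, 1], and the hypothesis class h(x) = a*x fit by least squares. Each data set yields a different slope a, so g^(D) varies around g_bar even though every data set comes from the same uniform distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n_datasets = 10000

# Draw many data sets of N = 2 points each, x ~ Uniform[-1, 1].
x = rng.uniform(-1.0, 1.0, size=(n_datasets, 2))
y = np.sin(np.pi * x)

# Least-squares fit of h(x) = a*x on each data set:
# a = (x1*y1 + x2*y2) / (x1^2 + x2^2), one slope per data set.
a = (x * y).sum(axis=1) / (x ** 2).sum(axis=1)

a_bar = a.mean()  # g_bar(x) = a_bar * x

# variance = E_x[ E_D[(g^(D)(x) - g_bar(x))^2] ]
#          = E_D[(a - a_bar)^2] * E_x[x^2],  with E_x[x^2] = 1/3 on [-1, 1]
variance = ((a - a_bar) ** 2).mean() / 3.0
print(variance)  # non-zero: the slopes differ noticeably across data sets
```

The point of the sketch: with only two points per data set, the fitted slope fluctuates substantially from one data set to the next, so the variance term is not small, even though all data sets are drawn from the same uniform distribution. The "overlap" intuition would only apply if each data set were large enough to pin down essentially the same g.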

Please answer the question in general terms -- I don't care about the homework answer -- I was merely using that as an example.

Thanks for any input.

-Samir
