samirbajaj, 08-01-2012, 05:41 PM
Why would variance be non-zero?

In question 6 on the homework, we are asked to compute the variance across all data sets.
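
(By "the variance" I mean the quantity from the bias-variance decomposition covered in the lectures: var(x) = E_D[ (g^(D)(x) - g_bar(x))^2 ], averaged over x.)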

If we are sampling uniformly from the interval [-1, 1] both for the calculation of g_bar and for each individual data set (each g_d), why would the variance be anything but a very small quantity? In the general case, when the data sets are not all drawn from the same uniform distribution, a non-zero variance makes sense; but if there is sufficient overlap among the data sets, it seems intuitive that the variance should be close to zero.

I ask this because my simulation results support the above (potentially flawed) theory.
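
For reference, here is a rough sketch of the kind of simulation I mean. The specific target f(x) = sin(pi*x), the hypothesis set h(x) = a*x, and the two-point data sets are just placeholder choices for illustration, not necessarily what the homework uses:

[code]
import numpy as np

rng = np.random.default_rng(0)

# Placeholder setup (for illustration only, not necessarily the homework's):
# target f(x) = sin(pi x) on [-1, 1], hypothesis set h(x) = a x, least-squares fit.
def target(x):
    return np.sin(np.pi * x)

def fit_slope(x, y):
    # Least-squares slope for a line through the origin: a = sum(x*y) / sum(x*x).
    return np.dot(x, y) / np.dot(x, x)

n_datasets = 2000   # number of independent data sets D
n_points = 2        # points per data set, drawn uniformly from [-1, 1]

# Learn g^(D) on each data set; here g^(D) is fully described by its slope a_D.
slopes = np.empty(n_datasets)
for d in range(n_datasets):
    x = rng.uniform(-1.0, 1.0, n_points)
    slopes[d] = fit_slope(x, target(x))

a_bar = slopes.mean()   # g_bar(x) = a_bar * x

# var(x) = E_D[(g^(D)(x) - g_bar(x))^2], then average over x ~ U[-1, 1].
x_test = rng.uniform(-1.0, 1.0, 2000)
g_D = slopes[:, None] * x_test[None, :]      # g^(D)(x) for every (D, x) pair
var_x = ((g_D - a_bar * x_test) ** 2).mean(axis=0)
print("estimated variance:", var_x.mean())
[/code]

The point of the outer loop is that each data set produces its own g^(D), and the variance measures how much those g^(D)'s scatter around g_bar, even though every data set is drawn from the same uniform distribution on [-1, 1].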

Please answer the question in general terms -- I don't care about the homework answer -- I was merely using that as an example.

Thanks for any input.

-Samir