04-28-2013, 05:19 PM
mvellon (Junior Member)
*ANSWER* Stuck on #4

To calculate \bar{g}, I generated several two-point datasets with y = sin(pi*x). For each dataset I fit a hypothesis of the form h(x) = ax, choosing the slope a that minimizes the total squared error over the two points. I did this by writing the error as (ax_1 - y_1)^2 + (ax_2 - y_2)^2, differentiating with respect to a, setting the result to zero, and solving, which gives a = (x_1 y_1 + x_2 y_2)/(x_1^2 + x_2^2). If I then average my per-dataset slopes (the a's), I get 1.42.
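For reference, here is a minimal sketch of that procedure in Python with NumPy; the number of datasets, the RNG seed, and the uniform sampling of x on [-1, 1] are my own assumptions, not details from the post. The printed average should land near the 1.42 reported above.

[code]
import numpy as np

rng = np.random.default_rng(0)
n_datasets = 100_000

a_values = np.empty(n_datasets)
for i in range(n_datasets):
    # Draw a two-point dataset: x uniform on [-1, 1], target y = sin(pi * x).
    x = rng.uniform(-1.0, 1.0, size=2)
    y = np.sin(np.pi * x)
    # Least-squares slope through the origin for h(x) = a*x:
    # minimizing (a*x1 - y1)^2 + (a*x2 - y2)^2 gives
    # a = (x1*y1 + x2*y2) / (x1^2 + x2^2)
    a_values[i] = (x @ y) / (x @ x)

a_bar = a_values.mean()
print(f"average slope a_bar ~ {a_bar:.2f}")
[/code]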

This seems wrong, not only because it is not one of the available choices, but also because it does not yield a smaller bias than, for example, 0.79.

I've seen suggestions to use linear regression to compute the a's, but I don't think that's where I'm going wrong (and in any case I'm not sure how to do a linear regression without an intercept term).
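For what it's worth, regression without an intercept is just ordinary least squares with a single-column design matrix (no constant column), and on two points it reduces to the same closed form as above. A sketch, assuming NumPy's lstsq; the function name and sample points are mine:

[code]
import numpy as np

def slope_no_intercept(x, y):
    """Fit y ~ a*x with no intercept via least squares on a one-column design matrix."""
    X = x.reshape(-1, 1)                      # single feature column, no constant term
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a[0]                               # equals (x @ y) / (x @ x)

x = np.array([0.2, -0.7])
y = np.sin(np.pi * x)
print(slope_no_intercept(x, y))
[/code]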