LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 4 (http://book.caltech.edu/bookforum/forumdisplay.php?f=133)
-   -   Q4) h(x) = ax (http://book.caltech.edu/bookforum/showthread.php?t=959)

 itooam 08-07-2012 03:57 AM

Q4) h(x) = ax

This question is similar to the one in the lectures, i.e.,

in the lecture H1 equals

h(x) = ax + b

Is this question different from the lecture in that we shouldn't add "b" (i.e., x0, the bias/intercept) when applying? Or should I treat it the same?

My confusion is because in many papers etc. a bias/intercept is assumed even if not specified, i.e., h(x) = ax could be considered the same as h(x) = ax + b.

 yaser 08-07-2012 04:24 AM

Re: Q4) h(x) = ax

Quote:
 Originally Posted by itooam (Post 3857) This question is similar to that in the lectures i.e., in the lecture H1 equals h(x) = ax + b Is this question different to the lecture in the respect we shouldn't add "b" (i.e., X0 the bias/intercept) when applying? Or should I treat the same? My confusion is because in many papers etc a bias/intercept is assumed even if not specified i.e., h(x) = ax could be considered the same as h(x) = ax + b
There is no bias/intercept in this problem, only the slope (one parameter, which is a).

 itooam 08-07-2012 04:36 AM

Re: Q4) h(x) = ax

Thanks for the confirmation, much appreciated :)

 geekoftheweek 01-31-2013 10:16 AM

Re: Q4) h(x) = ax

Is there a best way to minimize the mean-squared error? I am doing gradient descent with a very low learning rate (0.00001) and my solution is diverging, not converging. Is it not feasible to do gradient descent with two points when approximating a sine?
Thanks
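For what it's worth, with only two points the in-sample error is a one-dimensional quadratic in a, so gradient descent converges for any learning rate below 2/(x1^2 + x2^2), and there is also a closed form to check against. A minimal sketch (the target sin(pi*x) and the sample points are my own assumptions, not from the problem statement):

```python
import math

def fit_slope_gd(x1, y1, x2, y2, eta=0.1, iters=1000):
    """Minimize the MSE of h(x) = a*x on two points by gradient descent."""
    a = 0.0
    for _ in range(iters):
        # gradient of 0.5*[(a*x1 - y1)^2 + (a*x2 - y2)^2] with respect to a
        grad = (a * x1 - y1) * x1 + (a * x2 - y2) * x2
        a -= eta * grad
    return a

def fit_slope_exact(x1, y1, x2, y2):
    """Closed-form least-squares slope for h(x) = a*x through two points."""
    return (x1 * y1 + x2 * y2) / (x1 ** 2 + x2 ** 2)

# example pair drawn from the target sin(pi*x)
x1, x2 = 0.3, -0.7
y1, y2 = math.sin(math.pi * x1), math.sin(math.pi * x2)
a_gd = fit_slope_gd(x1, y1, x2, y2)
a_exact = fit_slope_exact(x1, y1, x2, y2)
print(a_gd, a_exact)  # the two should agree to several decimals
```

A divergence at a tiny learning rate usually points to a sign error in the gradient update rather than the rate itself, since the quadratic here only diverges when eta exceeds 2/(x1^2 + x2^2).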

 geekoftheweek 01-31-2013 11:09 AM

Re: Q4) h(x) = ax

Never mind, I got my solution to converge, though I do not trust my answer. Oh well.

 sanbt 01-31-2013 03:34 PM

Re: Q4) h(x) = ax

Quote:
 Originally Posted by geekoftheweek (Post 9088) Never mind, I got my solution to converge, though I do not trust my answer. Oh well.
You can use linear regression to calculate each hypothesis
(since linear regression is basically an analytical formula for minimizing mean squared error).

Also, you can confirm whether your g_bar from the simulation makes sense by calculating it directly (take the expectation of the hypothesis over (x1, x2) in [-1,1] x [-1,1]). This involves two integrals, but you can plug the expression into Wolfram Alpha or Mathematica.
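As a sketch of that direct calculation (assuming the target is f(x) = sin(pi*x) and the per-dataset slope comes from the one-parameter least-squares formula a = (x1*y1 + x2*y2)/(x1^2 + x2^2)), the double integral can also be approximated numerically instead of symbolically:

```python
import math

def best_slope(x1, x2):
    """Least-squares slope of h(x) = a*x through two points on sin(pi*x)."""
    y1, y2 = math.sin(math.pi * x1), math.sin(math.pi * x2)
    return (x1 * y1 + x2 * y2) / (x1 ** 2 + x2 ** 2)

# E[a] over (x1, x2) uniform on [-1,1] x [-1,1], midpoint rule;
# midpoints never hit (0, 0), so the ratio is always well defined
n = 400
h = 2.0 / n
total = 0.0
for i in range(n):
    x1 = -1 + (i + 0.5) * h
    for j in range(n):
        x2 = -1 + (j + 0.5) * h
        total += best_slope(x1, x2)
a_bar = total / (n * n)
print(round(a_bar, 2))
```

The printed value should match the g_bar you get from averaging many simulated two-point fits.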

 melipone 02-01-2013 06:49 AM

Re: Q4) h(x) = ax

I thought it would simply be (y1/x1 + y2/x2)/2 to find an a that minimizes the mean squared error on two points, no?
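For what it's worth, averaging the two slopes is generally not the same as the least-squares minimizer: setting the derivative of (a*x1 - y1)^2 + (a*x2 - y2)^2 to zero gives a = (x1*y1 + x2*y2)/(x1^2 + x2^2). A quick check on made-up numbers:

```python
# Compare the slope-average (y1/x1 + y2/x2)/2 with the least-squares
# minimizer of (a*x1 - y1)^2 + (a*x2 - y2)^2 on one example pair.
x1, y1 = 0.5, 1.0
x2, y2 = 0.1, 0.3
slope_avg = (y1 / x1 + y2 / x2) / 2
least_sq = (x1 * y1 + x2 * y2) / (x1 ** 2 + x2 ** 2)
print(slope_avg, least_sq)  # the two values differ
```

The slope average weights both points equally, while least squares weights each point by x_i^2, so the two only coincide in special cases such as |x1| = |x2|.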

 Anne Paulson 02-01-2013 10:36 AM

Re: Q4) h(x) = ax

So, in this procedure we:

Pick two points;
Find the best slope for those two points, the one that minimizes the squared error for those two points;
Do this N times and average all the a's

Rather than:

Pick two points;
Calculate the squared error for those two points as a function of a;
Do this N times, then find the a that minimizes the sum of all of the squared errors, as we do with linear regression

Are we doing the first thing here or the second thing? Either way there's a simple analytic solution, but I'm not sure which procedure we're doing.

 yaser 02-01-2013 11:19 AM

Re: Q4) h(x) = ax

Quote:
 Originally Posted by Anne Paulson (Post 9109) So, in this procedure we: Pick two points; Find the best slope for those two points, the one that minimizes the squared error for those two points; Do this N times and average all the a's Rather than: Pick two points; Calculate the squared error for those two points as a function of a; Do this N times, then find the a that minimizes the sum of all of the squared errors, as we do with linear regression Are we doing the first thing here or the second thing? Either way there's a simple analytic solution, but I'm not sure which procedure we're doing.
The first method estimates the a of the average hypothesis (which takes into consideration only two points at a time). The second method estimates the a of the best approximation of the target function (which takes into consideration all the points in the input space at once).
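A sketch of how differently the two methods can come out, assuming the target f(x) = sin(pi*x) on [-1, 1] (the 3/pi closed form for the second method comes from minimizing the expected squared error E[(a*x - sin(pi*x))^2] over uniform x, giving a* = E[x*sin(pi*x)] / E[x^2]):

```python
import math
import random

random.seed(1)
runs = 200000

# Method 1: best slope per two-point dataset, then average over datasets
def best_slope(x1, x2):
    y1, y2 = math.sin(math.pi * x1), math.sin(math.pi * x2)
    return (x1 * y1 + x2 * y2) / (x1 ** 2 + x2 ** 2)

a_bar = sum(best_slope(random.uniform(-1, 1), random.uniform(-1, 1))
            for _ in range(runs)) / runs

# Method 2: single slope minimizing expected squared error over all x;
# closed form a* = E[x*sin(pi*x)] / E[x^2] = (2/pi) / (2/3) = 3/pi
a_star = 3 / math.pi

print(round(a_bar, 2), round(a_star, 2))
```

The two numbers are not close, which illustrates why it matters which procedure the problem is asking for.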

 Anne Paulson 02-01-2013 11:28 AM

Re: Q4) h(x) = ax

OK, and then the average value of a *is* the expected value of a.
