LFD Book Forum Q4) h(x) = ax
#11
02-01-2013, 12:33 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: Q4) h(x) = ax

Quote:
 Originally Posted by Anne Paulson OK, and then the average value of g^{(D)} *is* the expected value of g^{(D)}.
To be technically accurate, the average value (which is also the expected value) of g^{(D)} is \bar{g}, which is shorthand for saying that the average value of g^{(D)}(x) is \bar{g}(x) for all x.
__________________
Where everyone thinks alike, no one thinks very much
#12
02-02-2013, 10:17 PM
 Axonymous Junior Member Join Date: May 2012 Posts: 7
Re: Q4) h(x) = ax

I calculated what I think is the best approximation by taking the derivative, with respect to a, of the integral of the squared difference between the sine function and the line y = ax, and setting it to zero. When I compare this to the result of my simulation, there's a difference of about 30% between the two values of a.

I realize that it's reasonable to expect that \bar{g} won't be the best possible fit (see minute 43 in lecture 8, comparing 0.20 to 0.21). But is anyone else getting a result that differs by so much?

#13
02-03-2013, 02:50 AM
 gah44 Invited Guest Join Date: Jul 2012 Location: Seattle, WA Posts: 153
Re: Q4) h(x) = ax

Quote:
 Originally Posted by Axonymous I calculated what I think is the best approximation by taking the derivative, with respect to a, of the integral of the squared difference between the sine function and the line y = ax, and setting it to zero. When I compare this to the result of my simulation, there's a difference of about 30% between the two values of a.
Which problem is that for?

As in the lecture and the book, you find the best fit for two points (least squares), and then average over all sets of two points (but not two of the same point). Then a in this case, or (a, b) in the book's case, is the average over all such pairs of points.

I might believe that \bar{a} is 30% different from the one you mention.

You could also minimize the integral of the square of sin(πx) − ax.
#14
02-03-2013, 02:53 PM
 Axonymous Junior Member Join Date: May 2012 Posts: 7
Re: Q4) h(x) = ax

I wanted confirmation of the result I got using the method we are supposed to implement. So I derived the slope of the "best" line, shown on slide 11 of lecture 8. (That slide also applies to our case because the line goes through the origin.) I did this by minimizing the area shown in yellow on that slide. (You can actually see from the slide that the slope is close to 1.)

I was surprised that the answer I got for question 4 is so different from this "perfect" approximation line that was found by minimizing the integral. It stands to reason that it should vary a little, but there is quite a difference between the two values.
#15
02-03-2013, 05:20 PM
 sanbt Member Join Date: Jan 2013 Posts: 35
Re: Q4) h(x) = ax

Quote:
 Originally Posted by Axonymous I wanted confirmation of the result that I got using the method we are supposed to implement. So I derived the slope of the "best" line, shown to us in slide 11 of lecture 8. (Which also applies in our case because it goes through the origin.) I did this by minimizing the area in yellow on that slide. (You can actually see that slope is close to 1 from the slide.) I was surprised that the answer I got for question 4 is so different from this "perfect" approximation line that was found by minimizing the integral. It stands to reason that it should vary a little, but there is quite a difference between the two values.
So the slope of the best line is just the slope of the line passing through the two points you pick each time (which implies that Ein = 0). But then you need a 2D integral to average that expression over [-1, 1] x [-1, 1]. The result should be close to the simulation.
#16
02-04-2013, 10:21 AM
 Axonymous Junior Member Join Date: May 2012 Posts: 7
Re: Q4) h(x) = ax

Go to wolframalpha.com and ask for:

"derivative of integral of (sin(pi*x)-(a*x))^2 from -1 to 1 with respect to a"

Set the result equal to 0 and solve for a. This gives you the line that is the "best" approximation. (I believe.)

It is not the answer to question 4. It is the result that our simulation method hopefully gets close to.

What's interesting to me is how far from this value for a our simulation result is.
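For reference, the minimization described above can also be checked numerically. A minimal sketch (my own illustration, not from the original post), using a plain Riemann sum in place of WolframAlpha:

```python
import numpy as np

# Setting the derivative with respect to a of
#   integral((sin(pi*x) - a*x)^2, x = -1..1)
# to zero gives a = integral(x*sin(pi*x)) / integral(x^2) = (2/pi)/(2/3) = 3/pi.
a_closed = 3 / np.pi

# Numerical check of the same minimization on a fine grid
# (the common grid spacing cancels in the ratio of sums).
x = np.linspace(-1.0, 1.0, 200001)
a_num = np.sum(x * np.sin(np.pi * x)) / np.sum(x * x)

print(a_closed, a_num)  # both ~0.95
```

Both values land near 0.95, which is the "best" single-line slope being discussed here.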
#17
02-04-2013, 04:06 PM
 gah44 Invited Guest Join Date: Jul 2012 Location: Seattle, WA Posts: 153
Re: Q4) h(x) = ax

Quote:
 Originally Posted by Axonymous Go to wolframalpha.com and ask for: "derivative of integral of (sin(pi*x)-(a*x))^2 from -1 to 1 with respect to a" (snip) What's interesting to me is how far from this value for a our simulation result is.
Don't forget that 1-(-1) is 2.
#18
02-04-2013, 08:25 PM
 sanbt Member Join Date: Jan 2013 Posts: 35
Re: Q4) h(x) = ax

I apologize for the mistake in my previous post. The slope of the best line is the expression for a that minimizes Ein (which is not 0 in this case). Then you can do a 2D integral to find the expectation of that expression.
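A minimal simulation sketch of that procedure (my own illustration; the closed-form slope a = (x1*y1 + x2*y2)/(x1^2 + x2^2) comes from minimizing the two-point squared error for a line through the origin):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 200_000

# Each data set: two points x1, x2 drawn uniformly from [-1, 1],
# with targets y = sin(pi * x).
x = rng.uniform(-1.0, 1.0, size=(trials, 2))
y = np.sin(np.pi * x)

# Least-squares slope through the origin for each two-point data set:
# minimizing (y1 - a*x1)^2 + (y2 - a*x2)^2 over a gives
#   a = (x1*y1 + x2*y2) / (x1^2 + x2^2).
a = (x * y).sum(axis=1) / (x ** 2).sum(axis=1)

a_bar = a.mean()  # Monte Carlo estimate of the average slope \bar{a}
print(a_bar)
```

The resulting average slope comes out noticeably above the single best-fit slope 3/π ≈ 0.95, which is exactly the gap being discussed in this thread.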
#19
04-28-2013, 02:58 AM
 Moobb Junior Member Join Date: Apr 2013 Posts: 9
Re: Q4) h(x) = ax

I am lost on this... The procedure I follow is as described above, but the answer doesn't seem right, and from what I've read elsewhere it's a common mistake on this question. Any hints on what I might be missing? I am picking two points, getting the best hypothesis by minimising the squared error, repeating this a number of times, and taking the answer to be the average value over these runs.
#20
04-28-2013, 03:58 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: Q4) h(x) = ax

Quote:
 Originally Posted by Moobb I am picking two points, getting the best hypothesis by minimising the squared error, repeating this a number of times and assuming the answer is the average value within these runs..
You are correct. This is the right procedure, where the answer you mention is \bar{a} and the values you are averaging are the slopes a obtained in the individual runs.
__________________
Where everyone thinks alike, no one thinks very much
