
LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 6 (http://book.caltech.edu/bookforum/forumdisplay.php?f=135)
-   -   HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features! (http://book.caltech.edu/bookforum/showthread.php?t=4286)

catherine 05-14-2013 01:53 AM

HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
hi everybody, I just lost 4 (!) points because my printer couldn't properly render the absolute-value bars (| |) surrounding the last 2 elements of the transformed input vector (|x1 - x2| and |x1 + x2|). So make sure to double-check any formulas against the online version of the homework instructions. Some of the less usual characters don't print well...
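
For anyone who wants a sanity check, here's a minimal sketch of the transform in Python/numpy. The abs() on the last two features is the whole point of this thread; the ordering of the earlier terms is my reading of the online instructions, so do verify it against your own copy of the homework:

Code:

import numpy as np

def transform(X):
    # X is an (N, 2) array of (x1, x2) points; returns the (N, 8)
    # transformed design matrix.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([
        np.ones(len(X)),  # constant feature
        x1,
        x2,
        x1 ** 2,
        x2 ** 2,
        x1 * x2,
        np.abs(x1 - x2),  # |x1 - x2|, NOT (x1 - x2)!
        np.abs(x1 + x2),  # |x1 + x2|, NOT (x1 + x2)!
    ])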

PaulN 05-14-2013 03:01 AM

Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
Oh no!

Hindsight is 20/20, as they say, but the "x1 + x2" and "x1 - x2" components (that is, without taking the absolute value) would seem a little redundant. As linear combinations of x1 and x2, they wouldn't contribute to the "non-linearity" of the transformation (since x1 and x2 are already in the given transformation).

Anyway, sorry about those points...

catherine 05-14-2013 05:11 AM

Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
I have to admit that due to time constraints and my current workload - I'm taking a number of courses in parallel - I mostly focus on getting the algorithm right for this particular type of exercise and don't spend enough time scrutinizing the problem statement. Though I'm not sure I understand why "x1 + x2" and "x1 - x2" wouldn't contribute anything to the model, as they have separate weights assigned to them that get tuned independently of the weights assigned to the individual x1 and x2 features.

jlaurentum 05-14-2013 06:22 AM

Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
Hello Catherine:

I share your pain. Last week I lost 2 points because I used 0.1 for the value of eta when it should have been 0.01 (I missed that extra zero). In retrospect, the important thing is that we learn. I for one learned about the importance of choosing a proper learning rate (eta). You perhaps learned that including the terms without the absolute-value bars gives you redundant variables, which in turn would make the matrix you need to invert singular... (?). Anyway, sorry about your lost points.
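
For the curious, here's a quick numerical illustration of that last point (Python/numpy, synthetic data; the column ordering is just for this demo). Without the absolute values, the two extra columns are linear combinations of the x1 and x2 columns, so the matrix Z^T Z that the linear regression pseudo-inverse needs to invert becomes singular:

Code:

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
x1, x2 = X[:, 0], X[:, 1]

# Without abs(): (x1 + x2) and (x1 - x2) lie in the span of the
# x1 and x2 columns, so Z has rank 3 instead of 5.
Z = np.column_stack([np.ones(100), x1, x2, x1 + x2, x1 - x2])
print(np.linalg.matrix_rank(Z))      # 3 -> Z.T @ Z is singular
print(np.linalg.cond(Z.T @ Z))       # astronomically large

# With abs(), the last two columns are genuinely new features:
Z_abs = np.column_stack([np.ones(100), x1, x2,
                         np.abs(x1 + x2), np.abs(x1 - x2)])
print(np.linalg.matrix_rank(Z_abs))  # 5 -> Z_abs.T @ Z_abs invertible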

Michael Reach 05-14-2013 07:11 AM

Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
In case it's any comfort, I'll tell you a story from when I was actually attending Caltech. Must have been around 1979. Richard Feynman, may he rest in peace, decided to give a class to undergraduates, which he hadn't done for a while. Nuclear physics. Of course, we were all tremendously excited. As it happened, it didn't work out for me at all, for two reasons:

1) He just didn't care how many calculations he had to do; he liked calculations. He'd write these huge matrices of equations for the nuclear energy or something, and multiply and invert them on the board. For weeks. (This was before Mathematica, of course.) You'd come into the class the next time, and there were all those matrices still waiting on the board. Discouraging - at least for me.

2) The other thing was grades. We were still young enough to think that our grades meant something, and nervous about them. Feynman had a very clear attitude towards grades: anyone who wants to study physics should care about physics and not in the least about grades. He refused to discuss them in any way or form. I just couldn't deal with it and dropped the class. My loss, of course.
In the end, he gave every single person who stuck with the course a Pass, regardless of how well they did. A couple of my friends got notes on their finals like, You probably shouldn't take this next term.

catherine 05-14-2013 02:39 PM

Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
hi guys, thanks for all your kind words :) I wasn't complaining, just trying to spare other people the same experience. I'm here for the ride, which has been a lot of fun so far, and, yes, I've found the exercises invaluable for checking and improving one's understanding of the material. I clearly need to work on my maths. My grasp of the concept of orthogonal functions, which seems to be central here, is poor. One thing is for sure: I've found myself a new source of brain excitement that's going to keep me on my toes for quite a while :D

Elroch 05-14-2013 02:58 PM

Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
 
Quote:

Originally Posted by catherine (Post 10822)
I have to admit that due to time constraints and my current workload - I'm taking a number of courses in parallel - I mostly focus on getting the algorithm right for this particular type of exercise and don't spend enough time scrutinizing the problem statement. Though I'm not sure I understand why "x1 + x2" and "x1 - x2" wouldn't contribute anything to the model, as they have separate weights assigned to them that get tuned independently of the weights assigned to the individual x1 and x2 features.

The reason is that the weights on x1+x2 and x1-x2 could be replaced by adjusted weights on x1 and x2, with no loss of generality: if w3 and w4 are the weights on those two terms, then w3(x1 + x2) + w4(x1 - x2) = (w3 + w4)x1 + (w3 - w4)x2. This would just make the calculations simpler.
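
A small numerical check of that identity, in case it helps (Python/numpy; w3 and w4 are arbitrary illustrative weights):

Code:

import numpy as np

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, size=(2, 5))

w3, w4 = 0.7, -0.2  # arbitrary weights on (x1 + x2) and (x1 - x2)

# Weights on the "redundant" features...
h_redundant = w3 * (x1 + x2) + w4 * (x1 - x2)
# ...can be absorbed into adjusted weights on x1 and x2 alone:
h_absorbed = (w3 + w4) * x1 + (w3 - w4) * x2

print(np.allclose(h_redundant, h_absorbed))  # True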


