LFD Book Forum HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

#1
05-14-2013, 02:53 AM
 catherine Member Join Date: Apr 2013 Posts: 18
HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

hi everybody, I just lost 4 (!) points because my printout didn't properly render the absolute-value bars (| |) surrounding the last 2 elements of the transformed input vector (|x1 - x2| and |x1 + x2|). So make sure to double-check any formulas against the online version of the homework instructions. Some of the less usual characters don't print well...
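For anyone double-checking their implementation, here is a minimal sketch of the transform as I read it from the online instructions (the exact ordering of the features is my assumption; verify against the homework statement):

```python
import numpy as np

def transform(X):
    """Nonlinear transform for HW6 Q2-6.

    X is an (N, 2) array of (x1, x2) points. Note the abs()
    on the last two features -- the point of this thread!
    """
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([
        np.ones(len(X)),   # bias term
        x1, x2,
        x1 ** 2, x2 ** 2,
        x1 * x2,
        np.abs(x1 - x2),   # NOT (x1 - x2)
        np.abs(x1 + x2),   # NOT (x1 + x2)
    ])
```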
#2
05-14-2013, 04:01 AM
 PaulN Junior Member Join Date: Apr 2013 Posts: 6
Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

Oh no!

Hindsight is 20/20, as they say, but the "x1 + x2" and "x1 - x2" components (that is, without taking the absolute value) would seem a little redundant: as linear combinations of x1 and x2, they wouldn't contribute to the non-linearity of the transformation (since x1 and x2 are already among the given features).
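This can be checked numerically: without the abs(), the two extra columns lie in the span of x1 and x2, so the design matrix gains no rank; with the abs() it does. A quick sketch on random data (the column layout here is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 100)
x2 = rng.uniform(-1, 1, 100)

base = np.column_stack([np.ones(100), x1, x2])

# Without abs(): the new columns are linear combinations of x1 and x2.
linear = np.column_stack([base, x1 + x2, x1 - x2])
# With abs(): genuinely new (non-linear) features.
absval = np.column_stack([base, np.abs(x1 + x2), np.abs(x1 - x2)])

print(np.linalg.matrix_rank(linear))  # 3 -- no rank gained
print(np.linalg.matrix_rank(absval))  # 5 -- both columns add information
```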

#3
05-14-2013, 06:11 AM
 catherine Member Join Date: Apr 2013 Posts: 18
Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

I have to admit that due to the time constraints and my current workload - I'm taking a number of courses in parallel - I mostly focus on getting the algorithm right for this particular type of exercise, and don't spend enough time scrutinizing the problem statement. Though I'm not sure I understand why "x1 + x2" and "x1 - x2" wouldn't contribute anything to the model - they have separate weights assigned to them, which get tuned independently of the weights assigned to x1 and x2.
#4
05-14-2013, 07:22 AM
 jlaurentum Member Join Date: Apr 2013 Location: Venezuela Posts: 41
Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

Hello Catherine:

I share your pain. Last week I lost 2 points because I used 0.1 for the value of eta when it should have been 0.01 (I missed that extra zero). In retrospect, the important thing is that we learn. I for one learned about the importance of choosing a proper learning rate (eta). You perhaps learned that including the terms without the absolute value brackets gives you some redundant variables, which in turn could make the matrix you need to invert singular... (?). Anyways, sorry about your lost points.
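About the matrix inversion: if I'm reading the homework right, it's the one-step weight-decay solution w = (Z^T Z + lambda*I)^(-1) Z^T y. A sketch under that assumption (the function name and test data are made up); note that with lambda > 0 the matrix stays invertible even if Z has redundant columns:

```python
import numpy as np

def weight_decay_solve(Z, y, lam):
    """Linear regression with weight decay:
    w = (Z^T Z + lam * I)^(-1) Z^T y.

    Uses np.linalg.solve rather than forming an explicit inverse,
    which is both faster and numerically safer.
    """
    d = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)
```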
#5
05-14-2013, 08:11 AM
 Michael Reach Senior Member Join Date: Apr 2013 Location: Baltimore, Maryland, USA Posts: 71
Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

In case it's any comfort, I'll tell you a story from when I was actually attending Caltech. Must have been around 1979. Richard Feynman, may he rest in peace, decided to give a class to undergraduates, which he hadn't done for a while. Nuclear physics. Of course, we were all tremendously excited. As it happened, it didn't work out for me at all, for two reasons: 1) He just didn't care how many calculations he had to do; he liked calculations. He'd write these huge matrices of equations for the nuclear energy or something, and multiply and invert them on the board. For weeks. (This was before Mathematica, of course.) You'd come into the class the next time, and there were all those matrices still waiting on the board. Discouraging - at least for me. 2) The other thing was grades. We were still young enough to think that our grades meant something, and nervous about them. Feynman had a very clear attitude towards grades: anyone who wants to study physics should care about physics and not in the least about grades. He refused to discuss them in any way or form. I just couldn't deal with it and dropped the class. My loss, of course.
In the end, he gave every single person who stuck with the course a Pass, regardless of how well they did. A couple of my friends got notes on their finals like, You probably shouldn't take this next term.
#6
05-14-2013, 03:39 PM
 catherine Member Join Date: Apr 2013 Posts: 18
Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

hi guys, thanks for all your kind words! I was not complaining, just trying to spare other people the same experience. I'm here for the trip, which has been a lot of fun so far, and, yes, I've found that the exercises are invaluable in checking and improving one's understanding of the material. I clearly need to work on my maths. My grasp of the concept of orthogonal functions, which seems to be central here, is poor. One thing is sure, I've found myself a new source of brain excitement that's going to keep me on my toes for quite a while.
#7
05-14-2013, 03:58 PM
 Elroch Invited Guest Join Date: Mar 2013 Posts: 143
Re: HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!

Quote:
 Originally Posted by catherine I have to admit that due to the time constraints and my current workload - I'm taking a number of courses in parallel - I mostly focus on getting the algorithm right for this particular type of exercise, and don't spend enough time scrutinizing the problem statement. Though I'm not sure I understand why "x1 + x2" and "x1 - x2" wouldn't contribute anything to the model - as they have separate weights assigned to them that get tuned separately from the weights assigned to the separate x and y features.
The reason is that the weights on x1+x2 and x1-x2 could be replaced by adjusted weights on x1 and x2, with no loss of generality. Leaving them out would just make the calculations simpler.
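A two-line check of this point: a weight a on (x1 + x2) plus a weight b on (x1 - x2) computes exactly the same function as weight (a + b) on x1 and (a - b) on x2 (the weight values here are arbitrary):

```python
import numpy as np

x1, x2 = np.random.default_rng(1).uniform(-1, 1, (2, 50))
a, b = 0.7, -0.3   # arbitrary weights on (x1 + x2) and (x1 - x2)

# a*(x1 + x2) + b*(x1 - x2) == (a + b)*x1 + (a - b)*x2
assert np.allclose(a * (x1 + x2) + b * (x1 - x2),
                   (a + b) * x1 + (a - b) * x2)
```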
