LFD Book Forum Minimizing Eaug- where did the 2 go?

#1
05-12-2012, 04:01 AM
 ladybird2012 Member Join Date: Apr 2012 Posts: 32
Minimizing Eaug- where did the 2 go?

Hi,
I think I'm missing something really trivial. In slide 11 of lecture 12, I don't see where the 2 went when differentiating lambda*(w^2). In other words, I feel that setting the gradient of Eaug to 0 should leave a 2*lambda*w term... What am I missing?

Thanks.
#2
05-12-2012, 04:08 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: Minimizing Eaug- where did the 2 go?

Quote:
 Originally Posted by ladybird2012 Hi, I think I'm missing something really trivial. In slide 11 of lecture 12, I don't see where the 2 went when differentiating lambda*(w^2). In other words, I feel that setting the gradient of Eaug to 0 should leave a 2*lambda*w term... What am I missing? Thanks.
Both terms pick up a factor of 2 from the differentiation, and that common factor goes away when we set the gradient equal to zero.
__________________
Where everyone thinks alike, no one thinks very much
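A quick numerical sketch of this cancellation (my own illustration, not from the lecture): assuming the linear-regression augmented error E_aug(w) = (1/N)||Zw - y||^2 + (lambda/N) w^T w, its gradient is (2/N) Z^T(Zw - y) + (2*lambda/N) w. Both 2's are present, but setting the gradient to zero lets the common factor 2/N cancel, leaving Z^T(Zw - y) + lambda*w = 0, i.e. w_reg = (Z^T Z + lambda*I)^{-1} Z^T y. The variable names Z, y, lam below are just placeholders for random data:

```python
import numpy as np

rng = np.random.default_rng(0)
N, d, lam = 50, 3, 0.5
Z = rng.standard_normal((N, d))   # feature matrix (hypothetical data)
y = rng.standard_normal(N)        # targets (hypothetical data)

# Closed-form regularized solution: (Z^T Z + lambda*I) w = Z^T y
# (note: no 2's appear here -- they cancelled when we equated with zero)
w_reg = np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)

# Gradient of E_aug at w_reg, with both factors of 2 kept explicitly
grad = (2.0 / N) * Z.T @ (Z @ w_reg - y) + (2.0 * lam / N) * w_reg
print(np.allclose(grad, 0))  # True -- the 2's are there, they just cancel
```

So the 2 never disappears from the derivative itself; it only drops out of the stationarity condition.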
#3
05-12-2012, 04:13 AM
 ladybird2012 Member Join Date: Apr 2012 Posts: 32
Re: Minimizing Eaug- where did the 2 go?

Thanks for the quick response. I see it now and should have spotted it. Sorry to have bothered you about this.


