LFD Book Forum > Book Feedback - Learning From Data > Chapter 4 - Overfitting
#1   05-10-2012, 08:18 PM
jbaker
Member
Join Date: Apr 2012
Posts: 11

Choice of regularization parameter

I'll try to pose my question from the lecture again, since I wasn't quite dexterous enough to get the point across in chat-room format.

I think the point I was missing is that in Fig. 4.7(b) (last graph on slide 21 of May 10 lecture), the stochastic noise is in fact fixed at zero. I was probably having flashbacks to Fig. 4.3(b), where it's a fixed non-zero value, in which case the behavior of E_out would depend on N as well as lambda, right? So I was wondering for what choice of N the graph was plotted, and how the behavior of the Q_f = { 15, 30, 100 } lines would change with N. And imagining that N=15 as in previous examples, it was surprising that regularization wouldn't help out when Q_f=15!

But with zero stochastic noise, the expected deterministic noise is just whatever it is, independent of N, as the fit is the same regardless of what random points you pick. Well, I suppose we'd better have N > Q_f, at least, or we're in trouble!

Have I got that right?
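In case it helps, here is a rough numerical sketch of the kind of experiment I have in mind. It is not the book's exact setup: I'm assuming a plain monomial basis rather than Legendre polynomials, a degree-10 fit with weight decay (the regularizer used in the chapter), and made-up values for N, the number of trials, and the lambda grid. It just averages E_out over random noiseless degree-Q_f targets for a few values of lambda:

[CODE]
import numpy as np

rng = np.random.default_rng(0)

def fit_weight_decay(x, y, degree, lam):
    # Least squares with weight decay: w = (Z'Z + lam*I)^(-1) Z'y
    Z = np.vander(x, degree + 1, increasing=True)  # plain monomial features (not Legendre)
    A = Z.T @ Z + lam * np.eye(degree + 1)
    return np.linalg.solve(A, Z.T @ y)

def e_out(w_fit, w_target, n_test=1000):
    # Squared error against the noiseless target on a dense grid in [-1, 1]
    x = np.linspace(-1.0, 1.0, n_test)
    g = np.vander(x, len(w_fit), increasing=True) @ w_fit
    f = np.vander(x, len(w_target), increasing=True) @ w_target
    return np.mean((g - f) ** 2)

N, fit_degree = 15, 10                 # assumed sample size and fit-model order
lambdas = np.logspace(-4, 1, 20)       # assumed grid of regularization parameters
for Qf in (15, 30, 100):
    avg_errors = []
    for lam in lambdas:
        errs = []
        for _ in range(100):           # average over random targets and data sets
            w_t = rng.standard_normal(Qf + 1)                  # random degree-Qf target
            x = rng.uniform(-1.0, 1.0, N)
            y = np.vander(x, Qf + 1, increasing=True) @ w_t    # zero stochastic noise
            errs.append(e_out(fit_weight_decay(x, y, fit_degree, lam), w_t))
        avg_errors.append(np.mean(errs))
    best = lambdas[int(np.argmin(avg_errors))]
    print(f"Qf = {Qf:3d}: E_out-minimizing lambda ~ {best:.4g}")
[/CODE]

Rerunning this with a different N should show directly whether the Q_f curves move, which is really what I was asking.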