LFD Book Forum HW6-Q1 ambiguous/contradictory
#1
05-18-2013, 02:01 AM
 doneit Member Join Date: Jan 2013 Posts: 15
HW6-Q1 ambiguous/contradictory

I've checked my understanding of the set notation, but am I correct in thinking the question says "H' is a subset of H", i.e., less complex?

To me the most "general" case is having a large, possibly infinite, H where the bulk would be overly complex. So my (wrong) answer assumes that by reducing the complexity of the set I'm reducing the chance of over-fitting. So "in general" I'm *reducing* the chance of deterministic noise. Of course after a certain point it will increase again, limited by a flat-lined solution.

So why would assuming H is already less complex be more "general" than this? Additionally, pg. 124 states that over-fitting increases with complexity. I scratch my head to the point of baldness.

Btw, I'm enjoying the video lectures immensely. It's like watching a soap opera with twists & turns, plots & sub-plots, and just when the situation seems impossible the magician doffs his top hat and pulls out another white rabbit. I actually giggled at the cross-validation trick.
#2
05-18-2013, 02:12 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,478
Re: HW6-Q1 ambiguous/contradictory

Quote:
 Originally Posted by doneit: I've checked my understanding of the set notation, but am I correct in thinking the question says "H' is a subset of H", i.e., less complex? To me the most "general" case is having a large, possibly infinite, H where the bulk would be overly complex. So my (wrong) answer assumes that by reducing the complexity of the set I'm reducing the chance of over-fitting. So "in general" I'm *reducing* the chance of deterministic noise.
Your understanding is correct in that a subset will be generally less complex than the original set. I think the source of confusion is that overfitting is affected by both the level of noise and the complexity of the model, and for deterministic noise the complexity in turn affects the level of noise. The question, however, addresses deterministic noise only, not necessarily the overfitting effect that results from it. The part which is not accurate in the above quote is the last sentence that mixes the two.
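To see the distinction numerically, here is a hypothetical sketch (not part of the homework; the target sin(πx), the polynomial degrees, and the grid are my own choices for illustration). It measures deterministic noise alone — the error of the *best* hypothesis in the set, with no stochastic noise and no finite-sample fitting involved — for a set H and a subset H' of it:

```python
import numpy as np

# Deterministic noise = the part of the target f that even the best
# hypothesis in the set cannot capture. Restricting H to a subset H'
# can only leave the best achievable error the same or worse.
x = np.linspace(-1, 1, 500)
f = np.sin(np.pi * x)          # deterministic target, no stochastic noise

def det_noise(degree):
    """MSE of the best least-squares polynomial fit of a given degree."""
    coeffs = np.polyfit(x, f, degree)
    return np.mean((np.polyval(coeffs, x) - f) ** 2)

noise_H  = det_noise(10)       # larger set H  (degree-10 polynomials)
noise_Hp = det_noise(2)        # subset H' ⊂ H (degree-2 polynomials)
print(noise_Hp >= noise_H)     # prints True
```

Note this says nothing about overfitting by itself: whether the *fitted* hypothesis from a noisy finite sample generalizes better with H or H' is a separate question, which is exactly the distinction the question is probing.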

Quote:
 Btw, I'm enjoying the video lectures immensely. It's like watching a soap opera with twists & turns, plots & sub-plots, and just when the situation seems impossible the magician doffs his top hat and pulls out another white rabbit. I actually giggled at the cross-validation trick.
If I had thought of that, I would have inserted some commercial breaks at the right moments.
__________________
Where everyone thinks alike, no one thinks very much
#3
05-18-2013, 02:46 AM
 doneit Member Join Date: Jan 2013 Posts: 15
Re: HW6-Q1 ambiguous/contradictory

Quote:
 Originally Posted by yaser The part which is not accurate in the above quote is the last sentence that mixes the two.
Ok, got it, thanks.

Quote:
 If I had thought of that, I would have inserted some commercial breaks at the right moments.
