LFD Book Forum Hw 6 q1

#11
05-12-2012, 03:59 AM
 lucifirm Member Join Date: Apr 2012 Posts: 20
Re: Hw 6 q1

Thanks, you guys. I reviewed the lecture video and that cleared up my ideas.
#12
08-20-2012, 09:53 AM
 tx75074 Junior Member Join Date: Jul 2012 Posts: 8
Re: Hw 6 q1

"H prime a subset of H" Does mean any subset of H?
#13
08-20-2012, 01:58 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,478
Re: Hw 6 q1

Quote:
 Originally Posted by tx75074: "H prime a subset of H": does that mean any subset of H?
H' ⊂ H means H' is a proper subset of H (not equal to H). Other than that, it could be any subset.
__________________
Where everyone thinks alike, no one thinks very much
#14
05-10-2013, 05:49 AM
 jcmorales1564 Member Join Date: Apr 2013 Posts: 12
Re: Hw 6 q1

Lecture 11 (overfitting) has been my favorite to date. I can't wait for Lectures 12 (regularization) and 13 (validation) to see how the issue of overfitting is tackled. I thought I was understanding the material; however, when I read Q1 in HW6 I could not answer it outright. I realized that I am still somewhat confused and would appreciate some clarification.

I think that one of my issues is how the lectures flow between situations where the target function is known and situations where it is not known (real-world cases). I am not stating this as a criticism; it is just that I still don't know how to clearly "read the signals" that we are moving from one regime (f known) to the other (f not known). For example, to calculate variance and bias (deterministic noise), we need to know the target function. However, in real-world cases we don't know the target function, so it would be impossible to calculate the variance and bias. In Q1, it says that "f is fixed". This is a case where f is known. I am unclear about what it means for f to be fixed. Would f not being fixed mean a "moving target"? Are variance and bias useful concepts in real-world cases, or are they only of an academic nature, perhaps as a stepping stone to better understand the underlying concepts of machine learning?
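
To make my own point concrete, here is a rough numpy sketch; the target f(x) = sin(pi*x) and the one-parameter hypothesis set h(x) = a*x are arbitrary choices of mine, not taken from the homework. The learning step itself never needs f, but computing bias and variance does.

Code:
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(np.pi * x)      # the target; the simulation needs it explicitly

n_runs = 10000
a_hat = np.empty(n_runs)
for i in range(n_runs):
    x = rng.uniform(-1, 1, size=2)   # a data set D of N = 2 points
    y = f(x)                         # noiseless labels from f
    a_hat[i] = (x @ y) / (x @ x)     # least-squares slope for h(x) = a*x

x_test = rng.uniform(-1, 1, size=10000)
g_bar = a_hat.mean() * x_test        # the average hypothesis g_bar(x)

bias = np.mean((g_bar - f(x_test)) ** 2)        # impossible to compute without knowing f
variance = a_hat.var() * np.mean(x_test ** 2)   # spread of the learned g around g_bar
print(f"bias ~ {bias:.2f}, variance ~ {variance:.2f}")

The actual learning (the least-squares slope) only ever sees the data; it is the bias/variance bookkeeping that needs f.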

I hope that these questions come out sounding right and that I will receive some responses. This issue of overfitting has been the most enlightening thing that I have learned in this course and I just wish to understand it really well.

Thank you.

Juan
#15
05-10-2013, 09:47 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,478
Re: Hw 6 q1

Quote:
 Originally Posted by jcmorales1564: In Q1, it says that "f is fixed". This is a case where f is known. I am unclear about what it means for f to be fixed.
I understand how the cases where f was explicitly given can cause confusion, as this seems to go against the main premise of an unknown target function. The best way to resolve this confusion is to assume that someone else knows what the target is, and they will use that information to evaluate different aspects of our learning process, but we ourselves do not know what f is as we try to learn it from the data.

Having said that, the notion of 'fixed' is different. Q1 describes two learning processes (with two different hypothesis sets) and asserts that both processes are trying to learn the same target. That target can be unknown to both of them, but it is the same target, and that is what makes it fixed. The point of having f fixed here is that deterministic noise depends on more than one component of a learning situation, and by fixing the target function we take out one of these dependencies.
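
Roughly speaking (this summary is my own paraphrase, using approximately the book's notation): if h^* denotes the best approximation to f available within a hypothesis set \mathcal{H},

\[
h^* = \arg\min_{h \in \mathcal{H}} \; \mathbb{E}_x\!\left[\big(h(x) - f(x)\big)^2\right],
\qquad
\text{deterministic noise at } x = \big(f(x) - h^*(x)\big)^2,
\]

so the deterministic noise depends on both f and \mathcal{H}. Fixing f in Q1 removes one of those two dependencies, leaving only the comparison between \mathcal{H} and \mathcal{H}'.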
__________________
Where everyone thinks alike, no one thinks very much
#16
05-11-2013, 02:27 AM
 jcmorales1564 Member Join Date: Apr 2013 Posts: 12
Re: Hw 6 q1

Got it! Thank you, professor.
#17
05-14-2013, 02:07 PM
 Michael Reach Senior Member Join Date: Apr 2013 Location: Baltimore, Maryland, USA Posts: 71
Re: Hw 6 q1

Just want to check if I have the idea right here:
Deterministic noise means the bias: the difference between the correct target hypothesis and the possible hypotheses for this hypothesis set. If H' is smaller than H, it will in general be less able to get close to the target hypothesis, and the deterministic noise will be bigger. At least it can't be less.

However, though the deterministic noise is bigger for the smaller set, there is another effect that will often work in the opposite direction. The larger hypothesis set may give us the dubious ability to fit the deterministic noise better. Since we have more hypotheses to choose from, we may fit more of the noise with the larger hypothesis set, and end up worse off.

Does that sound right?
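
To convince myself, I put together a rough numpy simulation; every choice in it (the degree-5 target, the noise level, the degrees 2 and 10) is an arbitrary one of mine. It repeatedly fits a small and a large polynomial hypothesis set to data from the same fixed target plus noise and compares the average out-of-sample error.

Code:
import numpy as np

rng = np.random.default_rng(1)

# A fixed target: an arbitrary degree-5 polynomial; stochastic noise is added to the data.
target_coeffs = rng.standard_normal(6)
f = lambda x: np.polyval(target_coeffs, x)

def avg_e_out(degree, n_train=15, sigma=0.5, n_runs=500):
    """Average squared out-of-sample error of unregularized polynomial fits of a given degree."""
    x_test = np.linspace(-1, 1, 2000)
    errs = []
    for _ in range(n_runs):
        x = rng.uniform(-1, 1, n_train)
        y = f(x) + sigma * rng.standard_normal(n_train)   # same fixed f, fresh noise each run
        g = np.polyfit(x, y, degree)                       # least-squares fit in H_degree
        errs.append(np.mean((np.polyval(g, x_test) - f(x_test)) ** 2))
    return np.mean(errs)

print("H2 :", avg_e_out(2))
print("H10:", avg_e_out(10))   # np.polyfit may warn about conditioning here; that is expected

With few, noisy points the larger set often does worse out of sample. Of course this probes only one fixed target; whether the comparison changes with a different target is exactly the dependence being discussed above.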
#18
05-14-2013, 04:18 PM
 Elroch Invited Guest Join Date: Mar 2013 Posts: 143
Re: Hw 6 q1

Quote:
 Originally Posted by Michael Reach: Just want to check if I have the idea right here: Deterministic noise means the bias: the difference between the correct target hypothesis and the possible hypotheses for this hypothesis set. If H' is smaller than H, it will in general be less able to get close to the target hypothesis, and the deterministic noise will be bigger. At least it can't be less. However, though the deterministic noise is bigger for the smaller set, there is another effect that will often work in the opposite direction. The larger hypothesis set may give us the dubious ability to fit the deterministic noise better. Since we have more hypotheses to choose from, we may fit more of the noise with the larger hypothesis set, and end up worse off. Does that sound right?
Well, you might want to check the precise definition of bias. And one of your conclusions.
#19
05-14-2013, 04:33 PM
 Elroch Invited Guest Join Date: Mar 2013 Posts: 143
Re: Hw 6 q1

Quote:
 Originally Posted by yaser: I understand how the cases where f was explicitly given can cause confusion, as this seems to go against the main premise of an unknown target function. The best way to resolve this confusion is to assume that someone else knows what the target is, and they will use that information to evaluate different aspects of our learning process, but we ourselves do not know what f is as we try to learn it from the data. Having said that, the notion of 'fixed' is different. Q1 describes two learning processes (with two different hypothesis sets) and asserts that both processes are trying to learn the same target. That target can be unknown to both of them, but it is the same target, and that is what makes it fixed. The point of having f fixed here is that deterministic noise depends on more than one component of a learning situation, and by fixing the target function we take out one of these dependencies.
It has indeed been a source of some discomfort that the phenomenon being studied depends on something that is fixed but unknown! As far as I can see, it is possible to be given the same data and use the same method, and to be overfitting with one target function, but underfitting with another.

This is what made me think, when I first saw this issue, that one would need some knowledge about the distribution of the possible target functions before being able to assess the quality of a particular machine learning algorithm for function approximation in a real application. However, I now believe that cross-validation gives an objective way of studying out-of-sample performance for function approximation, one that should allow probabilistic conclusions roughly analogous to Hoeffding's. [I am familiar with this technique from the optimization of hyperparameters when using SVMs.]
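
Just to show the mechanics I have in mind, here is a small numpy sketch of k-fold cross-validation; the toy setup (a noisy sin target, polynomial features, and a weight-decay regularizer standing in for the SVM's C) is entirely my own.

Code:
import numpy as np

def phi(x, d=6):
    """Polynomial feature transform [1, x, ..., x^d]."""
    return np.vander(x, d + 1, increasing=True)

def kfold_cv_error(x, y, lam, k=10):
    """Average validation error of regularized least squares over k folds."""
    idx = np.random.default_rng(0).permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        val = folds[i]
        trn = np.concatenate(folds[:i] + folds[i + 1:])
        Z_trn, Z_val = phi(x[trn]), phi(x[val])
        # Weight-decay solution: w = (Z^T Z + lam I)^(-1) Z^T y
        w = np.linalg.solve(Z_trn.T @ Z_trn + lam * np.eye(Z_trn.shape[1]),
                            Z_trn.T @ y[trn])
        errs.append(np.mean((Z_val @ w - y[val]) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 60)
y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(60)

# Estimate out-of-sample error for several amounts of regularization.
for lam in (0.0, 1e-3, 1e-1, 1.0, 10.0):
    print(f"lambda = {lam:g}:  CV error = {kfold_cv_error(x, y, lam):.3f}")

The lambda with the smallest cross-validation error is then an objective, data-driven choice, in the same spirit as tuning C for an SVM.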

One of the great things about doing this course is getting to grips with issues like this. In fact, I was using the C hyperparameter without really knowing what it was before we got to regularization in the lectures! I hope I've got the right end of the stick now.
