 LFD Book Forum Subset & deterministic noise

#1
 apinde Member Join Date: Jul 2012 Posts: 12 Subset & deterministic noise

Related to Q1 in HW6, I used the following reasoning to conclude that there is no trend. What's the fallacy?

Construct the hypothesis set H as the union of two hypothesis sets H1 and H2.
Let H1 have higher deterministic noise and H2 have lower.
H will have deterministic noise between that of H1 and H2.
Now let H' be the set H1. Clearly H' is a subset of H, and H' has higher deterministic noise than H.
Similarly, H' can be H2, and then H' has lower deterministic noise than H.

Therefore there is no trend
#2
 Andrs Member Join Date: Jul 2012 Posts: 47 Re: Subset & deterministic noise

If H is the union of H1 and H2, and assuming that H1 differs from H2, the hypothesis set H should be at least as complex as either H1 or H2, and therefore it should have a lower deterministic noise than either.
Related to the question:
The basic characteristic is that H' is a subset of H, so H' contains fewer alternatives than H (fewer options, less flexibility). That is, H' will not be able to fit the target function f as well as H would have. In other words, H' implies a higher "deterministic noise".
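A minimal numeric sketch of this point (my own illustration, not from the thread): measure deterministic noise as the mean squared error between the target and the best hypothesis in the set. The target f(x) = sin(πx) and the two hypothesis sets (constants as H', lines as H, with H' ⊂ H) are assumed for illustration only.

```python
import numpy as np

def det_noise(design, f_vals):
    """Mean squared error of the best least-squares fit to f in the span of design."""
    coef, *_ = np.linalg.lstsq(design, f_vals, rcond=None)
    return float(np.mean((design @ coef - f_vals) ** 2))

x = np.linspace(-1, 1, 1001)
f_vals = np.sin(np.pi * x)                        # fixed target function

H_sub = np.ones((x.size, 1))                      # H': constant hypotheses
H_big = np.column_stack([np.ones_like(x), x])     # H : lines, a superset of H'

n_sub = det_noise(H_sub, f_vals)                  # ~0.5  (best constant is 0)
n_big = det_noise(H_big, f_vals)                  # smaller: a line fits sin better
print(n_sub, n_big)
```

Since every hypothesis in H' is also in H, the best fit in H can never be worse, so n_big ≤ n_sub; the subset has at least as much deterministic noise.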
#3
 apinde Member Join Date: Jul 2012 Posts: 12 Re: Subset & deterministic noise

Another expression of deterministic noise is the expected value of the squared error between the mean hypothesis and f(x). When H1 and H2 are disjoint, the mean hypothesis should fall somewhere between those of the more complex and the simpler set, and the deterministic noise of H should lie somewhere between the deterministic noises of H1 and H2. In other words, the deterministic noise of H is not just the lower of the two.
#4
 MLearning Senior Member Join Date: Jul 2012 Posts: 56 Re: Subset & deterministic noise

The argument that there is no trend seems to make sense to me. We are not told about the complexity of the hypotheses; what we are told is that both H' and H contain a set of hypothesis functions, which can be simple or complex.

I disagree with the assertion that fewer alternatives means fewer options and less complexity. A hypothesis set may have a small number of possible hypothesis functions, but one of those functions can have the complexity to provide a good fit. Hence, less deterministic noise.
#6
 JohnH Member Join Date: Jul 2012 Posts: 43 Re: Subset & deterministic noise

The phrase "deterministic noise" refers to complexity in the target function that cannot be described by the hypothesis set. A hypothesis set is analogous to a vocabulary, with each hypothesis being a word within that vocabulary. Removing words from the vocabulary, leaving the subset H', diminishes expressive capability; i.e., there are fewer things that can be described.
#7
 MLearning Senior Member Join Date: Jul 2012 Posts: 56 Re: Subset & deterministic noise

@JohnH,

I see your point and your argument makes sense. However, what if the remaining vocabulary in H' is complex enough to completely characterize the target function? Remember that the target function is fixed in this case.
#8
 JohnH Member Join Date: Jul 2012 Posts: 43 Re: Subset & deterministic noise

The question was about the trend, not a specific case. The question could be posed in terms of expected outcome: is H' more or less likely than H to be sufficient to describe the target function?
#9
 ilya239 Senior Member Join Date: Jul 2012 Posts: 58 Re: Subset & deterministic noise

Quote:
 Originally Posted by MLearning @JohnH, I see your point and your argument makes sense. However, what if the remaining vocabulary in H' is complex enough to completely characterize the target function? Remember that the target function is fixed in this case.
Going from H' to the larger set H, deterministic noise will either be reduced or stay the same. So the change is zero with some probability and negative with some probability, and hence the expected change is negative. (What a hand-waving argument.)
