LFD Book Forum


jacox 06-09-2014 05:53 PM

*ANSWER* Homework 1 Question 3, 4
 
I was a little confused at first by what Q3 was asking, so I thought I would describe my reasoning here, in case anybody else had the same question.

So what the question is saying in layman's terms is:

Quote:

You have a hypothesis function h(x) that you train to approximate a deterministic target function f(x), but it makes an error with probability mu (oh well, at least you tried :)). Now, after finding h(x), you apply it to a noisy version of that target, y, which agrees with f(x) only with probability lambda (real-world data).

Now, the probability that h(x) makes an error on noiseless data is mu, and the probability that it doesn't is 1-mu.

In addition, the probability that you make an error, simply due to noise, is 1-lambda, with lambda probability that the noise produces no error.

Therefore, since these are binary functions, the probability that you actually make an error when you apply h(x) to a noisy version of f(x) is: the probability that there is an error due to noise (1-lambda) AND no "deterministic" error (1-mu), OR the probability that there is no error due to noise (lambda) AND there is a "deterministic" error (mu).

Note: the noise event and the "deterministic" error event are assumed to be statistically independent (this is the assumption that lets you multiply the probabilities).

Therefore: P_{error} = P_{noise error}*P_{no mu error} + P_{no noise error}*P_{mu error} = (1-lambda)*(1-mu) + lambda*mu.
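
If you want to sanity-check this closed form, here is a minimal Monte Carlo sketch in Python/NumPy (mu = 0.3 and lambda = 0.8 are just illustrative values I picked, not from the problem):

Code:

import numpy as np

# Illustrative values, not from the problem statement
mu, lam = 0.3, 0.8
n = 1_000_000

rng = np.random.default_rng(0)

# Event 1: h disagrees with the noiseless target f, with probability mu
h_err = rng.random(n) < mu
# Event 2: noise flips f into y, with probability 1 - lambda
noise_flip = rng.random(n) < (1 - lam)

# h disagrees with the noisy label y exactly when one event occurs but not both (XOR)
simulated = np.mean(h_err ^ noise_flip)
predicted = (1 - lam) * (1 - mu) + lam * mu

print(f"simulated: {simulated:.4f}   predicted: {predicted:.4f}")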
For question 4, if you set lambda = 0.5, then P_{error} = (1-0.5)*(1-mu) + 0.5*mu = 0.5 - 0.5*mu + 0.5*mu = 0.5, and mu drops out. Intuitively, lambda = 0.5 means the noise is so bad that half the labels are flipped at random. In that situation you don't expect mu to influence the outcome, because the labels are already uniformly random (a coin flip), so any hypothesis is wrong exactly half the time.
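
And a quick numerical check of the lambda = 0.5 case, evaluating the same formula for a few arbitrary values of mu:

Code:

lam = 0.5
for mu in (0.0, 0.2, 0.5, 0.9):   # arbitrary values of mu
    p_err = (1 - lam) * (1 - mu) + lam * mu
    print(f"mu = {mu:.1f} -> P_error = {p_err:.2f}")   # prints 0.50 every time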

Please feel free to chime in with corrections and comments as necessary.

yaser 06-10-2014 05:53 PM

Re: *ANSWER* Homework 1 Question 3, 4
 
Thank you for the nice explanation.

hhprogram 10-18-2017 02:20 PM

Re: *ANSWER* Homework 1 Question 3, 4
 
Thanks for the post. It helped me understand, as I was unclear on number 3. I think the key is that 'h' approximates 'y'. I initially read it as 'h' still approximating 'f', but reading it that way discounts the noise introduced into 'y'.

