LFD Book Forum  

  #1  
Old 06-09-2014, 05:53 PM
jacox
Junior Member
 
Join Date: Jun 2014
Posts: 1
*ANSWER* Homework 1 Question 3, 4

I was a little confused at first by what Q3 was asking, so I thought I would describe my reasoning here, in case anybody else had the same question.

So what the question is saying in layman's terms is:

Quote:
You have a hypothesis function h(x) that you train to approximate a target function y = f(x), but it makes an error with probability mu (oh well, at least you tried). Now, after finding h(x), you apply it to a noisy version of that target (real-world data).

Now, the probability that h(x) makes an error on the noiseless target is mu, and the probability that it doesn't is 1-mu.

In addition, the probability that you make an error simply due to noise (the noise flips the output) is 1-lambda, and the probability that the noise produces no error is lambda.

Therefore, since these are binary functions, the probability that you actually make an error when you apply h(x) to a noisy version of f(x) is: the probability that there is an error due to noise (1-lambda) AND no "deterministic" error from h (1-mu), OR the probability that there is no error due to noise (lambda) AND there is a "deterministic" error (mu). (If both kinds of error happen at once, they cancel out and h(x) actually agrees with the noisy output.)

Note that the noise and the "deterministic" error of h are statistically independent (this is the assumption), so these probabilities simply multiply.

Therefore: P_{error} = P_{noise error}*P_{no mu error} + P_{no noise error}*P_{mu error} = (1-lambda)*(1-mu) + lambda*mu.
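
In the notation of the homework (h is the hypothesis, f the noiseless target, and y the noisy output you are scored against), the line above is just:

P[h(x) \neq y] = P[y \neq f(x)] \cdot P[h(x) = f(x)] + P[y = f(x)] \cdot P[h(x) \neq f(x)] = (1-\lambda)(1-\mu) + \lambda\mu
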
For question 4, if you set lambda = 0.5, P_{error} reduces to 1/2 and mu drops out: (1-0.5)*(1-mu) + 0.5*mu = 0.5. Intuitively, lambda = 0.5 means the noise is so bad that the observed output is a fair coin flip and carries no information about f(x). In that situation you don't expect mu to influence the outcome, because the data you are measured against is already uniformly random.
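
If you want a numerical sanity check, here is a quick simulation sketch (plain Python; the function and variable names are mine, not from the book or the homework). It draws the "deterministic" error and the noise flip independently and counts how often h(x) ends up disagreeing with the noisy output y:

Code:
import random

def error_rate(mu, lam, n=200000):
    # Estimate P[h(x) != y] when h(x) disagrees with f(x) with probability mu
    # and the observed output y disagrees with f(x) with probability 1 - lam.
    errors = 0
    for _ in range(n):
        h_wrong = random.random() < mu          # h(x) != f(x)
        noise_flip = random.random() < 1 - lam  # y != f(x)
        if h_wrong != noise_flip:               # exactly one of the two => h(x) != y
            errors += 1
    return errors / n

for mu, lam in [(0.1, 0.9), (0.3, 0.9), (0.1, 0.5), (0.3, 0.5)]:
    print(mu, lam, round(error_rate(mu, lam), 3),
          round((1 - lam) * (1 - mu) + lam * mu, 3))

The estimates track (1-lambda)*(1-mu) + lambda*mu, and with lambda = 0.5 they sit near 0.5 regardless of mu, which is the point of question 4.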

Please feel free to chime in with corrections and comments as necessary.
  #2  
Old 06-10-2014, 05:53 PM
yaser
Caltech
 
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,474
Re: *ANSWER* Homework 1 Question 3, 4

Thank you for the nice explanation.
__________________
Where everyone thinks alike, no one thinks very much
  #3  
Old 10-18-2017, 02:20 PM
hhprogram
Junior Member
 
Join Date: Oct 2017
Posts: 3
Re: *ANSWER* Homework 1 Question 3, 4

Thanks for the post. It helped me understand, as I was unclear on number 3. I think the key is that 'h' approximates 'y'. I initially read it as 'h' still approximating 'f', but reading it like that discounts the noise introduced into 'y'.
  #4  
Old 11-02-2017, 12:26 PM
Khalid
Member
 
Join Date: Aug 2017
Posts: 13
Re: *ANSWER* Homework 1 Question 3, 4

Thanks for the explanation.