LFD Book Forum  

#11 | 06-09-2013, 10:33 AM
sptripathi (Junior Member; Join Date: Apr 2013; Posts: 8)
Re: Q20

I need help verifying whether the understanding below is correct.

Bayes' rule:
P(h=f | D) = P(D | h=f) * P(h=f) / P(D)

For this question, we are given:
P(h=f) is uniform over [0,1]
D: one person with a heart attack
Pick f = c (a constant)

To simplify, I assume that h and f are discrete random variables with 10 possible values (0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0), each equally likely with P = 1/10. Essentially, I am simplifying here by turning P(h=f), which is actually a pdf, into a pmf.


Now:

P(D | h=f)
= Pr( one person with a heart attack | h=f )
= c

(because if h=f is given, then the probability that the one picked person has a heart attack is c, as defined by f)

Plugging in the above:
P(h=f | D) = c * P(h=f) / P(D)

Does the above sound correct?
Also, is P(D) = 1 in this case?

Thanks.
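
A minimal numeric sketch of the calculation above, assuming the poster's 10-point discretization (the grid of candidate values and the single +1 observation come from the post; everything else is just Bayes' rule applied numerically):

Code:
# Discretized Bayesian update for one observed heart attack (D = +1),
# assuming a uniform prior over 10 candidate values of c.
candidates = [i / 10 for i in range(1, 11)]   # 0.1, 0.2, ..., 1.0
prior = [1 / 10] * 10                         # uniform pmf

# The likelihood of D = +1, given h = f with value c, is just c.
likelihood = candidates

# P(D) is the normalizer: the sum over candidates of P(D | c) * P(c).
p_D = sum(l * p for l, p in zip(likelihood, prior))

posterior = [l * p / p_D for l, p in zip(likelihood, prior)]

print("P(D) =", p_D)   # 0.55 on this grid, not 1
for c, post in zip(candidates, posterior):
    print(f"c = {c:.1f}: posterior = {post:.4f}")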

#12 | 06-09-2013, 02:15 PM
Dorian (Member; Join Date: Apr 2013; Posts: 11)
Re: Q20

I find this exercise simple but very useful. If one thinks of a series of such measurements (1s and 0s for heart attack or not), one can clearly form an idea of how the posterior transforms step by step from a uniform distribution into an increasingly concentrated one.
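
A quick simulation of that step-by-step picture (a sketch under assumed conditions: the true heart-attack probability of 0.3, the 101-point grid, and the 100 examples are all arbitrary choices, not from the post):

Code:
import random

random.seed(0)
true_c = 0.3                                   # arbitrary "true" heart-attack probability
grid = [i / 100 for i in range(101)]           # candidate values of c
posterior = [1 / len(grid)] * len(grid)        # start from a uniform prior

for _ in range(100):
    x = 1 if random.random() < true_c else 0   # one noisy example
    # Bernoulli likelihood of this example under each candidate c
    posterior = [p * (c if x == 1 else 1 - c) for p, c in zip(posterior, grid)]
    z = sum(posterior)
    posterior = [p / z for p in posterior]     # renormalize after each step

peak_p, peak_c = max(zip(posterior, grid))
print(f"after 100 examples the posterior peaks at c = {peak_c:.2f} (mass {peak_p:.3f})")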

Does this mean that this example represents one of those cases where the initial prior is irrelevant and we can safely use it for learning? Also, is this some form of reinforcement learning?

thanks,
Dorian.

#13 | 06-09-2013, 05:23 PM
yaser (Caltech; Join Date: Aug 2009; Location: Pasadena, California, USA; Posts: 1,476)
Re: Q20

Quote:
Originally Posted by Dorian
Does this mean that this example represents one of those cases where the initial prior is irrelevant and we can safely use it for learning? Also, is this some form of reinforcement learning?
In this case, with a sufficient number of examples, the prior indeed fades away. Noisy examples blur the line between supervised and reinforcement learning somewhat, as the information provided by the output is less definitive than in the noiseless case.
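
A sketch of that fading effect (a hypothetical setup: two deliberately different priors on the same grid are fed the same 500 examples; the priors, grid, and true probability are invented for illustration):

Code:
import random

random.seed(1)
grid = [i / 100 for i in range(1, 100)]        # candidate values 0.01 .. 0.99

def bayes_step(prior, x):
    """One Bayes update for a single 1/0 example; returns a normalized posterior."""
    post = [p * (c if x == 1 else 1 - c) for p, c in zip(prior, grid)]
    z = sum(post)
    return [p / z for p in post]

flat = [1 / len(grid)] * len(grid)             # uniform prior
skewed = [c ** 3 for c in grid]                # a very different prior
s = sum(skewed)
skewed = [p / s for p in skewed]

true_c = 0.4
for _ in range(500):
    x = 1 if random.random() < true_c else 0
    flat = bayes_step(flat, x)
    skewed = bayes_step(skewed, x)

# After many examples the two posteriors are nearly indistinguishable.
gap = max(abs(a - b) for a, b in zip(flat, skewed))
print(f"max pointwise gap between the two posteriors: {gap:.6f}")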
__________________
Where everyone thinks alike, no one thinks very much

#14 | 06-10-2013, 08:36 PM
nkatz (Junior Member; Join Date: Apr 2013; Posts: 4)
Re: Q20

I am very confused by this problem. Perhaps this question will help:
Is P(D|h=f) a function of D, of h, or of both? It looks to me like it's a function of D, but we need to convert it to a function of h to get the posterior. Is this correct?

#15 | 06-10-2013, 09:21 PM
yaser (Caltech; Join Date: Aug 2009; Location: Pasadena, California, USA; Posts: 1,476)
Re: Q20

Quote:
Originally Posted by nkatz
Is P(D|h=f) a function of D, of h, or of both?
Let us first clarify the notions. The data set D has one data point in it, which is either +1 (heart attack) or -1 (no heart attack). Being a function of D means being a function of that value (+1 or -1), so indeed P(D|h=f) is a function of D, and it so happens in this problem that the value is fixed at +1. The probability P(D|h=f) is also a function of h (which happens to be the same as f, according to what we are conditioning on).

Since D is fixed, this leaves P(D|h=f) as a function of h only.
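
A tiny illustration of that dependence (the function below is just the Bernoulli likelihood implied by the problem setup; the name and sample values are mine):

Code:
def likelihood(d, c):
    """P(D = d | h = f, where f's heart-attack probability is c)."""
    return c if d == +1 else 1 - c

# Viewed as a function of both arguments:
print(likelihood(+1, 0.3), likelihood(-1, 0.3))   # 0.3 and 0.7

# With D fixed at +1 (as in this problem), it is a function of c only:
for c in (0.1, 0.5, 0.9):
    print(f"P(D = +1 | c = {c}) = {likelihood(+1, c)}")
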
__________________
Where everyone thinks alike, no one thinks very much

#16 | 06-11-2013, 03:46 AM
Elroch (Invited Guest; Join Date: Mar 2013; Posts: 143)
Re: Q20

There is an analogy that may be enlightening; it occurred to me because of the presentation in the first part of this course.

Suppose you have a large number of urns, each containing a large number of black and white balls in varying proportions. You are told how many urns there are with each proportion.

Then you go up to one of the urns and take out a ball, which you find is black. The question is how likely it is that this specific urn has each particular fraction of black balls.
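
In code, the analogy might look like this (the urn counts are invented for illustration; the counts play the role of the prior):

Code:
# The count of urns with each black-ball fraction plays the role of the prior.
urn_counts = {0.25: 10, 0.50: 30, 0.75: 60}    # invented numbers
total = sum(urn_counts.values())

# P(black ball) marginalized over which urn was picked.
p_black = sum(q * n / total for q, n in urn_counts.items())

# Posterior probability that the picked urn has fraction q of black balls,
# given that the one drawn ball came out black.
for q, n in urn_counts.items():
    posterior = q * (n / total) / p_black
    print(f"fraction {q}: posterior = {posterior:.3f}")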