LFD Book Forum question about probability

#21
09-02-2012, 08:05 AM
 coolguy Junior Member Join Date: Sep 2012 Posts: 2

Quote:
 Originally Posted by yaser The probability of getting 10 heads for one coin is (1/2)(1/2)...(1/2) (10 times), i.e., (1/2)^10 = 1/1024, which is approximately 0.001. Therefore, the probability of not getting 10 heads for one coin is approximately 0.999. This means that the probability of not getting 10 heads for any of 1000 coins is this number multiplied by itself 1000 times, once for every coin. This probability is therefore (1 - 1/1024)^1000. This is approximately e^(-1000/1024) since 1 - x ≈ e^(-x) for small x. Numerically, this is about 0.37. Therefore, the probability of this not happening, namely that at least one coin of the 1000 coins will give 10 heads, is 1 minus that. This gives us the answer of approximately 0.63 or 63% that I mentioned in the lecture.
WOW! Thanks for the detailed explanation! It was over my head for a long time.
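The arithmetic in the quoted explanation is easy to check directly; a minimal sketch (not from the thread itself):

```python
# Probability that one fair coin flipped 10 times comes up heads every time.
p_ten_heads = 0.5 ** 10                  # 1/1024, about 0.001

# Probability that none of 1000 independent coins gives 10 heads:
# multiply the per-coin "no 10 heads" probability 1000 times.
p_none = (1 - p_ten_heads) ** 1000       # about 0.37

# Probability that at least one of the 1000 coins gives 10 heads.
p_at_least_one = 1 - p_none              # about 0.62

print(p_ten_heads, p_none, p_at_least_one)
```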
#22
08-28-2013, 03:32 AM
 weehong Junior Member Join Date: Aug 2013 Posts: 2

Quote:
 Originally Posted by yaser The probability of getting 10 heads for one coin is (1/2)(1/2)...(1/2) (10 times), i.e., (1/2)^10 = 1/1024, which is approximately 0.001. Therefore, the probability of not getting 10 heads for one coin is approximately 0.999. This means that the probability of not getting 10 heads for any of 1000 coins is this number multiplied by itself 1000 times, once for every coin. This probability is therefore (1 - 1/1024)^1000. This is approximately e^(-1000/1024) since 1 - x ≈ e^(-x) for small x. Numerically, this is about 0.37. Therefore, the probability of this not happening, namely that at least one coin of the 1000 coins will give 10 heads, is 1 minus that. This gives us the answer of approximately 0.63 or 63% that I mentioned in the lecture.
I captured the above idea, but I am confused about why the following approach gives the wrong answer:

Probability of all heads in 10 flips by one coin: p = 0.5^10
Probability of all heads in 10 flips by any of 1000 coins: 1000*p = 97.7%

In the above I assumed there were 1000 sets of 10 flips.
#23
08-30-2013, 02:02 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477

Quote:
 Originally Posted by weehong I captured the above idea, but I am confused about why the following approach gives the wrong answer: Probability of all heads in 10 flips by one coin: p = 0.5^10. Probability of all heads in 10 flips by any of 1000 coins: 1000*p = 97.7%. In the above I assumed there were 1000 sets of 10 flips.
In order to multiply the probabilities in the "all" case, you need the events to be independent, and we have that for the coin flips. In order to add the probabilities in the "any of" case, you need the events to be disjoint, i.e., they cannot simultaneously occur. The first coin giving 10 heads is not disjoint from the second coin giving 10 heads, so when you add their probabilities you are double counting the overlap which is "both coins giving 10 heads each."
__________________
Where everyone thinks alike, no one thinks very much
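One concrete way to see the double counting is a quick simulation (a sketch I am adding here, not from the thread): flip 1000 fair coins 10 times each and check how often at least one comes up all heads. The frequency should land near the exact value 1 - (1 - 1/1024)^1000 ≈ 0.62, well below the 0.977 obtained by summing the per-coin probabilities.

```python
import random

random.seed(0)  # reproducible runs

def at_least_one_all_heads(n_coins=1000, n_flips=10):
    """True if at least one of n_coins fair coins shows heads on all n_flips."""
    return any(all(random.random() < 0.5 for _ in range(n_flips))
               for _ in range(n_coins))

trials = 1000
estimate = sum(at_least_one_all_heads() for _ in range(trials)) / trials
print(estimate)  # near 0.62, far from the overcounted 0.977
```

The sum 1000*p is the union bound, which is only an upper bound: it counts every outcome where two or more coins give 10 heads multiple times instead of once.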
#24
09-01-2013, 12:41 AM
 weehong Junior Member Join Date: Aug 2013 Posts: 2

Thank you very much, Professor. The explanation is spot on, very helpful.
