LFD Book Forum question about probability

#11
04-18-2012, 12:59 AM
 ivanku Junior Member Join Date: Apr 2012 Location: Ingå, Finland Posts: 5

Quote:
 Originally Posted by yaser This is approximately $1 - e^{-1000/1024}$ since $(1 - 2^{-10})^{1000} \approx e^{-1000/1024}$. Numerically, $1 - (1 - 2^{-10})^{1000} \approx 0.6236$.
Is it correct to assume that in the coin-tossing simulation from the Homework 2 assignment, the observed $\nu_{\min}$ should be close to that limit (where we toss 1000 coins 10 times each, and $\nu_{\min}$ is the fraction of heads obtained for the coin with the minimum frequency of heads)?
#12
04-18-2012, 03:49 AM
 elkka Invited Guest Join Date: Apr 2012 Posts: 57

The law of large numbers states that the average $\nu_{\min}$ is close to $E[\nu_{\min}]$.

$E[\nu_{\min}]$ can be calculated directly for this experiment:
$P(\nu_{\min} = 0) = 0.623576$;
$P(\nu_{\min} = 0.1) = 0.376403$;
$P(\nu_{\min} = 0.2) = 0.00002$;
and $P(\nu_{\min} \ge 0.3) \approx 0$ for the purposes of calculating the mean.

Therefore, $E[\nu_{\min}] \approx 0.1 \cdot 0.376403 + 0.2 \cdot 0.00002 \approx 0.037644$, and the average proportion of heads for $c_{\min}$ should be close to this number.
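
These probabilities can be reproduced directly. A minimal sketch (my own code, not elkka's; only the setup of 1000 fair coins with 10 tosses each is from the homework):

```python
from math import comb

N_COINS, N_TOSSES = 1000, 10

def p_heads(k):
    # P(a single fair coin shows exactly k heads in 10 tosses)
    return comb(N_TOSSES, k) / 2**N_TOSSES

def p_at_least(k):
    # P(a single coin shows at least k heads)
    return sum(p_heads(j) for j in range(k, N_TOSSES + 1))

def p_nu_min(k):
    # P(nu_min = k/10): every coin has >= k heads, but not every coin >= k+1
    return p_at_least(k)**N_COINS - p_at_least(k + 1)**N_COINS

print(p_nu_min(0))  # ~0.623576
print(p_nu_min(1))  # ~0.376403
print(p_nu_min(2))  # ~0.00002
e_nu_min = sum((k / N_TOSSES) * p_nu_min(k) for k in range(N_TOSSES + 1))
print(e_nu_min)     # ~0.037644
```

The key step is expressing $P(\nu_{\min} = k/10)$ as a difference of two "all coins" probabilities, which is what makes the computation exact rather than a simulation.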
#13
04-18-2012, 06:43 AM
 SamK52 Member Join Date: Apr 2012 Posts: 25

Allow me to format your post:

Quote:
 The law of large numbers states that the average $\nu_{\min}$ is close to $E[\nu_{\min}]$. $E[\nu_{\min}]$ can be calculated directly for this experiment: $P(\nu_{\min}=0)=0.623576$, $P(\nu_{\min}=0.1)=0.376403$, $P(\nu_{\min}=0.2)=0.00002$, and $P(\nu_{\min}\ge 0.3)\approx 0$ for the purposes of calculating the mean. Therefore, $E[\nu_{\min}]\approx 0.037644$, and the average proportion of heads for $c_{\min}$ should be close to this number.
#14
04-18-2012, 11:26 AM
 elkka Invited Guest Join Date: Apr 2012 Posts: 57

Quote:
 Originally Posted by SamK52 Allow me to format your post:
Please, allow me to ask how you did it?
#15
04-18-2012, 01:57 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477

Quote:
 Originally Posted by elkka Please, allow me to ask how you did it?
http://book.caltech.edu/bookforum/sh...77&postcount=1
__________________
Where everyone thinks alike, no one thinks very much
#16
04-18-2012, 01:58 PM
 rohanag Invited Guest Join Date: Apr 2012 Posts: 94

How did you calculate those probability values ($\nu_{\min} = 0, 0.1, 0.2$)?
#17
04-19-2012, 04:10 AM
 elkka Invited Guest Join Date: Apr 2012 Posts: 57

Thank you, Professor.

This is how I calculated the probabilities. Let $X_i$ be the number of heads for the $i$-th coin (out of 1000), each tossed 10 times. Then, as the Professor has shown previously,

$P(X_i = 0) = \left(\tfrac{1}{2}\right)^{10} = \tfrac{1}{1024}$.

Now, $\nu_{\min} = 0$ if and only if at least one coin shows 0 heads. Therefore,

$P(\nu_{\min} = 0) = 1 - P(\text{all } X_i \ge 1) = 1 - \left(1 - \tfrac{1}{1024}\right)^{1000} \approx 0.623576$.

Next,

$P(\nu_{\min} = 0.1) = P(\text{all } X_i \ge 1) - P(\text{all } X_i \ge 2) = \left(1 - \tfrac{1}{1024}\right)^{1000} - \left(1 - \tfrac{11}{1024}\right)^{1000} \approx 0.376403$.

Next,

$P(\nu_{\min} = 0.2) = P(\text{all } X_i \ge 2) - P(\text{all } X_i \ge 3) = \left(1 - \tfrac{11}{1024}\right)^{1000} - \left(1 - \tfrac{56}{1024}\right)^{1000} \approx 0.00002$.

The rest can be calculated directly too, but they are essentially 0 for the purpose of calculating the mean.
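
On the question in post #11: a small Monte Carlo sketch of the homework experiment (my own code, with an assumed number of runs), whose average $\nu_{\min}$ should indeed land near 0.037644 by the law of large numbers:

```python
import random

def run_experiment(n_coins=1000, n_tosses=10):
    # Toss each coin n_tosses times; getrandbits gives 10 fair coin flips
    # at once, and counting the "1" bits counts the heads.
    min_heads = min(bin(random.getrandbits(n_tosses)).count("1")
                    for _ in range(n_coins))
    return min_heads / n_tosses  # nu_min for this run

random.seed(0)  # assumed seed, for reproducibility
runs = 1000     # assumed run count
avg = sum(run_experiment() for _ in range(runs)) / runs
print(avg)      # should be close to E[nu_min] ~ 0.0376
```

With 1000 runs the standard error of the average is roughly 0.0015, so a result within a few thousandths of 0.0376 is expected.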
#18
04-19-2012, 08:44 AM
 rohanag Invited Guest Join Date: Apr 2012 Posts: 94

thank you for the detailed explanation.
#19
07-13-2012, 05:55 PM
 nyxee Junior Member Join Date: Jul 2012 Posts: 1

The answer is very clear, but how do we know when to use this approach and when not to?

To clarify my question: if, say, $P(\text{ten heads}) = p$ and $P(\text{not ten heads}) = q = 1 - p$, why does using $p^{1000}$ give the wrong answer?
#20
07-15-2012, 04:56 AM
 htlin NTU Join Date: Aug 2009 Location: Taipei, Taiwan Posts: 601

Quote:
 Originally Posted by nyxee The answer is very clear, but how do we know when to use this approach and when not to? To clarify my question: if, say, $P(\text{ten heads}) = p$ and $P(\text{not ten heads}) = q = 1 - p$, why does using $p^{1000}$ give the wrong answer?
$p^{1000}$ means the probability of getting ten heads in each of the 1000 independent random trials. Is that the event you are interested in?
__________________
When one teaches, two learn.
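
To make the contrast concrete (my own illustration, using the "ten heads" event with $p = 2^{-10}$): $p^{1000}$ is the chance that *all* 1000 coins come up ten-heads, while $1 - (1-p)^{1000}$ is the chance that *at least one* does.

```python
from fractions import Fraction

p = Fraction(1, 2**10)            # P(one coin gives ten heads in ten tosses)
all_ten = p**1000                 # every one of the 1000 coins gives ten heads
at_least_one = 1 - (1 - p)**1000  # at least one coin gives ten heads

print(float(at_least_one))                # ~0.6236, matching the thread
print(all_ten == Fraction(1, 2**10000))   # True: astronomically small
```

Using exact rational arithmetic avoids underflow: $p^{1000} = 2^{-10000}$ is far below what a float can represent, yet the two events differ only in replacing "at least one" with "every".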



