LFD Book Forum Am I understanding the analogy correctly?
#1
09-04-2018, 10:17 PM
 huyptruong Junior Member Join Date: Sep 2018 Posts: 1
Am I understanding the analogy correctly?

On page 20, the book connects the bin model to the learning problem. Here's what I understand:
1) Introduce a hypothesis h.
2) We can now compare h(x) and f(x) for every x in X. Think of the bin as completely sealed, so we can't see the colors, even though some data points are red and some are green.
3) Because of h, some probability is introduced into the bin X.
4) Now grab N data points from the bin X and look at them. We know their colors now. If most of them are red, we know we need to fix our hypothesis h, because it is extremely unlikely to see a lot of red if h is very close to f (asserted by Hoeffding's inequality).
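Step 4 can be sketched with a small simulation. This is my own toy setup, not from the book: mu plays the role of the unknown P[h(x) != f(x)] (the fraction of red marbles in the bin), nu is the sample frequency of red, and the names mu, nu, N, eps follow the book's notation.

```python
import math
import random

def sample_nu(mu, N, rng):
    """Draw N marbles from the bin; return the fraction of red ones (nu)."""
    return sum(rng.random() < mu for _ in range(N)) / N

rng = random.Random(0)
mu, N, eps = 0.1, 1000, 0.05  # toy values chosen for illustration

# Empirical probability that |nu - mu| > eps, over many repeated samples.
trials = 2000
bad = sum(abs(sample_nu(mu, N, rng) - mu) > eps for _ in range(trials)) / trials

# Hoeffding's bound: P[|nu - mu| > eps] <= 2 exp(-2 eps^2 N).
bound = 2 * math.exp(-2 * eps ** 2 * N)
print(bad, bound)  # the empirical rate stays below the bound
```

So a sample that is mostly red while mu is small is extremely unlikely, which is exactly why seeing lots of red tells you h is probably far from f.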

This makes a lot of sense to me. However, I also watched video lecture 2, and it seems that my understanding is incorrect. The professor seems to say that the probability is introduced in order to generate the sample data points, not because of the hypothesis h. This is the opposite of my understanding, because I think the probability comes from the fact that we introduce a hypothesis h.

Can someone please help me understand this? Thanks!
#2
09-08-2018, 04:29 PM
 htlin NTU Join Date: Aug 2009 Location: Taipei, Taiwan Posts: 610
Re: Am I understanding the analogy correctly?

The hypothesis introduces the "color" of the marbles. This, together with the sampling (from the bin), defines the whole probability. Something like

P(color) = P(color | marble) * P(marble)

The first term on the RHS is defined by the hypothesis. The second term is defined by sampling.
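This decomposition can be made concrete with a tiny toy example (my own construction, not from the book): a finite input space with a sampling distribution P(x), and a toy target f and hypothesis h that determine each marble's color.

```python
# Toy bin: four marbles with a sampling distribution P(marble).
X = [0, 1, 2, 3]
P = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}   # P(marble), defined by sampling

f = lambda x: x % 2                     # toy target function
h = lambda x: 1 if x >= 2 else 0        # toy hypothesis

# P(red | marble x) is 1 if h(x) != f(x) and 0 otherwise, so
# P(red) = sum over x of P(red | x) * P(x):
p_red = sum(P[x] for x in X if h(x) != f(x))
print(p_red)  # -> 0.5 (marbles 1 and 2 disagree: 0.3 + 0.2)
```

Here h fixes the first term (which marbles are red), and the sampling distribution fixes the second; only together do they determine how likely a drawn marble is to be red.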

Hope this helps.
__________________
When one teaches, two learn.


The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.