LFD Book Forum Example 2.2 (.3) - sample randomness

#1
11-13-2018, 02:33 AM
 Yarduza Junior Member Join Date: Nov 2018 Posts: 1
Example 2.2 (.3) - sample randomness

We stated that for Hoeffding's inequality to be valid, the sample drawn from the "bin" must be random — it is this randomness that makes E_in a fair estimate.
In Example 2.2, part 3 (convex sets, page 44), the N sample points are chosen to lie on the perimeter of a circle (as stated there, we need to choose the N points carefully).

By choosing the N points that way (or in any other careful way), don't we break the randomness of the sample?
Does that mean we can't apply Hoeffding's inequality to this process at all?
#2
11-18-2018, 07:20 PM
 htlin NTU Join Date: Aug 2009 Location: Taipei, Taiwan Posts: 601
Re: Example 2.2 (.3) - sample randomness

The discussion of the number of dichotomies focuses on the worst case: the largest number of dichotomies the hypothesis set can produce on any N points. Then, when data is actually sampled (as Hoeffding requires), the number of dichotomies on that sample is no more than this worst case — which is exactly what the growth function captures. So if we can bound the growth function, we also bound the actual number of dichotomies on the sampled data.
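To make the worst case concrete, here is a small sketch (not from the book; the helper names `in_hull` and `count_dichotomies` are hypothetical). For points placed on a circle, a labeling is realizable by a convex-set hypothesis exactly when no -1 point lies inside the convex hull of the +1 points — and on a circle that never happens, so all 2^N dichotomies are realizable:

```python
import math

def in_hull(p, hull):
    """Return True if point p lies strictly inside the convex polygon `hull`
    (vertices given in counter-clockwise order)."""
    if len(hull) < 3:
        return False  # a single point or a chord contains no other circle point
    for i in range(len(hull)):
        ax, ay = hull[i]
        bx, by = hull[(i + 1) % len(hull)]
        # cross product > 0 means p is strictly to the left of edge a->b
        if (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) <= 0:
            return False
    return True

def count_dichotomies(points):
    """Count the dichotomies on `points` realizable by convex-set hypotheses:
    a labeling works iff no -1 point falls inside the hull of the +1 points."""
    n = len(points)
    count = 0
    for mask in range(2 ** n):
        pos = [points[i] for i in range(n) if mask >> i & 1]
        neg = [points[i] for i in range(n) if not mask >> i & 1]
        # pos keeps the circle's angle order, so it is already a CCW hull
        if all(not in_hull(q, pos) for q in neg):
            count += 1
    return count

N = 8
# N points on the unit circle, sorted by angle (hence CCW order)
circle = [(math.cos(2 * math.pi * k / N), math.sin(2 * math.pi * k / N))
          for k in range(N)]
print(count_dichotomies(circle))  # all 2^8 = 256 dichotomies are realizable
```

So the carefully placed sample attains the worst case m_H(N) = 2^N for convex sets; a randomly drawn sample can only produce this many dichotomies or fewer, which is why bounding the growth function suffices.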

Hope this helps.
__________________
When one teaches, two learn.
