LFD Book Forum Why 2^N Makes Learning Unfeasible
#1
04-21-2013, 07:56 AM
 nkatz Junior Member Join Date: Apr 2013 Posts: 4
Why 2^N Makes Learning Unfeasible

I was going to ask what was wrong with a growth function of m_H(N) = 2^N, but I think I figured it out:
With m_H(N) = 2^N, the right hand side of the modified Hoeffding inequality is really 2 · 2^N · e^(−2ε²N) = 2e^((ln 2 − 2ε²)N), which goes to 0 as N grows only if 2ε² > ln 2, i.e. only if ε > sqrt(ln 2 / 2) ≈ 0.59. A tolerance that large is practically meaningless, since Ein and Eout are both between 0 and 1.
Is this argument correct?
#2
04-21-2013, 08:02 AM
 Michael Reach Senior Member Join Date: Apr 2013 Location: Baltimore, Maryland, USA Posts: 71
Re: Why 2^N Makes Learning Unfeasible

I think that's right. You want to be able to make 2 · m_H(N) · e^(−2ε²N) small.

Note that this doesn't mean that learning is not feasible, only that this inequality won't help you prove that it is. There might be some other way to bound the probability. The professor already hinted that there are sometimes other approaches, based on an "average" growth function that works for "most" cases.
#3
04-21-2013, 08:51 PM
 jlaurentum Member Join Date: Apr 2013 Location: Venezuela Posts: 41
Re: Why 2^N Makes Learning Unfeasible

If m_H(N) = 2^N, then you'd have 2 · 2^N / e^(2ε²N) on the right hand side. Any value of ε for which the exponential denominator is smaller than the 2^N factor in the numerator makes the whole expression bigger than 2, which is meaningless as a probability bound. In this case, any ε smaller than sqrt(ln 2 / 2) ≈ 0.59 makes this upper Hoeffding bound useless. Of course, even 0.59 is a useless value for epsilon, because you want to be able to make epsilon as close to 0 as you want.
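To see the threshold numerically, here is a minimal sketch (the function name is my own) that evaluates the right-hand side 2 · 2^N · e^(−2ε²N) of the modified Hoeffding bound for a few tolerances:

```python
import math

def modified_hoeffding_rhs(N, eps):
    """Right-hand side of the modified Hoeffding bound with m_H(N) = 2^N:
    2 * 2^N * exp(-2 * eps^2 * N), computed in log space to delay overflow."""
    log_bound = math.log(2.0) + N * math.log(2.0) - 2.0 * eps ** 2 * N
    return math.exp(log_bound)

# The exponent is (ln 2 - 2*eps^2) * N, so the bound shrinks as N grows
# only when eps exceeds sqrt(ln 2 / 2), roughly 0.5887.
eps_star = math.sqrt(math.log(2.0) / 2.0)
print(f"critical epsilon: {eps_star:.4f}")
for eps in (0.30, 0.59, 0.80):
    print(f"eps = {eps:.2f}: bound at N = 1000 is "
          f"{modified_hoeffding_rhs(1000, eps):.3e}")
```

Below the critical ε the N·ln 2 term from the 2^N hypotheses overwhelms the −2ε²N term and the "bound" explodes with N; above it, the bound does go to 0, but only for tolerances too coarse to be useful when Ein and Eout live in [0, 1].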
