LFD Book Forum  

#1 | 01-26-2013, 08:50 PM
ctallardc (Junior Member, Join Date: Jan 2013, Posts: 9)
Q5, K dependency on N?

Just to clarify my thoughts on question 5.
k (the break point) is a fixed value for a given hypothesis set (problem).
k does not depend on N; it is bound to N. Correct?

Also, there are two options for the growth function:

1. If there is no break point, then m_{\cal H}(N) = 2^N.
2. If there is a break point k, then m_{\cal H}(N) \le \sum_{i=0}^{k-1} {{N}\choose{i}}, which is polynomial in N of degree k-1.

So a growth function that is polynomial with a degree that varies with N cannot be bounded by \sum_{i=0}^{k-1} {{N}\choose{i}}, because for large values of N the bound would no longer hold.
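
To make my reasoning concrete, here is a quick numerical sketch (with an arbitrarily chosen break point k = 3) showing that something which outgrows every fixed polynomial, for example 2^N itself, eventually exceeds the bound:

Code:
# Illustration only (hypothetical break point k = 3): the bound
# sum_{i=0}^{k-1} C(N, i) grows polynomially in N, so an exponential
# quantity like 2^N eventually exceeds it.
from math import comb

k = 3  # hypothetical break point, just for illustration

for N in [3, 5, 10, 20, 50]:
    bound = sum(comb(N, i) for i in range(k))  # = 1 + N + C(N, 2) for k = 3
    print(f"N={N:3d}  bound={bound:5d}  2^N={2**N}  2^N <= bound? {2**N <= bound}")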

Am I right?
#2 | 01-26-2013, 10:14 PM
yaser (Caltech, Join Date: Aug 2009, Location: Pasadena, California, USA, Posts: 1,477)
Re: Q5, K dependency on N?

Quote:
Originally Posted by ctallardc
k (the break point) is a fixed value for a given hypothesis set (problem).
k does not depend on N; it is bound to N. Correct?
You are correct, but I am not sure what you mean by the last part. The bound that k creates is on the growth function m_{\cal H}(N).

Quote:
Also, there are two options for the growth function:

1. If there is no break point, then m_{\cal H}(N) = 2^N.
2. If there is a break point k, then m_{\cal H}(N) \le \sum_{i=0}^{k-1} {{N}\choose{i}}, which is polynomial in N of degree k-1.
Correct.
__________________
Where everyone thinks alike, no one thinks very much
#3 | 01-27-2013, 01:49 AM
Suhas Patil (Senior Member, Join Date: Dec 2012, Posts: 57)
Re: Q5, K dependency on N?

I didn't quite get this part. Does large N have an impact on the bound?

Quote:
Originally Posted by ctallardc
So a growth function that is polynomial with a degree that varies with N cannot be bounded by \sum_{i=0}^{k-1} {{N}\choose{i}}, because for large values of N the bound would no longer hold.
#4 | 01-27-2013, 01:55 AM
yaser (Caltech, Join Date: Aug 2009, Location: Pasadena, California, USA, Posts: 1,477)
Re: Q5, K dependency on N?

Quote:
Originally Posted by Suhas Patil
Does large N have an impact on the bound?
The bound is valid for all N, and being polynomial in N means being bounded by some fixed polynomial for all values of N.
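
As an illustration (a quick numerical check with arbitrarily chosen values of k and a finite range of N), one fixed polynomial that works is N^{k-1} + 1; a few lines of code confirm that the bound never exceeds it for the values tested:

Code:
# Sanity check (illustration only): the break-point bound sum_{i=0}^{k-1} C(N, i)
# stays below the fixed polynomial N^(k-1) + 1 for every N tested.
from math import comb

for k in [1, 2, 3, 4, 5]:
    holds = all(
        sum(comb(N, i) for i in range(k)) <= N ** (k - 1) + 1
        for N in range(1, 201)
    )
    print(f"k={k}: bound <= N^{k - 1} + 1 for N = 1..200?  {holds}")
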
__________________
Where everyone thinks alike, no one thinks very much
#5 | 01-27-2013, 01:56 AM
Suhas Patil (Senior Member, Join Date: Dec 2012, Posts: 57)
Re: Q5, K dependency on N?

got it...thank you.
#6 | 01-27-2013, 05:13 PM
gah44 (Invited Guest, Join Date: Jul 2012, Location: Seattle, WA, Posts: 153)
Re: Q5, K dependency on N?

Much of computer science (computational complexity) has to do with algorithms that are exponential (hard) vs. polynomial (not so hard). A favorite example of an exponentially hard problem is the traveling salesman problem. As N increases to infinity, an exponential will eventually be greater than a polynomial, but in real problems N doesn't usually get so large. With three cities to visit, it isn't hard to find an optimal path. For small N, a polynomial can easily beat an exponential.
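
As a throwaway illustration of that last point (using an arbitrarily chosen polynomial, N^3): the exponential 2^N stays below it for small N and only takes over around N = 10.

Code:
# Illustration: 2^N is smaller than the (arbitrary) polynomial N^3 for small N
# and overtakes it permanently around N = 10.
for N in range(2, 13):
    print(f"N={N:2d}  N^3={N**3:5d}  2^N={2**N:5d}  exponential larger? {2**N > N**3}")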

As I understand it here, the problem is different. As long as the growth function is exponential, you can never learn. For coin flips, roulette wheels, and lotteries, no matter how many examples you have, they don't help in predicting what comes next.

If there is a break point, then past samples have useful predictive value. The higher the breakpoint, the more samples you need to make good predictions.

Blackjack with a finite number of cards has a breakpoint. Predictions on future cards become better as you see more cards played. Even so, the polynomial solution stays close to the exponential for quite a while.

At least that is the way I understand it.
#7 | 01-28-2013, 08:20 AM
palmipede (Member, Join Date: Jan 2013, Posts: 13)
Re: Q5, K dependency on N?

Speaking of computational complexity, some hard problems like Boolean satisfiability (e.g., 3-SAT) have been observed to have easy solution times on average random instances, but there is no practical upper bound on the solution times of the worst cases.

I wonder what that means for learning. Are there known cases of phase transitions in learning time?
#8 | 01-28-2013, 12:45 PM
yaser (Caltech, Join Date: Aug 2009, Location: Pasadena, California, USA, Posts: 1,477)
Re: Q5, K dependency on N?

Quote:
Originally Posted by palmipede
Speaking of computational complexity, some hard problems like Boolean satisfiability (e.g., 3-SAT) have been observed to have easy solution times on average random instances, but there is no practical upper bound on the solution times of the worst cases.

I wonder what that means for learning. Are there known cases of phase transitions in learning time?
An example in machine learning is that finding the global minimum in neural networks has been shown to be NP-hard, but simple algorithms like stochastic gradient descent (leading to backpropagation) with some heuristics work fine in practice to find good local minima.
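
Here is a minimal toy sketch (arbitrary target function, network size, and learning rate, not code from the course) of stochastic gradient descent with backpropagation on a one-hidden-layer network; despite the non-convex error surface, it typically settles into a good local minimum:

Code:
# Toy sketch (illustration only): per-example SGD with backpropagation
# on a one-hidden-layer tanh network fitting an arbitrary target.
import numpy as np

rng = np.random.default_rng(0)
H = 10                                    # hidden units (arbitrary)
X = rng.uniform(-1, 1, size=200)          # toy inputs
Y = np.sin(np.pi * X)                     # toy target to fit

# small random initialization
W1, b1 = rng.normal(0, 0.5, H), np.zeros(H)
w2, b2 = rng.normal(0, 0.5, H), 0.0
lr = 0.05                                 # arbitrary learning rate

for epoch in range(200):
    for n in rng.permutation(len(X)):     # one example at a time: "stochastic"
        x, y = X[n], Y[n]
        a = np.tanh(W1 * x + b1)          # forward pass
        y_hat = w2 @ a + b2
        e = y_hat - y                     # backward pass (chain rule)
        grad_w2, grad_b2 = e * a, e
        delta = e * w2 * (1 - a ** 2)
        grad_W1, grad_b1 = delta * x, delta
        W1 -= lr * grad_W1; b1 -= lr * grad_b1
        w2 -= lr * grad_w2; b2 -= lr * grad_b2

mse = np.mean((np.array([w2 @ np.tanh(W1 * x + b1) + b2 for x in X]) - Y) ** 2)
print(f"in-sample MSE after SGD: {mse:.4f}")  # typically a good local minimum
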
__________________
Where everyone thinks alike, no one thinks very much