LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 3 (http://book.caltech.edu/bookforum/forumdisplay.php?f=132)
-   -   Q5, K dependency on N? (http://book.caltech.edu/bookforum/showthread.php?t=3919)

 ctallardc 01-26-2013 07:50 PM

Q5, K dependency on N?

Just to clarify my thoughts on question 5.
The break point k is a fixed value for a given hypothesis set (problem).
k does not depend on N; it is a bound with respect to N. Correct?

Also, there are two options for the growth function:

1- if there is no break point, then m_H(N) = 2^N
2- if there is a break point k, then m_H(N) is bounded by a polynomial in N of degree k-1

So a growth function that is polynomial with a variable degree depending on N could not be bounded by a fixed polynomial, because for large values of N the bound would not be valid.

Am I right?

 yaser 01-26-2013 09:14 PM

Re: Q5, K dependency on N?

Quote:
 Originally Posted by ctallardc (Post 9015) The break point k is a fixed value for a given hypothesis set (problem). k does not depend on N; it is a bound with respect to N. Correct?
You are correct, but I am not sure what you mean by the last part. The bound that the break point creates is on the growth function m_H(N).

Quote:
 Also, there are two options for the growth function: 1- if there is no break point, then m_H(N) = 2^N; 2- if there is a break point k, then m_H(N) is bounded by a polynomial in N of degree k-1
Correct.

 Suhas Patil 01-27-2013 12:49 AM

Re: Q5, K dependency on N?

I didn't quite get this part. Does large N have an impact on the bound?

Quote:
 Originally Posted by ctallardc (Post 9015) So a growth function that is polynomial with a variable degree depending on N could not be bounded by a fixed polynomial, because for large values of N the bound would not be valid.

 yaser 01-27-2013 12:55 AM

Re: Q5, K dependency on N?

Quote:
 Originally Posted by Suhas Patil (Post 9022) Does large N have an impact on the bound?
The bound is valid for all N, and being polynomial in N means being bounded by some fixed polynomial for all values of N.
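To see this numerically, here is a quick sketch (not from the thread) that evaluates the combinatorial bound B(N, k) = sum of C(N, i) for i = 0 to k-1, which is polynomial in N of degree k-1, against the unrestricted 2^N:

```python
from math import comb

def growth_bound(N, k):
    # B(N, k) = sum_{i=0}^{k-1} C(N, i): upper bound on the growth
    # function when k is a break point; polynomial of degree k-1 in N.
    return sum(comb(N, i) for i in range(k))

# Compare the polynomial bound (break point k = 3) with 2^N.
for N in (5, 10, 20, 40):
    print(N, growth_bound(N, 3), 2 ** N)
```

For a break point of 3 the bound grows like N^2, so it falls ever further behind 2^N as N increases.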

 Suhas Patil 01-27-2013 12:56 AM

Re: Q5, K dependency on N?

got it...thank you.

 gah44 01-27-2013 04:13 PM

Re: Q5, K dependency on N?

Much of computer science (computational complexity) has to do with algorithms that are exponential (hard) vs. polynomial (not so hard). A favorite example of an exponentially hard problem is the traveling salesman. As N increases to infinity, an exponential will eventually be greater than a polynomial, but in real problems N doesn't usually get so large. With three cities to visit, it isn't hard to find an optimal path. For small N, a polynomial can easily beat an exponential.
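As a quick numerical illustration of that point (a sketch, not from the thread), the snippet below finds how long a polynomial such as N**10 stays at or above 2**N before the exponential takes over for good:

```python
# For small N a polynomial like N**10 can dominate 2**N, even though
# the exponential always wins in the end.  Find the largest N (searching
# up to 1000) for which the polynomial is still at least as big.
def last_poly_win(degree, limit=1000):
    return max(N for N in range(1, limit) if N ** degree >= 2 ** N)

print(last_poly_win(10))  # the exponential wins for every N past this point
```

Even a degree-10 polynomial holds its own well into the dozens of "cities", which matches the observation that small instances of exponentially hard problems are easy.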

As I understand it here, the problem is different. As long as the growth function is exponential, you can never learn. For coin flips, roulette wheels, and lotteries, no matter how many examples you have, they don't help in predicting what comes next.

If there is a break point, then past samples have useful predictive value. The higher the breakpoint, the more samples you need to make good predictions.

Blackjack with a finite number of cards has a breakpoint. Predictions on future cards become better as you see more cards played. Even so, the polynomial solution stays close to the exponential for quite a while.

At least that is the way I understand it.

 palmipede 01-28-2013 07:20 AM

Re: Q5, K dependency on N?

Speaking of computational complexity, some hard problems like boolean satisfiability (e.g. 3-SAT) have been observed to have easy solution times for average random instances, but there is no practical upper bound on the solution times of the worst cases.

I wonder what that means for learning? Are there known cases of phase transitions in learning time?

 yaser 01-28-2013 11:45 AM

Re: Q5, K dependency on N?

Quote:
 Originally Posted by palmipede (Post 9033) Speaking of computational complexity, some hard problems like boolean satisfiability (e.g. 3-SAT) have been observed to have easy solution times for average random instances, but there is no practical upper bound on the solution times of the worst cases. I wonder what that means for learning? Are there known cases of phase transitions in learning time?
An example in machine learning is that finding the global minimum in neural networks has been shown to be NP-hard, but simple algorithms like stochastic gradient descent (leading to backpropagation) with some heuristics work fine in practice to find good local minima.
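For readers new to the idea, here is a toy stochastic gradient descent sketch (a hypothetical illustration, not course code): it fits a one-dimensional linear model to data generated from y = 2x + 1 by updating the weights one sample at a time.

```python
import random

# Toy SGD on a 1-D linear fit: learn w, b so that w*x + b matches
# targets generated from y = 2x + 1 (hypothetical example data).
random.seed(0)
data = [(x, 2 * x + 1) for x in [i / 10 for i in range(-10, 11)]]

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(200):
    random.shuffle(data)           # "stochastic": visit samples in random order
    for x, y in data:
        err = (w * x + b) - y      # derivative of 0.5*err**2 w.r.t. the output
        w -= lr * err * x          # gradient step for the weight
        b -= lr * err              # gradient step for the bias

print(round(w, 2), round(b, 2))
```

The squared loss here is convex, so SGD finds the global minimum; in a real neural network the same update rule only promises a good local minimum, which is the point of the post above.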
