LFD Book Forum Q10 higher bound

#1
04-24-2013, 01:07 PM
 OlivierB Member Join Date: Apr 2013 Location: Paris Posts: 16
Q10 higher bound

I think I have found the tighter lower bound.
But I cannot determine the tighter upper bound among the two candidates.

From the tighter lower bound that I have found and the information in the question, it seems possible to determine the tighter upper bound.

But I would like to either properly admit or dismiss the K-1 factor.
Can anybody more advanced on the subject propose an indication, a clue, a line of reasoning? (NOT the answer)

Thx
#2
04-24-2013, 05:20 PM
 Michael Reach Senior Member Join Date: Apr 2013 Location: Baltimore, Maryland, USA Posts: 71
Re: Q10 higher bound

One helpful step is to check the problems in the book.
#3
04-24-2013, 06:03 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,478
Re: Q10 higher bound

Quote:
 Originally Posted by OlivierB Can anybody more advanced on the subject propose an indication, a clue, a line of reasoning?
Let me just comment that this problem is probably the hardest in the entire course. A discussion from the participants is most welcome.
__________________
Where everyone thinks alike, no one thinks very much
#4
04-27-2013, 02:35 PM
 alasdairj Member Join Date: Mar 2013 Posts: 12
Re: Q10 higher bound

In 2d, a positive "2d ray" in the x direction can shatter 1 point, and a positive "2d ray" in the y direction can shatter 1 point, whereas their union can shatter 2 points...

So at least I can feel comfortable that the union of hypothesis sets *can* achieve a VC dimension equal to the sum of the VC dimensions of its parts - but can it exceed that sum?
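This example is easy to verify by brute force (a sketch of mine, not from the thread; `x_ray`/`y_ray` are hypothetical names for positive rays along each axis, and the threshold grid is just fine enough for the two integer points used):

```python
def x_ray(a):
    # positive ray along x: +1 iff the x-coordinate reaches the threshold a
    return lambda p: 1 if p[0] >= a else -1

def y_ray(a):
    # positive ray along y
    return lambda p: 1 if p[1] >= a else -1

def dichotomies(hyps, pts):
    # all distinct labelings of pts realizable by hypotheses in hyps
    return {tuple(h(p) for p in pts) for h in hyps}

thresholds = [-0.5, 0.5, 1.5]   # fine enough for the integer points below
pts = [(0, 1), (1, 0)]

H1 = [x_ray(a) for a in thresholds]
H2 = [y_ray(a) for a in thresholds]

print(len(dichotomies(H1, pts)))       # 3: x-rays miss (+1, -1) on these points
print(len(dichotomies(H2, pts)))       # 3: y-rays miss (-1, +1)
print(len(dichotomies(H1 + H2, pts)))  # 4 = 2^2: the union shatters both points
```

Each set alone realizes only 3 of the 4 dichotomies on these two points (so each shatters only 1 point), while the union realizes all 4.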
#5
04-28-2013, 11:12 AM
 marek Member Join Date: Apr 2013 Posts: 31
Re: Q10 higher bound

Quote:
 Originally Posted by alasdairj In 2d, a positive "2d ray" in the x direction can shatter 1 point, and a positive "2d ray" in the y direction can shatter 1 point, whereas their union can shatter 2 points... So at least I can feel comfortable that the union of hypothesis sets *can* achieve a VC dimension which is the sum of the VC dimensions of its parts, but can it exceed it??
I am struggling with this question as well. Based on this discussion, I can intuit what the right answer should be, but I am struggling with the justification.

I figure if I can simply come up with an H1 and H2 such that d(H1 U H2) = d(H1) + d(H2) + 1, I would be done. I can get 1 short of that (much like your example; even consider 1d positive and negative rays). But I cannot think of an example that would hit the higher bound.

That being said, I decided to just play around with the growth function. We know that m_H(N) ≤ Σ_{i=0}^{d_vc} C(N,i). If all the terms in that expansion are present, we get 2^N and hence can shatter. The binomial expansion is symmetric (C(N,i) = C(N,N-i)), so if I had two growth functions that were "disjoint", one could get the terms from the bottom and the other could get the terms from the top, and these could together give the full expansion.

More formally: m_{H1}(N) + m_{H2}(N) ≤ Σ_{i=0}^{d1} C(N,i) + Σ_{i=0}^{d2} C(N,i) = Σ_{i=0}^{d1} C(N,i) + Σ_{i=N-d2}^{N} C(N,i). This gives the full expansion if the second sum starts off one term after the first sum ends, specifically when N - d2 = d1 + 1, i.e. N = d1 + d2 + 1.

This simple analysis gives me my desired d(H1 U H2) = d1 + d2 + 1. But clearly something is missing. How can growth functions be "disjoint"? Any example I come up with, there is a significant overlap in dichotomies. I feel like I'm definitely on the right track here, but I also feel like something is clearly missing. Maybe B(N,k) is a better starting point than the growth function?

Hopefully someone can pick up the ball and carry it over the finish line...
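The tiling of the binomial expansion described above can be checked numerically (a quick sketch of mine, not from the thread; `d1`, `d2` are arbitrary picks):

```python
from math import comb

def sauer(N, d):
    # Sauer bound on the growth function: sum_{i=0}^{d} C(N, i)
    return sum(comb(N, i) for i in range(d + 1))

d1, d2 = 3, 2
N = d1 + d2 + 1   # = 6

low = sauer(N, d1)                                     # terms from the bottom
high = sum(comb(N, i) for i in range(N - d2, N + 1))   # terms from the top
print(low, high, low + high, 2 ** N)                   # 42 22 64 64
```

By symmetry `high` equals `sauer(N, d2)`, and at N = d1 + d2 + 1 the two sums cover indices 0..d1 and d1+1..N exactly once each, so they add up to 2^N.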
#6
04-28-2013, 02:20 PM
 OlivierB Member Join Date: Apr 2013 Location: Paris Posts: 16
Re: Q10 higher bound

@marek, Thanks for your contribution. Let me try and rephrase, not adding much.

For a hypothesis set H with VC dimension d, we have m_H(N) ≤ B(N, d+1) = Σ_{i=0}^{d} C(N,i)

The equality is the result of 2 inequalities in opposite directions. In class we saw '≤', and '≥' is the subject of Problem 2.4 in the book.

Let H1 and H2 be 2 hypothesis sets with VC dimensions d1 and d2 respectively.

We can construct two disjoint sets H1' = H1 and H2' = H2 \ H1, by putting the intersection of both into one of them only.

The VC dimension d1' of H1' is d1.
The VC dimension d2' of H2' is at most d2.

Let H be the union of these hypothesis sets: H = H1' ∪ H2' = H1 ∪ H2.

Let N = d1 + d2 + 1.

Now in order to try and determine the VC dimension of H, let us compute an upper bound of m_H(N):

m_H(N) ≤ m_{H1'}(N) + m_{H2'}(N) ≤ Σ_{i=0}^{d1'} C(N,i) + Σ_{i=0}^{d2'} C(N,i) ≤ Σ_{i=0}^{d1} C(N,i) + Σ_{i=0}^{d2} C(N,i)

which we can simplify, using C(N,i) = C(N,N-i) and N = d1 + d2 + 1:

Σ_{i=0}^{d1} C(N,i) + Σ_{i=N-d2}^{N} C(N,i) = Σ_{i=0}^{N} C(N,i) = 2^N

For m_H(N) = 2^N to be true, i.e. for H to shatter N points, all inequalities must be equalities.
In other words, we must have:
1/ The growth functions of H1' and H2' must be exactly Σ_{i=0}^{d1} C(N,i) and Σ_{i=0}^{d2} C(N,i)
2/ Removing the intersection of H1 and H2 from H2 does not decrease the VC dimension (d2' = d2). If the intersection is empty then this condition holds.

If these 2 requirements are not contradictory (which seems plausible, but I can neither prove it nor visualize it with an example), then the VC dimension of H1 ∪ H2 is at least d1 + d2 + 1.

Now unless there is a mistake in the reasoning, the question is: can these requirements be met? Ideally via an example, because beyond the abstract equations, I would really like to 'visualize' a case where these inequalities are 'saturated'.
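A complementary numeric check (my addition, not from the thread): at N = d1 + d2 + 2 the two Sauer sums cover indices 0..d1 and, by symmetry, d1+2..N, always missing the C(N, d1+1) term, so the combined bound falls short of 2^N and the union can never shatter d1 + d2 + 2 points:

```python
from math import comb

def bound(N, d1, d2):
    # m_{H1}(N) + m_{H2}(N) <= sum_{i<=d1} C(N,i) + sum_{i<=d2} C(N,i)
    return (sum(comb(N, i) for i in range(d1 + 1))
            + sum(comb(N, i) for i in range(d2 + 1)))

for d1 in range(4):
    for d2 in range(4):
        N = d1 + d2 + 2
        # the union cannot shatter d1 + d2 + 2 points
        assert bound(N, d1, d2) == 2 ** N - comb(N, d1 + 1)
print("bound = 2^N - C(N, d1+1) < 2^N at N = d1 + d2 + 2 for all cases checked")
```

Together with the discussion above, this pins the VC dimension of the union between d1 + d2 (achievable) and d1 + d2 + 1.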
#7
04-28-2013, 05:08 PM
 marek Member Join Date: Apr 2013 Posts: 31
Re: Q10 higher bound

I am still confused on how to piece everything together. I like your much cleaner version, but I do have one comment. Having disjoint hypothesis sets does not necessarily mean that the set of dichotomies they create will also be disjoint.

For example, let H1 and H2 be the positive and negative 1d rays, respectively. These two hypothesis sets are disjoint. However, given any data set they can both create the dichotomies of all +1 or all -1. They won't have much overlap beyond that (and maybe THAT is the point), but they won't be entirely disjoint as far as our inequalities are concerned.
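This overlap can be counted by brute force (my sketch, not from the thread; thresholds are placed between the sorted points, plus one on each side):

```python
def ray_dichotomies(xs, positive=True):
    # all dichotomies realizable on xs by positive rays (+1 iff x >= t)
    # or negative rays (+1 iff x <= t)
    xs = sorted(xs)
    thresholds = ([xs[0] - 1]
                  + [(a + b) / 2 for a, b in zip(xs, xs[1:])]
                  + [xs[-1] + 1])
    out = set()
    for t in thresholds:
        if positive:
            out.add(tuple(1 if x >= t else -1 for x in xs))
        else:
            out.add(tuple(1 if x <= t else -1 for x in xs))
    return out

xs = [0.0, 1.0, 2.0, 3.0]
pos = ray_dichotomies(xs, positive=True)
neg = ray_dichotomies(xs, positive=False)
print(len(pos), len(neg), len(pos & neg), len(pos | neg))  # 5 5 2 8
```

On N = 4 points each set yields N + 1 = 5 dichotomies, and the overlap is exactly the two constant dichotomies (all +1 and all -1), so the union yields 2N = 8.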
#8
04-28-2013, 08:56 PM
 MindExodus Junior Member Join Date: Apr 2013 Posts: 3
Re: Q10 higher bound

Your requirement 2 can be simply derived from Q9, which says the VC dimension of an intersection of hypothesis sets cannot exceed that of any one of them (hope I got Q9 right).

As for requirement 1, I come up with the case:
H1 := mapping all points to +1
H2 := mapping all points to -1
thus d(H1) = d(H2) = 0, while H1 ∪ H2 shatters any single point, giving d(H1 ∪ H2) = 1 = d(H1) + d(H2) + 1.
#9
04-30-2013, 01:10 PM
 TTHotShot Junior Member Join Date: Apr 2013 Posts: 3
Re: Q10 higher bound

Quote:
 Originally Posted by OlivierB
Great work everyone - this question had me scratching my head until I found this post. The one thing I don't understand is the statement quoted above. Am I missing something? It seems like it should be an inequality.

