LFD Book Forum Q10 higher bound

#11
04-29-2013, 03:06 AM
 MindExodus Junior Member Join Date: Apr 2013 Posts: 3
Re: Q10 higher bound

Sorry, I didn't notice that requirement,
but the key to this problem is the same:

let me just put another example:

just think of H1 and H2 on 3 points:

H1: every labeling that maps at most 1 of the 3 points to +1 (and the rest to -1)

H2: every labeling that maps at most 1 of the 3 points to -1 (and the rest to +1)

thus: d_VC(H1) = 1 and d_VC(H2) = 1,
and H1 ∪ H2 contains every dichotomy on the 3 points, so it can shatter all 3 points
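This kind of construction can be checked by brute force. A Python sketch (my own helper names; following the later generalization, I read H1 as the labelings of three points with at most one +1, and H2 as those with at most one -1):

```python
from itertools import combinations, product

def shatters(H, S):
    """True if the restrictions of H to index set S realize all 2^|S| dichotomies."""
    return len({tuple(h[i] for i in S) for h in H}) == 2 ** len(S)

def vc_dim(H, n):
    """Largest k such that some k-point subset of {0, ..., n-1} is shattered by H."""
    d = 0
    for k in range(1, n + 1):
        if any(shatters(H, S) for S in combinations(range(n), k)):
            d = k
    return d

# H1: at most one of the 3 points labeled +1; H2: at most one labeled -1
H1 = [h for h in product([-1, +1], repeat=3) if h.count(+1) <= 1]
H2 = [h for h in product([-1, +1], repeat=3) if h.count(-1) <= 1]

print(vc_dim(H1, 3), vc_dim(H2, 3), vc_dim(H1 + H2, 3))  # → 1 1 3
```

The union contains all 8 dichotomies on the 3 points, so its VC dimension is 3 even though each piece has VC dimension 1.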

Quote:
 Originally Posted by jforbes Bad news: the problem requires that each hypothesis set has a finite positive VC dimension, so this particular example does not guarantee that the maximum is greater than the sum of the VC dimension for the case we're being asked about. I'm still on the fence for this question - my first attempts at constructing an example similar to this where each hypothesis set had a VC dimension of 1 failed to shatter 3 points, but I'm not 100% sure yet that I can't get it to work.
#12
04-29-2013, 07:50 AM
 nkatz Junior Member Join Date: Apr 2013 Posts: 4
Re: Q10 higher bound

Here is another example for the upper bound:

* H1 takes any 1 point and sets it to + or -, with the other points--if any--getting the opposite sign
* H2 sets all to + or all to -

H1 can only shatter 1 point, and H2 can only shatter 1 point.
Their union, however, can shatter 3 points.
#13
04-29-2013, 04:44 PM
 MindExodus Junior Member Join Date: Apr 2013 Posts: 3
Re: Q10 higher bound

And this method can be generalized!

for H1 with VC dimension d1 and H2 with VC dimension d2, take N = d1 + d2 + 1 points:

H1: hypotheses over the N points
mapping at most d1 of the points to +1 and the other points to -1

H2: hypotheses over the N points
mapping at most d2 of the points to -1 and the other points to +1,
so each maps at least d1 + 1 points to +1

easy to prove that d_VC(H1) = d1 and d_VC(H2) = d2
since every dichotomy on the N points maps either at most d1 points to +1 (so it is in H1) or at most d2 points to -1 (so it is in H2),
so H1 ∪ H2 can shatter all d1 + d2 + 1 points

By a proper extension of the same idea, the construction reaches the upper bound K - 1 + sum of the d_VC(Hk) in the case of K hypothesis sets.
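The general two-set construction can be verified by brute force for small VC dimensions. A Python sketch (the helper functions and the `check` wrapper are my own):

```python
from itertools import combinations, product

def shatters(H, S):
    """True if the restrictions of H to index set S realize all 2^|S| dichotomies."""
    return len({tuple(h[i] for i in S) for h in H}) == 2 ** len(S)

def vc_dim(H, n):
    """Largest k such that some k-point subset of {0, ..., n-1} is shattered by H."""
    d = 0
    for k in range(1, n + 1):
        if any(shatters(H, S) for S in combinations(range(n), k)):
            d = k
    return d

def check(d1, d2):
    n = d1 + d2 + 1
    # H1: at most d1 points labeled +1; H2: at most d2 points labeled -1
    H1 = [h for h in product([-1, +1], repeat=n) if h.count(+1) <= d1]
    H2 = [h for h in product([-1, +1], repeat=n) if h.count(-1) <= d2]
    return vc_dim(H1, n), vc_dim(H2, n), vc_dim(H1 + H2, n)

print(check(2, 1))  # → (2, 1, 4): the union shatters all d1 + d2 + 1 points
```

Every dichotomy on the n points has either at most d1 points labeled +1 (so it lies in H1) or at most d2 labeled -1 (so it lies in H2), which is why the union always shatters the full set.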

Quote:
 Originally Posted by MindExodus sry about didn't notice that requirement but the key to this problem is the same: let me just put another example: just think of H1 and H2 on H1: H2: thus: , and can shatter all 3 point
#14
04-29-2013, 05:04 PM
 Elroch Invited Guest Join Date: Mar 2013 Posts: 143
Re: Q10 higher bound

Simpler?

#15
04-29-2013, 07:19 PM
 jforbes Member Join Date: Apr 2013 Posts: 12
Re: Q10 higher bound

Quote:
 Originally Posted by MindExodus thus: , and can shatter all 3 point
MindExodus - very nice; I'm thoroughly convinced.

Quote:
 Originally Posted by nkatz Here is another example for the upper bound: * H1 takes any 1 point and sets it to + or -, with the other points--if any--getting the opposite sign * H2 sets all to + or all to - H1 can only shatter 1 point, and H2 can only shatter 1 point Their union however can shatter 3 points
nkatz, your example has me slightly puzzled. The only way H1 can fail to shatter 2 points is if there are exactly two points:
H1 contains +- and -+, but not -- or ++
Whereas if you give it three points, it contains
+--,
-+-,
--+,
++-,
+-+, and
-++.
Just looking at the first two points, now H1 contains all 4 combinations. How can this be if its VC dimension is only 1? I think it must be the case that d_VC(H1) ≥ 2, but I'd be unsurprised if I were missing something.
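jforbes's observation can be confirmed mechanically. Reading nkatz's H1 literally over three points (pick one point, give it + or -, give the others the opposite sign), its restriction to the first two points already contains all four dichotomies (a sketch; the enumeration is my own reading of the description):

```python
from itertools import product

# nkatz's H1 read literally over 3 points: pick one point, give it + or -,
# and give the other two points the opposite sign
H1 = []
for i in range(3):
    for s in (+1, -1):
        H1.append(tuple(s if j == i else -s for j in range(3)))

# Restriction of H1 to the first two points
restriction = {(h[0], h[1]) for h in H1}
print(sorted(restriction))  # all 4 patterns appear → H1 shatters 2 points
```

So with three or more points available, this H1 has VC dimension at least 2, which is the tension jforbes points out.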
#16
04-30-2013, 03:13 AM
 Elroch Invited Guest Join Date: Mar 2013 Posts: 143
Re: Q10 higher bound

A useful little construction is this.

If you have two underlying disjoint sets of points X1 and X2, and hypothesis sets H1 ⊆ 2^X1 and H2 ⊆ 2^X2 (where 2^X is the set of subsets of X), then you can form a new hypothesis set on X1 ∪ X2 with one hypothesis for each pair of hypotheses in H1 and H2 (in the obvious way). There is a simple relationship between the VC dimensions of these three hypothesis sets, which can be used in examples.
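Elroch leaves the relationship unstated; the natural candidate is additivity, i.e. the combined set's VC dimension is d_VC(H1) + d_VC(H2), and a brute-force check on a small case agrees (a sketch with my own helper functions and example sets):

```python
from itertools import combinations, product

def shatters(H, S):
    """True if the restrictions of H to index set S realize all 2^|S| dichotomies."""
    return len({tuple(h[i] for i in S) for h in H}) == 2 ** len(S)

def vc_dim(H, n):
    """Largest k such that some k-point subset of {0, ..., n-1} is shattered by H."""
    d = 0
    for k in range(1, n + 1):
        if any(shatters(H, S) for S in combinations(range(n), k)):
            d = k
    return d

# X1 = {0, 1}: H1 labels at most one point +1  -> VC dimension 1
# X2 = {2, 3}: H2 contains every labeling      -> VC dimension 2
H1 = [h for h in product([-1, +1], repeat=2) if h.count(+1) <= 1]
H2 = list(product([-1, +1], repeat=2))

# One combined hypothesis on X1 ∪ X2 per pair (h1, h2)
H = [h1 + h2 for h1, h2 in product(H1, H2)]

print(vc_dim(H1, 2), vc_dim(H2, 2), vc_dim(H, 4))  # → 1 2 3
```

Intuitively, shattered sets in X1 and X2 combine freely (the two halves are labeled independently), while any larger set must overload one of the two pieces.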
#17
04-30-2013, 03:20 AM
 Elroch Invited Guest Join Date: Mar 2013 Posts: 143
Re: Q10 higher bound

Quote:
 Originally Posted by jforbes MindExodus - very nice; I'm thoroughly convinced. nkatz, your example has me slightly puzzled. The only way H1 can fail to shatter 2 points is if there are exactly two points: H1 contains +- and -+, but not -- or ++ Whereas if you give it three points, it contains +--, -+-, --+, ++-, +-+, and -++. Just looking at the first two points, now H1 contains all 4 combinations. How can this be if its VC dimension is only 1? I think it must be the case that , but I'd be unsurprised if I were missing something.
I presume he meant that only one specific point was being referred to, so H1 is just {+--, -++} and nothing more
#18
04-30-2013, 06:14 AM
 Michael Reach Senior Member Join Date: Apr 2013 Location: Baltimore, Maryland, USA Posts: 71
Re: Q10 higher bound

Good work on this problem by several of you guys!
#19
04-30-2013, 12:05 PM
 jforbes Member Join Date: Apr 2013 Posts: 12
Re: Q10 higher bound

Quote:
 Originally Posted by Elroch I presume he meant it was only one specific point that was being referred to, so is +-- and -++, nothing more
Wouldn't it then be the case that the union of the hypothesis sets could only shatter two points?

I'm still curious about what went wrong if H1 does in fact include --+, -+-, ... Is this sort of hypothesis set disallowed for some reason?
#20
04-30-2013, 01:10 PM
 TTHotShot Junior Member Join Date: Apr 2013 Posts: 3
Re: Q10 higher bound

Quote:
 Originally Posted by OlivierB
Great work everyone - this question had me scratching my head until I found this post. The one thing I don't understand is the statement quoted above. Am I missing something? It seems like it should be an inequality.
