Help in understanding proof for VC dimension of perceptron.
https://work.caltech.edu/library/072.pdf
I am referring to the slides given in the link above. I have a few questions regarding this proof:

1. Is the matrix invertible because of the way we construct it, namely that it is lower triangular? If that is the case, I don't see why the same construction does not work for d + 2 points, or for any k greater than the dimension of the perceptron.

2. Why does the second part of the proof (d_vc <= d + 1) not work for the case k = d + 1, but only for k = d + 2?

3. I don't understand the statement that more points than dimensions means we must have x_j = \sum_{i \ne j} a_i x_i. (By the way, d + 1 is also more points than the dimension d.)
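To make question 1 concrete, here is a small numeric sketch (my own, not from the slides) of the standard construction: stack d + 1 points (bias coordinate x_0 = 1 first) as rows of a (d + 1) x (d + 1) matrix. The matrix is lower triangular with ones on the diagonal, hence invertible, and solving X w = y realizes any dichotomy y. The value d = 3 is an arbitrary choice for illustration:

```python
import itertools
import numpy as np

d = 3  # perceptron dimension (arbitrary small example)

# d+1 points in R^{d+1} (first coordinate is the bias x_0 = 1):
# x_1 = (1, 0, ..., 0) and x_{i+1} = (1, e_i).  Stacked as rows, the
# matrix is lower triangular with ones on the diagonal -> invertible.
X = np.eye(d + 1)
X[:, 0] = 1.0

assert np.linalg.det(X) != 0  # invertible by construction

# For ANY dichotomy y in {-1,+1}^{d+1}, solving X w = y gives a weight
# vector with sign(X w) = y, so these d+1 points are shattered.
for y in itertools.product([-1.0, 1.0], repeat=d + 1):
    y = np.array(y)
    w = np.linalg.solve(X, y)
    assert np.array_equal(np.sign(X @ w), y)

print("all", 2 ** (d + 1), "dichotomies realized")
```

Note that this only shows d_vc >= d + 1; the construction cannot be extended to d + 2 rows, because a (d + 2) x (d + 1) matrix has more rows than columns and the system X w = y is then overdetermined (no inverse exists).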
Re: Help in understanding proof for VC dimension of perceptron.
I don't understand the second part of the proof either. Why is y_i = sign(w^T x_i) = sign(a_i)?

Re: Help in understanding proof for VC dimension of perceptron.
Context:
To show that d_vc <= d + 1 we need to prove that we cannot shatter any set of d + 2 points. In this case we have d + 2 vectors of dimension d + 1 (each x includes the bias coordinate x_0 = 1), hence we have more vectors than the total number of dimensions (our space is (d + 1)-dimensional). According to the theory of vector spaces, when we have more vectors than dimensions of the space, they are linearly dependent (one of the vectors is a linear combination of the others). This means that, for instance, our extra point x_j can be computed by the equation (in (d + 1)-dimensional space): x_j = \sum_{i \ne j} a_i x_i, where not all coefficients a_i are zero. We need to show that we cannot get 2^{d+2} dichotomies, where the data set is of size d + 2.

Problem: And now let me state the problem I have with understanding the video proof. Why do we correlate the coefficient a_i with the label y_i, while they don't correlate? On what basis can we claim that y_i = sign(w^T x_i) = sign(a_i), and that therefore sign(w^T x_j) = +1, hence we cannot generate y_j = -1?
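The linear-dependence step above can be checked numerically. The following sketch (mine, not from the lecture; d = 3 and the random seed are arbitrary) builds d + 2 points in the (d + 1)-dimensional space, pulls a nonzero null-space combination out of them, and expresses one point x_j as a combination of the others:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

# d+2 random points in R^{d+1}, bias coordinate x_0 = 1 included.
X = np.hstack([np.ones((d + 2, 1)), rng.standard_normal((d + 2, d))])

# d+2 vectors in a (d+1)-dim space are linearly dependent: there is a
# nonzero c with sum_i c_i x_i = 0.  Take c from the null space of X^T,
# spanned by the last right-singular vector of X^T (rank <= d+1 < d+2).
_, _, Vt = np.linalg.svd(X.T)
c = Vt[-1]
assert np.allclose(c @ X, 0)          # sum_i c_i x_i = 0

# Pick a j with c_j != 0 and solve for x_j:
# x_j = sum_{i != j} a_i x_i, where a_i = -c_i / c_j (not all zero).
j = int(np.argmax(np.abs(c)))
a = -c / c[j]
a[j] = 0.0
assert np.allclose(X[j], a @ X)       # x_j recovered from the others
```

This is exactly the x_j = \sum_{i \ne j} a_i x_i used in the proof; nothing here involves the labels yet.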
Re: Help in understanding proof for VC dimension of perceptron.
For any data set of size d + 2, we must have linear dependence, and the question is: with such inevitable linear dependence, can we still find at least one specific data set on which all 2^N dichotomies can be implemented? The video lecture shows that for any data set of size d + 2 there are some dichotomies (specific to that data set) that the perceptron cannot implement; hence no data set of size d + 2 can be shattered by the perceptron hypothesis set. The proof considers a dichotomy (specific to the data set) with the two following properties: each x_i with nonzero a_i gets y_i = sign(a_i); the point x_j gets y_j = -1. Hope this helps.
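A brute-force sanity check of the "some dichotomy cannot be implemented" claim, in the smallest case (my own sketch; the points, seed, and sample count are arbitrary). For d = 1, d + 2 = 3 points on a line, and h(x) = sign(w1*x + w0) can only produce labelings with at most one sign change across the ordered points, so the alternating dichotomy (+, -, +) never appears:

```python
import numpy as np

# Three collinear points: the d = 1 perceptron case, d + 2 = 3.
xs = np.array([0.0, 1.0, 2.0])

rng = np.random.default_rng(1)
patterns = set()
for _ in range(20000):
    w0, w1 = rng.standard_normal(2) * 5
    s = np.sign(w1 * xs + w0)
    if 0 not in s:  # skip the degenerate boundary-on-a-point cases
        patterns.add(tuple(int(v) for v in s))

# A line h(x) = sign(w1*x + w0) is monotone in x, so it can never label
# the ordered points (+, -, +): strictly fewer than 2^3 dichotomies.
assert (1, -1, 1) not in patterns
assert len(patterns) < 8
print(len(patterns), "of 8 dichotomies achievable")
```

Random sampling cannot prove impossibility on its own, but here the geometry guarantees it: a linear function of x changes sign at most once, which is the 1-D shadow of the general d + 2 argument.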
Re: Help in understanding proof for VC dimension of perceptron.
If the perceptron hypothesis set can shatter the data set, then for the same x_i there exist w_1 and w_2 such that sign(w_1^T x_i) = +1 and sign(w_2^T x_i) = -1. It means that if the data set is shatterable by the perceptron hypothesis set, then for any x_i in the data set we can choose the label to be y_i = +1 or y_i = -1 and be sure that there exists at least one hypothesis from the perceptron hypothesis set that can implement the case we have chosen. Instead of choosing y_i = +1 or y_i = -1 explicitly, the proof lets the choice of the value of y_i depend on the value of a_i: y_i = sign(a_i), because whatever the real value of a_i is, sign(a_i) only takes the value +1 or -1, hence y_i = +1 or y_i = -1. In my understanding, this dependence does not make the chosen dichotomy invalid.
Re: Help in understanding proof for VC dimension of perceptron.
If the perceptron hypothesis set can shatter the data set, then: if for every i \ne j we set y_i = sign(a_i), there must exist a w that satisfies, for every i \ne j, sign(w^T x_i) = y_i = sign(a_i). In other words, we first fix y_i = sign(a_i) for every i \ne j; then we must be able to find a w that satisfies these equations if the perceptron hypothesis set can shatter the data set.
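Putting the pieces together, here is a compact write-up of the contradiction as I understand it from the slides (my paraphrase):

```latex
% Linear dependence of d+2 points: x_j = \sum_{i \ne j} a_i x_i, not all a_i = 0.
% Choose the dichotomy y_i = \operatorname{sign}(a_i) for each i \ne j with
% a_i \ne 0, and y_j = -1.  If some w implemented it, then
% \operatorname{sign}(w^{\mathsf T} x_i) = \operatorname{sign}(a_i), so each
% term a_i\, w^{\mathsf T} x_i > 0, and therefore
w^{\mathsf T} x_j
  = \sum_{i \ne j} a_i\, w^{\mathsf T} x_i > 0
\quad\Longrightarrow\quad
\operatorname{sign}(w^{\mathsf T} x_j) = +1 \ne y_j = -1.
% No w implements this dichotomy, so no set of d+2 points is shattered,
% hence d_{\mathrm{vc}} \le d + 1; combined with the first part, d_{\mathrm{vc}} = d + 1.
```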
The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.