#3   07-23-2015, 02:20 AM
yaser
Caltech
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,477
Re: Help in understanding proof for VC-dimension of perceptron.

Quote:
Originally Posted by yongxien View Post
1. Is the matrix invertible because of the way we construct it, namely that it is lower triangular? If that is the case, I don't see why it does not work for d+2 or any other k greater than the dimension of the perceptron.
The number of columns is restricted to d+1 (the length of the input vector), so the matrix would not be square if we had d+2 rows.
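To illustrate, here is a minimal NumPy sketch of the standard construction from the book (a (d+1)x(d+1) lower-triangular matrix of ones, with the zeroth coordinate fixed at 1). Because the matrix is invertible, any dichotomy y can be realized by solving Xw = y, which is why d+1 points can be shattered:

```python
import numpy as np

d = 4
# d+1 points in R^{d+1}: row i is (1, 1, ..., 1, 0, ..., 0) with i+1 ones.
# The matrix is lower triangular with 1s on the diagonal, so det(X) = 1.
X = np.tril(np.ones((d + 1, d + 1)))
y = np.array([1, -1, -1, 1, -1], dtype=float)  # an arbitrary dichotomy
w = np.linalg.solve(X, y)        # Xw = y exactly, since X is invertible
assert np.array_equal(np.sign(X @ w), y)  # the perceptron realizes y
```

With d+2 rows the matrix would be (d+2)x(d+1): not square, so `np.linalg.solve` (and the argument) no longer applies.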

Quote:
2. Why does the second part of the proof (d_vc <= d+1) not work for the case k = d+1, but only for d+2?
Because if you have only d+1 vectors, they can be linearly independent. The inevitable linear dependence of any d+2 vectors is what makes that part of the proof work.
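A quick numerical check of that linear-algebra fact (a sketch with randomly generated points): d+2 vectors in R^{d+1} have rank at most d+1, so some nontrivial combination of them sums to zero. A vector of such coefficients can be read off the null space of the transposed matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
X = rng.standard_normal((d + 2, d + 1))   # d+2 points in R^{d+1}
# The rank can never exceed the number of columns, d+1,
# so d+2 rows cannot all be linearly independent.
assert np.linalg.matrix_rank(X) <= d + 1
# Coefficients a with sum_i a_i x_i = 0: the extra right singular
# vectors of X^T span its null space.
u, s, vt = np.linalg.svd(X.T)
a = vt[-1]                                # a nontrivial null-space vector
assert np.allclose(a @ X, 0, atol=1e-10)  # linear dependence, a != 0
```

In contrast, d+1 random points in R^{d+1} generically have full rank d+1, so no such nonzero `a` exists and the argument cannot be run for k = d+1.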

Quote:
3. I don't understand the statement of why having more points than dimensions means we must have x_j = \sum_i a_i x_i. (By the way, d+1 is also more points than the dimension.)
This is a consequence of the basic linear algebra result mentioned in the response to point 2. As for the other remark, the length of the vector is d+1 (because of the zeroth coordinate), so d+1 points would not be more points than dimensions.
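A sketch of how that dependence is used in the proof (with randomly generated points; `x_j` here stands for the point singled out as a combination of the others): once x_j = \sum_i a_i x_i, the identity w . x_j = \sum_i a_i (w . x_i) holds for every weight vector w, so the signs on the other points force the sign on x_j and one dichotomy can never be realized:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3
X = rng.standard_normal((d + 2, d + 1))   # d+2 points in R^{d+1}
x_j, others = X[-1], X[:-1]               # single out one point
# Generically the other d+1 points span R^{d+1}, so x_j is an
# exact linear combination of them with coefficients a.
a = np.linalg.solve(others.T, x_j)
assert np.allclose(others.T @ a, x_j)     # x_j = sum_i a_i x_i
# For ANY w, w.x_j = sum_i a_i (w.x_i). Hence if sign(w.x_i) = sign(a_i)
# on the other points, w.x_j > 0 is forced, and the dichotomy
# y_i = sign(a_i), y_j = -1 is impossible.
w = rng.standard_normal(d + 1)
assert np.isclose(w @ x_j, a @ (others @ w))
```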
__________________
Where everyone thinks alike, no one thinks very much