07-23-2015, 12:20 AM
yongxien (Junior Member, Join Date: Jun 2015, Posts: 8)

Help in understanding the proof of the VC dimension of the perceptron

https://work.caltech.edu/library/072.pdf

I am referring to the slides given in the link above.

I have a few questions regarding this proof:
1. Is the matrix invertible because of the way we construct it, i.e. because it is lower triangular? If so, I don't see why the same construction would not work for d+2 points, or for any k greater than the dimension of the perceptron. (A sketch of my understanding of the construction is at the end of this post.)
2. Why does the second part of the proof, which shows d + 1 >= d_vc, only work for the case k = d + 2 and not for k = d + 1?
3. I don't understand why having more points than dimensions means we must have x_j = \sum_{i \neq j} a_i x_i (I write the statement out below as I understand it). By the way, d+1 is also more points than the dimension.
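
For concreteness, here is a small numpy sketch of the construction from the slides as I understand it (the variable names and the choice d = 4 are mine, so please correct me if I have the construction wrong): the d+1 points form a lower-triangular matrix X with ones on the diagonal, so X is invertible and w = X^{-1} y realizes every dichotomy y.

[CODE]
import numpy as np
from itertools import product

d = 4  # dimension of the perceptron (arbitrary choice for illustration)

# d+1 points in R^{d+1}, each with the bias coordinate x_0 = 1:
# point 0 is (1, 0, ..., 0) and point i (for i >= 1) has an extra 1 in
# coordinate i. The resulting matrix X is lower triangular with ones on
# the diagonal, hence invertible.
X = np.eye(d + 1)
X[:, 0] = 1.0

# For every dichotomy y in {-1, +1}^{d+1}, w = X^{-1} y gives
# sign(X w) = sign(y) = y, so the d+1 points are shattered.
for y in product([-1.0, 1.0], repeat=d + 1):
    y = np.array(y)
    w = np.linalg.solve(X, y)
    assert np.array_equal(np.sign(X @ w), y)

print("all", 2 ** (d + 1), "dichotomies realized")
[/CODE]

Question 1 is essentially asking what breaks in this sketch if I try the same idea with d+2 points instead of d+1.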
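And the step that question 3 refers to, written out as I understand it from the slides (the indexing is mine): for any d+2 points x_1, \dots, x_{d+2} in R^{d+1} there are more points than dimensions, so the points are claimed to be linearly dependent, i.e. for some j

x_j = \sum_{i \neq j} a_i x_i, \quad \text{with not all } a_i = 0.

It is this "more points than dimensions implies linear dependence" step that I would like explained.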