Quote:
Originally Posted by yongxien
1. Is the matrix invertible because of the way we construct it, such that it is lower triangular? If that is the case, I don't see why it does not work for d+2 or any other k > dimension of the perceptron.
The number of columns is restricted to d+1 (the length of the input vector), so the matrix would not be square if we had d+2 rows.
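If it helps to see it concretely, here is a small numpy sketch (my own illustration, not from the lecture) that builds the standard d+1 points for d = 4: the origin and the d unit vectors, each with the constant zeroth coordinate x_0 = 1 prepended. Stacked as rows they form a (d+1) x (d+1) lower-triangular matrix with ones on the diagonal, hence invertible:

```python
import numpy as np

d = 4  # illustrative dimension; the construction is the same for any d

# d+1 points in R^d: the origin and the d unit vectors, each prepended
# with the constant zeroth coordinate x_0 = 1.  As rows they give a
# (d+1) x (d+1) lower-triangular matrix with ones on the diagonal.
X = np.hstack([np.ones((d + 1, 1)),
               np.vstack([np.zeros((1, d)), np.eye(d)])])

print(X)
print("det(X) =", np.linalg.det(X))  # 1.0, so X is invertible
```

Adding a (d+2)-nd row would make the matrix (d+2) x (d+1), so it could not be inverted the same way.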
Quote:
2. Why does the second part of the proof, d+1 >= d_VC, not work for the case when k = d+1 but only for k = d+2?
Because if you have only d+1 vectors, they can be linearly independent. The inevitable linear dependence of any d+2 vectors in \mathbb{R}^{d+1} is what makes that part of the proof work.
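A quick numerical way to see this (again just my own sketch, not part of the proof): any d+2 rows with d+1 coordinates each have rank at most d+1, so they are always linearly dependent, whereas d+1 randomly chosen rows are typically independent:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # illustrative dimension

# d+2 arbitrary points in R^{d+1} (zeroth coordinate fixed to 1).
points = np.hstack([np.ones((d + 2, 1)),
                    rng.standard_normal((d + 2, d))])

print(points.shape)                        # (6, 5): d+2 rows, d+1 columns
print(np.linalg.matrix_rank(points))       # at most d+1, so the rows are dependent
print(np.linalg.matrix_rank(points[:-1]))  # d+1: the first d+1 rows are independent
```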
Quote:
3. I don't understand the statement that more points than dimensions means we must have x_j = \sum_i a_i x_i. (By the way, d+1 is also more points than the dimension.)
This is a consequence of the basic linear algebra result mentioned in the response to point 2. As for the other remark, the length of each input vector is d+1 (because of the zeroth coordinate x_0 = 1), so d+1 points would not be more points than dimensions.
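Continuing the sketch above (hypothetical random points, not the specific ones in the proof), you can actually solve for the coefficients a_i that express one of the d+2 points as a combination of the other d+1:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # illustrative dimension

# d+2 points in R^{d+1}, zeroth coordinate fixed to 1 as usual.
points = np.hstack([np.ones((d + 2, 1)),
                    rng.standard_normal((d + 2, d))])

# Take the last point as x_j and solve sum_i a_i x_i = x_j for the a_i.
# (For random points the other d+1 rows are linearly independent,
# so this square system has a unique solution.)
others, x_j = points[:-1], points[-1]
a = np.linalg.solve(others.T, x_j)

print(a)
print(np.allclose(others.T @ a, x_j))  # True: x_j = sum_i a_i x_i
```

This dependence is exactly what the proof exploits to exhibit a dichotomy on d+2 points that the perceptron cannot implement.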