I agree that this lecture (and, to a lesser degree, "Regularization") is more difficult than the previous ones. There is quite a bit of math involved and, since not everything is proven, I had to take a few (reasonable) leaps of faith.

The way I understood it, SVM is another machine learning method, but a "state-of-the-art" one: it has a natural intuition ("biggest margin") and, more importantly, it works remarkably well in practice.

Regarding "Quadratic Programming", it is similar to "Linear Programming", which is nowadays fairly standard algorithms material (it can be found in CLRS's "Introduction to Algorithms"). So, taking another leap of faith, I am not surprised that there are algorithms that can solve these optimization problems efficiently, and I feel OK using them as "black boxes".
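To make the "black box" idea concrete, here is a minimal sketch (my own toy example, not from the lecture) of the hard-margin SVM primal solved with a generic constrained optimizer. The data and variable layout are assumptions for illustration; a dedicated QP package would be used the same way — hand over the objective and constraints, get back the maximum-margin hyperplane:

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data (hypothetical example).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Primal hard-margin SVM: minimize ||w||^2 / 2
# subject to y_i (w . x_i + b) >= 1 for every training point.
# Decision variables packed as v = [w1, w2, b].
def objective(v):
    w = v[:2]
    return 0.5 * np.dot(w, w)

constraints = [
    {"type": "ineq", "fun": lambda v, i=i: y[i] * (np.dot(v[:2], X[i]) + v[2]) - 1.0}
    for i in range(len(X))
]

# Treat the solver as a black box, just as one would a QP package.
res = minimize(objective, x0=np.zeros(3), constraints=constraints, method="SLSQP")
w, b = res.x[:2], res.x[2]

# Every point should end up on or outside the margin.
margins = y * (X @ w + b)
print(margins.min())
```

The point is only the interface: we state the quadratic objective and the linear margin constraints and never look inside the solver.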

Finally, regarding "Validation", the two lectures just happen to fall in the same week but are unrelated; "Validation" really belongs with last week's material.

Anyway, you are not the only one who finds the lecture challenging and who didn't know beforehand what SVMs are.