 LFD Book Forum Question about the constraints in Quadratic Programming

#1
 tadworthington Member Join Date: Jun 2012 Location: Chicago, IL Posts: 32 Question about the constraints in Quadratic Programming

So I'm trying to do the SVM problems, and I am running into a problem with the package I'm using (CVXOPT in Python) complaining that the rank of the 'A' matrix is less than p, where A is an nxp matrix.

The thing is, the rank of the matrix IS less than p: the 'A' matrix is just the label vector y, which can only have rank 1. So rank(A) < p whenever p > 1, which is exactly what the solver complains about. I don't understand QP well enough to see why this matrix has to have 'full' rank, but it seems to. Anybody else running into this problem?
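For reference, CVXOPT's qp solver rejects an equality-constraint matrix whose row rank is less than its number of rows p. Here is a small numpy-only sketch (the labels are made up) of why the SVM dual's A should be a single 1 x n row, and why anything taller built from the label vector is rank-deficient:

```python
import numpy as np

# Hypothetical +/-1 labels for four training points.
y = np.array([1.0, -1.0, 1.0, -1.0])

# The SVM dual has one equality constraint, y . alpha = 0, so A should be
# a 1 x n row vector: p = 1 and rank(A) = 1, i.e. full row rank.
A_ok = y.reshape(1, -1)
print(np.linalg.matrix_rank(A_ok), A_ok.shape[0])    # rank 1, p = 1: accepted

# Any A with more than one row built from y alone (e.g. stacking y twice)
# still has rank 1, so rank(A) < p and the solver rejects it.
A_bad = np.vstack([y, y])
print(np.linalg.matrix_rank(A_bad), A_bad.shape[0])  # rank 1, p = 2: rejected
```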
#2
 patrickjtierney Member Join Date: Jul 2012 Location: Toronto, Canada Posts: 33 Re: Question about the constraints in Quadratic Programming

That's odd. Wouldn't p=1, which is exactly the column rank of Y? Have you tried setting A=Y instead of the transpose? Just guessing, though; I'm not familiar with that library.
#3
 jiunjiunma@gmail.com Junior Member Join Date: Jul 2012 Posts: 8 Re: Question about the constraints in Quadratic Programming

I am also using CVXOPT and Python and didn't encounter this problem. However, the result I got from the solver was totally wrong. It even gave me some negative alphas. I suspect I might have set some parameter wrong but couldn't find it after hours of debugging. Frustrated, I am posting my small routine that creates the qp parameters here to see if an extra pair of eyes helps:

Code:
import numpy as np
from cvxopt import matrix

def getDualQuardraticParameters(trainingData, trainingResults):
    n = len(trainingResults)
    quadraticCoefficients = []
    for i in range(n):
        for j in range(n):
            kernel = np.dot(trainingData[j], trainingData[i])
            yjyi = trainingResults[j] * trainingResults[i]
            quadraticCoefficients.append(yjyi * kernel)
    # cvxopt's matrix() fills column-major, but P is symmetric here,
    # so the fill order does not matter.
    P = matrix(quadraticCoefficients, (n, n), tc='d')  # quadratic term: y_j y_i (x_j . x_i)
    q = -1.0 * matrix(np.ones((n, 1)), tc='d')         # linear term: minimize -sum(alpha)
    G = -1.0 * matrix(np.identity(n), tc='d')          # inequality: -alpha_i <= 0
    h = matrix(np.zeros((n, 1)), tc='d')
    A = matrix(trainingResults, (1, n), tc='d')        # equality: y . alpha = 0
    b = matrix([0.0])
    return P, q, G, h, A, b
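On the negative alphas specifically: interior-point QP solvers typically return tiny negative values (on the order of -1e-10) rather than exact zeros for the non-support vectors, so it is common to clamp and threshold the returned alphas before using them. Large negative alphas would still indicate a setup bug, but tiny ones are just numerics. A small sketch (the function name and threshold value are my own, not part of the routine above):

```python
import numpy as np

def supportVectorIndices(alphas, threshold=1e-6):
    """Return indices of alphas that are significantly positive.

    Interior-point solvers report near-zero (sometimes slightly negative)
    alphas for non-support vectors; clamp those to zero before thresholding.
    """
    a = np.maximum(np.asarray(alphas, dtype=float).ravel(), 0.0)
    return np.flatnonzero(a > threshold)

# Example: two genuine support vectors plus numerical noise from the solver.
alphas = [2.3, -1e-12, 0.7, 3e-13]
print(supportVectorIndices(alphas))  # [0 2]
```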
#4
 tadworthington Member Join Date: Jun 2012 Location: Chicago, IL Posts: 32 Re: Question about the constraints in Quadratic Programming

Quote:
 Originally Posted by patrickjtierney That's odd. Wouldn't p=1 which is exactly the column rank of Y? Have you tried setting A=Y instead of the transpose. Just guessing though. I'm not familiar with that library.
I figured it out. It had nothing to do with 'A', and everything to do with a Python error I made.
#5
 tadworthington Member Join Date: Jun 2012 Location: Chicago, IL Posts: 32 Re: Question about the constraints in Quadratic Programming

Quote:
 Originally Posted by jiunjiunma@gmail.com
 I am also using CVXOPT and python and didn't encounter this problem. However, the result I got from the solver was totally wrong. It even gave me some negative alphas. I suspect I might have set some parameter wrong but couldn't find it after hours of debugging. Frustrated, I am posting my small routine to create the qp parameters here to see if extra pairs of eyes help:

 Code:
 def getDualQuardraticParameters(trainingData, trainingResults):
     n = len(trainingResults)
     quadraticCoefficients = []
     for i in range(n):
         for j in range(n):
             kernel = np.dot(trainingData[j], trainingData[i])
             yjyi = trainingResults[j]*trainingResults[i]
             quadraticCoefficients.append(yjyi * kernel)
     P = matrix(quadraticCoefficients, (n, n), tc='d')
     q = -1.0 * matrix(np.ones((n, 1)), tc='d')
     G = -1.0 * matrix(np.identity(n), tc='d')
     h = matrix(np.zeros((n, 1)), tc='d')
     A = matrix(trainingResults, (1,n), tc='d')
     b = matrix([0.0])
     return P, q, G, h, A, b
You want to be careful what you give away in these forums...there is no *ANSWER* in the title of this post, so people who don't want too much information would not be happy seeing code posted here.
#6
 anachesa RPI Join Date: Jul 2012 Posts: 4 Re: Question about the constraints in Quadratic Programming

For a better understanding of the math behind this machinery, the free book and lectures that the CVXOPT examples refer to might be useful (provided you have enough time to study it):
http://www.stanford.edu/~boyd/cvxbook/

 Tags: hw7-8

The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.