LFD Book Forum  

#1
05-17-2012, 04:03 PM
kurts
Invited Guest
Join Date: Apr 2012
Location: Portland, OR
Posts: 70

Matlab solutions?

So far I have used C (Objective-C, to be exact) for all the programming exercises. I'm afraid that for Homework 7 I'll have to use Matlab, and I really don't know how to program in it or use it properly. Still, I think I'd rather go that way than find and learn a quadratic programming package for C, at least on the short notice of a homework deadline.

HW7 looks like it has the same kind of setup requirements as many previous exercises (generate N random points with coordinates in [-1,1], pick a random line to divide them, etc.).

I was wondering whether someone who has used Matlab for the earlier homework problems similar to the ones in HW7 could post their solutions. I'm pretty sure I could figure out how to adapt them to HW7 if I had some examples to start with.

Thanks!
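
(For concreteness, the QP setup in question might look roughly like the sketch below in Matlab/Octave. It assumes a quadprog-compatible solver is available — Matlab's Optimization Toolbox, or Octave's optim package — and uses X and y as placeholder names for the N training points and their ±1 labels. Treat it as a starting point under those assumptions, not a verified solution.)

Code:
% Hard-margin SVM via the dual QP: minimize 1/2*alpha'*Q*alpha - sum(alpha)
% subject to alpha >= 0 and y'*alpha = 0.
% X (N x 2) and y (N x 1, entries +/-1) are placeholder names for the data.

N  = size(X,1);
Q  = (y*y') .* (X*X');          % Q_ij = y_i*y_j*(x_i . x_j)
H  = Q + 1e-10*eye(N);          % tiny ridge to keep the QP solver happy
f  = -ones(N,1);
Aeq = y';  beq = 0;             % equality constraint y'*alpha = 0
lb  = zeros(N,1);               % alpha >= 0 (no upper bound: hard margin)

alpha = quadprog(H, f, [], [], Aeq, beq, lb, []);

sv = find(alpha > 1e-6);        % support vectors have alpha > 0
w  = X' * (alpha .* y);         % w = sum_i alpha_i*y_i*x_i
b  = y(sv(1)) - X(sv(1),:)*w;   % from y_s*(w'*x_s + b) = 1 at a support vector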
#2
05-17-2012, 05:18 PM
dudefromdayton
Invited Guest
Join Date: Apr 2012
Posts: 140

Re: Matlab solutions?

I'm on the fence with this assignment. I purchased a Big Package last week, and I used it for the homework. It was my first departure from C for this course.

On this assignment, C would be more convenient for me, in the sense that I have other places I might plug my SVM into. I also think performance would be a lot better. I'm thinking that the GNU Scientific Library should have something we could use, which would keep us in the familiar realm of C.

I'll update this post as I find out more, but for the time being I have to feed several people.

UPDATE: GNU GSL doesn't have anything. Elsewhere, I'm not coming up with a practical package, and I *still* have to feed several people. Ciao!

UPDATE 2: (food's on the stove) A little insight into this problem opens up non-QP means of finding the support vectors. It won't generalize to real life, but you can at least answer the questions. It's not the route I plan to follow, but it's easily within reach.
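
For example, one non-QP route (just a sketch, not necessarily the insight meant above): in two dimensions the maximum-margin separator is pinned down by only 2 or 3 support vectors, so with a handful of points you can enumerate the candidates — perpendicular bisectors of opposite-class pairs, and lines parallel to two same-class points placed halfway to an opposite-class point — keep the ones that separate the data, and take the one with the largest minimum margin. In Octave/Matlab, with X and y as placeholder names for the points and their ±1 labels:

Code:
% Brute-force maximum-margin separator in 2-D (small N only).
% X is N x 2, y is N x 1 with entries +/-1 -- placeholder names.
function [w_best, b_best] = bruteforce_svm(X, y)
   N = size(X,1);
   best = -Inf;  w_best = [];  b_best = [];
   for i = 1:N
      for j = 1:N
         if i == j, continue; end
         if y(i) ~= y(j)
            % candidate: perpendicular bisector of an opposite-class pair
            w = (X(i,:) - X(j,:))';
            b = -w' * (X(i,:) + X(j,:))' / 2;
            [best, w_best, b_best] = keep_best(X, y, w, b, best, w_best, b_best);
         else
            % candidate: line parallel to two same-class points,
            % halfway between that line and an opposite-class point
            d = X(j,:) - X(i,:);
            w = [-d(2); d(1)];
            for r = find(y ~= y(i))'
               b = -w' * (X(i,:) + X(r,:))' / 2;
               [best, w_best, b_best] = keep_best(X, y, w, b, best, w_best, b_best);
            end
         end
      end
   end
end

function [best, w_best, b_best] = keep_best(X, y, w, b, best, w_best, b_best)
   margins = y .* (X*w + b) / norm(w);   % positive iff correctly classified
   if min(margins) > best                % best separating candidate so far
      best = min(margins);  w_best = w;  b_best = b;
   end
end
The support vectors are then the points whose margin equals the minimum. This should match what a QP solver returns on separable 2-D data, but it scales horribly with N, which is exactly why it won't generalize to real life.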
#3
05-17-2012, 07:02 PM
elkka
Invited Guest
Join Date: Apr 2012
Posts: 57

Re: Matlab solutions?

kurts, I am no expert in programming, and I only started to learn Octave (I understand it is fully compatible with Matlab) less than two months ago. My programs are not optimal and not elegant. But here is my code, if you want it.

The functions:
Code:
function f = GenerateF(a,b)
% a, b - two points in [-1,1]x[-1,1]
% f - the linear function defined by these points, f = [a_0,a_1,a_2],
%     i.e. the line a_0 + a_1*x1 + a_2*x2 = 0 through a and b

   f(1) = a(2)*b(1)-a(1)*b(2);
   f(2) = b(2)-a(2);
   f(3) = -b(1)+a(1);
end;

function X = GenerateX(f,n)
% f is the target linear function on [-1,1]^2, in the form [a_0,a_1,a_2]
% X is an (n x 3) array of point data: n uniform independent points in [-1,1]x[-1,1]
% the first two columns are the coordinates, the third is the classification
% by the line defined by f

   X = unifrnd(-1,1,n,2);
   X(:,3) = sign([ones(n,1),X]*f');
end;

function p = Difference(f,g,n)
% f, g - two linear functions on [-1,1]^2, in the form [a_0,a_1,a_2]
% p = approximate P(f and g classify a random point differently)
% n - number of test points used to approximate the probability

   X = unifrnd(-1,1,n,2);
   X = [ones(n,1),X];
   p = 0;
   for i = 1:n
      p = p + 0.5*abs(sign(X(i,:)*f') - sign(X(i,:)*g'));
   end;
   p = p/n;
end;

function [p, itNum] = perceptron(DataSet,p0)
% DataSet is an n x 3 array of data, where each row corresponds
% to a data point: the first two values are the coordinates
% and the third is the classification (+1/-1)
% p0 = initial weight vector [w0, w1, w2]

n = size(DataSet,1);
X = [ones(n,1),DataSet];   % n x 4: [1, x1, x2, y]
m = size(X,2);             % index of the classification column
p = p0;
k = 1;

MaxDiff = 2;
while (MaxDiff>0) && (k<30000)
   % classification error per point: 0 = correct, 2 = misclassified
   [Diff,ind] = sort(abs(sign(X(:,1:m-1)*p')-X(:,m)));
   [MaxDiff,iMax] = max(Diff);
   if (MaxDiff>0)
      % pick a random misclassified point (positions iMax..n after sorting)
      i = ceil(unifrnd(iMax-1,n,1,1));
      % PLA update: w <- w + y*x
      p = p + X(ind(i),m)*X(ind(i),1:m-1);
      k = k+1;
      %if (mod(k,10000)==0); k, end;
   end;
end;

itNum = k-1;
end;
The main body for Homework 1:
Code:
a = unifrnd(-1,1,2,2);   % two random points defining the target
%a = [0,0;1,1];
f = GenerateF(a(1,:),a(2,:))   % note: the target f is fixed for all n runs below

n = 10000; % number of runs of the algorithm
m = 100;   % number of points to run the algorithm on
k = 1000;  % number of points used to estimate the probability
Test = zeros(n,4);

%hw1 q6-10

for i = 1:n
   X = GenerateX(f,m);
   [g, Test(i,1)] = perceptron(X,[0,0,0]);
   Test(i,2) = Difference(f,g,k);
end;
iterations = mean(Test(:,1))
diff = mean(Test(:,2))
#4
05-17-2012, 08:59 PM
dudefromdayton
Invited Guest
Join Date: Apr 2012
Posts: 140

Re: Matlab solutions?

There's definitely something to be said for getting out of C. The first half of homework 7 took all of 15 lines of code using a Big Package.
#5
05-19-2012, 06:57 PM
kurts
Invited Guest
Join Date: Apr 2012
Location: Portland, OR
Posts: 70

Re: Matlab solutions?

Quote:
Originally Posted by elkka
But here is my code, if you want it.
Thanks so much, elkka!

I eventually got it to run in Octave, and I even played around with matrix and vector expressions and got the linear regression method working on the same setup. I think I'm ready to tackle HW7!
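
For anyone else in the same spot, the linear regression step on this setup boils down to a single pseudo-inverse. A minimal sketch using elkka's GenerateF/GenerateX from above (the other variable names here are just for illustration):

Code:
% Linear regression for classification on the same setup as above.
a = unifrnd(-1,1,2,2);
f = GenerateF(a(1,:), a(2,:));    % random target line
D = GenerateX(f, 100);            % 100 points: [x1, x2, label]
A = [ones(100,1), D(:,1:2)];      % design matrix with a constant column
y = D(:,3);
w = pinv(A) * y;                  % one-shot linear regression weights
Ein = mean(sign(A*w) ~= y)        % in-sample classification error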
#6
05-19-2012, 10:56 PM
jsarrett
Member
Join Date: Apr 2012
Location: Sunland, CA
Posts: 13

Re: Matlab solutions?

A thought just occurred to me: if you want to stay in C/C++ land, OpenCV has an SVM implementation you might be able to use.