LFD Book Forum

LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 7 (http://book.caltech.edu/bookforum/forumdisplay.php?f=136)
-   -   Matlab solutions? (http://book.caltech.edu/bookforum/showthread.php?t=506)

kurts 05-17-2012 03:03 PM

Matlab solutions?
 
So far I have used C (Objective-C, to be exact) to complete all the programming exercises. I'm afraid that for Homework 7 I'll have to use Matlab, and I really don't know how to program in it or use it properly. I'd still rather go that way than find and learn a quadratic programming package for C, at least on the short notice of a homework.

HW7 looks like it has the same kind of setup requirements (generate N random points in [-1,1]x[-1,1], pick a random line to divide them, etc.) as many previous exercises.

I was wondering if someone who has used Matlab to complete the earlier homework problems that are similar to the ones in HW7 could post their solutions? I'm pretty sure I could figure out how to adapt it to HW7 if I had some examples to start with.

Thanks!
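For reference, here is roughly what the quadratic-programming route looks like in Octave, sketched with the built-in qp() solver (Matlab's quadprog() takes the same matrices in a different argument order). The toy data, the small ridge term, and the support-vector threshold below are placeholder choices, not anything prescribed by the homework or posted elsewhere in this thread.

Code:

% toy data: N points in [-1,1]^2, labelled by the line x2 = x1, just so this runs
N = 10;
X = 2*rand(N,2) - 1;
y = sign(X(:,2) - X(:,1));  y(y==0) = 1;

% hard-margin SVM dual:  min 0.5*a'*Q*a - sum(a)   s.t.  y'*a = 0,  a >= 0
Q     = (y*y') .* (X*X');                  % Q_ij = y_i*y_j*(x_i . x_j)
q     = -ones(N,1);
alpha = qp(zeros(N,1), Q + 1e-12*eye(N), q, y', 0, zeros(N,1), []);  % tiny ridge keeps Q positive definite

w  = X' * (alpha .* y);                    % weight vector from the dual solution
sv = find(alpha > 1e-6);                   % support vectors have alpha_i > 0
b  = mean(y(sv) - X(sv,:)*w);              % bias from y_s*(w'*x_s + b) = 1
% final hypothesis: sign([ones(N,1), X] * [b; w])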

dudefromdayton 05-17-2012 04:18 PM

Re: Matlab solutions?
 
I'm on the fence with this assignment. I purchased a Big Package last week, and I used it for the homework. It was my first departure from C for this course.

On this assignment, C would be more convenient for me, in the sense that I have other places I might plug my SVM into. I also think performance would be a lot better. I'm thinking that the GNU Scientific Library should have something we could use, which would keep us in the familiar realm of C.

I'll update this post as I find more out, but for the time being I have to feed several people.

UPDATE: GNU GSL doesn't have anything. Elsewhere, I'm not coming up with a practical package, and I *still* have to feed several people. Ciao!

UPDATE 2: (food's on the stove) A little insight into this problem lends itself to non-QP means of finding the support vectors. It won't generalize to real life, but you can at least answer the questions. It's not the route I plan to follow, but it's easily reachable.
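One such non-QP route, sketched here as a guess rather than as the plan meant above: in two dimensions the maximum-margin separator is pinned down either by two points, one from each class, or by three points, two from one class and one from the other. With a small training set you can therefore enumerate those candidate separators and keep the separating one with the largest margin.

Code:

function [w, c] = bruteSVM(X, y)
% Brute-force hard-margin SVM in 2D (only sensible for small data sets).
% X is n*2, y is n*1 with entries +/-1, and the classes are linearly separable.
% Candidate separators: the perpendicular bisector of an opposite-class pair,
% or a line parallel to two same-class points placed halfway to an opposite point.
% The separating candidate with the largest margin is the SVM solution.
  n = size(X,1);  w = [0,0];  c = 0;  best = -Inf;
  for i = 1:n
    for j = 1:n
      if (y(i) > 0) && (y(j) < 0)
        wc = X(i,:) - X(j,:);                    % normal of the perpendicular bisector
        cc = wc * (X(i,:) + X(j,:))' / 2;
        [w, c, best] = keepIfBetter(X, y, wc, cc, w, c, best);
      end
      for k = 1:n
        if (i < j) && (y(i) == y(j)) && (y(k) ~= y(i))
          d  = X(j,:) - X(i,:);
          wc = [-d(2), d(1)];                    % normal to the same-class pair
          cc = (wc*X(i,:)' + wc*X(k,:)') / 2;    % halfway to the opposite point
          [w, c, best] = keepIfBetter(X, y, wc, cc, w, c, best);
        end
      end
    end
  end
end

function [w, c, best] = keepIfBetter(X, y, wc, cc, w, c, best)
% keep the candidate (wc, cc) if it separates the data with a larger margin
  s = X*wc' - cc;
  if all(y .* s < 0)            % right separator, wrong orientation: flip it
    wc = -wc;  cc = -cc;  s = -s;
  end
  if all(y .* s > 0)
    m = min(abs(s)) / norm(wc);
    if m > best
      w = wc;  c = cc;  best = m;
    end
  end
end

The resulting hypothesis is sign(X*w' - c), and the points attaining the minimal distance to it are the support vectors.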

elkka 05-17-2012 06:02 PM

Re: Matlab solutions?
 
kurts, I am no expert in programming, and I only started learning Octave (I understand it is largely compatible with Matlab) less than two months ago. My programs are not optimal and not elegant. But here is my code, if you want it.

The functions:
Code:


function f = GenerateF(a,b)
% a, b - two points in [-1,1]x[-1,1]
% f - the line through these points, as weights f = [f_0, f_1, f_2],
%     so that a point x is classified by sign(f_0 + f_1*x_1 + f_2*x_2)

  f(1) = a(2)*b(1)-a(1)*b(2);
  f(2) = b(2)-a(2);
  f(3) = -b(1)+a(1);
end;

function X = GenerateX(f,n)
% f - the target line in [-1,1]^2, in the form [f_0, f_1, f_2]
% X - an n*3 array of point data: uniform independent points in [-1,1]x[-1,1];
%     the first two columns are the coordinates, the third is the classification
%     sign(f_0 + f_1*x_1 + f_2*x_2)

  X = unifrnd(-1,1,n,2);
  X(:,3) = sign([ones(n,1),X]*f');
end;

function p = Difference(f,g,n)
% f, g - two lines in [-1,1]^2, in the form [a_0, a_1, a_2]
% p - approximate probability that f and g classify a random point differently
% n - number of test points used to approximate the probability

  X = unifrnd(-1,1,n,2);
  X = [ones(n,1),X];
  p = 0;
  for i = 1:n
        p = p+0.5*abs(sign(X(i,:)*f')-sign(X(i,:)*g'));
  end;
  p = p/n;
end;
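% (Aside, not part of elkka's post: the loop above can be replaced by a single
%  vectorized line, which runs much faster in Octave/Matlab --
%      p = mean(sign(X*f') ~= sign(X*g'));
%  using the X with the column of ones already prepended a few lines earlier.)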

function [p, itNum] = perceptron(DataSet,p0)
% DataSet is an n*3 array of data, where each row corresponds to a data point x:
% the first value is the first coordinate, the second the second coordinate,
% and the third the classification (+1 or -1)
% p0 - initial weight vector [p_0, p_1, p_2]

n = length(DataSet(:,1));
X = [ones(n,1),DataSet];    % columns 1..m-1: [1, x_1, x_2]; column m: classification
m = length(X(1,:));
p = p0;
k = 1;

MaxDiff = 2;
while (MaxDiff>0)&(k<30000)
        % abs(sign(x*p')-y) is 2 for a misclassified point and 0 otherwise;
        % sorting pushes the misclassified points to the end of the list
        [Diff,ind] = sort(abs(sign(X(:,1:m-1)*p')-X(:,m)));
        [MaxDiff,iMax] = max(Diff);
        if (MaxDiff>0)
                i = ceil(unifrnd(iMax-1,n,1,1));      % pick one misclassified point at random
                p = p+X(ind(i),m)*X(ind(i),1:m-1);    % standard PLA update
                k = k+1;
                %if (mod(k,10000)==0);k end;
        end;
end;

itNum=k-1;
end;

The body (driver script) for Homework 1:
Code:

a = unifrnd(-1,1,2,2);
%a = [0,0;1,1];
f = GenerateF(a(1,:),a(2,:))

n = 10000; % number of trials for the algorithm
m = 100; % number of points to run the algorithm on
k = 1000; % number of points to evaluate probability
Test = zeros(n,4);

%hw1 q6-10

for i = 1:n
        X = GenerateX(f,m);
        [g, Test(i,1)] = perceptron(X,[0,0,0]);
        Test(i,2) = Difference(f,g,k);
end;
iterations = mean(Test(:,1))
diff = mean(Test(:,2))


dudefromdayton 05-17-2012 07:59 PM

Re: Matlab solutions?
 
There's definitely something to be said for getting out of C. The first half of homework 7 took all of 15 lines of code using a Big Package.

kurts 05-19-2012 05:57 PM

Re: Matlab solutions?
 
Quote:

Originally Posted by elkka (Post 2189)
But here is my code, if you want it.

Thanks so much, elkka!

I eventually got it to run in Octave, played around with matrix and vector expressions to learn them, and got the linear regression method working on the same setup. I think I'm ready to tackle HW7!
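
For anyone following along, a minimal sketch of that one-step linear-regression hypothesis on the n*3 data format GenerateX produces (a reconstruction, not kurts' actual code):

Code:

function g = LinReg(DataSet)
% DataSet is n*3: columns 1-2 are the coordinates, column 3 the classification.
% Returns the one-step least-squares weights g = [g_0, g_1, g_2], same form as f.
  n = size(DataSet,1);
  X = [ones(n,1), DataSet(:,1:2)];   % prepend the constant coordinate x0 = 1
  y = DataSet(:,3);
  g = (pinv(X) * y)';                % w = (X'X)^(-1) X' y, via the pseudo-inverse
end;

Its disagreement with the target f can then be estimated with Difference(f, g, k), just like the perceptron's.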

jsarrett 05-19-2012 09:56 PM

Re: Matlab solutions?
 
A thought just occurred to me: if you want to stay in C/C++ land, OpenCV has an SVM implementation you might be able to use.

