This is my code in Octave (it is not correct but maybe you could help me find what is wrong):
Code:
function [N, theta, Eout] = trainLogisticRegression(eta, mfunc, bfunc, numPointsPerEpoch)
% Initialize some useful values
N=0;
theta = zeros(3, 1);
theta_last = theta + 1;
while (norm(theta - theta_last) > 0.01) % Iterate until the weight change is small
  % NOTE: the condition must be a scalar. "abs(theta-theta_last) > 0.01" is a
  % vector, and Octave treats a vector while-condition as true only when ALL
  % of its elements are true, so that loop can exit prematurely.
  N = N + 1;
  theta_last = theta;
  % Generate points for the epoch
  [X, y] = getRandomPoints(numPointsPerEpoch, mfunc, bfunc);
  % Add intercept term
  X = [ones(numPointsPerEpoch, 1) X];
  % Stochastic gradient descent: one update per point
  for i = 1:numPointsPerEpoch
    % Negative gradient of the cross-entropy error at point i
    e = y(i) .* X(i, :) ./ (1 + exp(y(i) * (theta' * X(i, :)')));
    % Move the parameters along the negative gradient direction
    theta = theta + eta * e';
  end
end
% New, independent set of points to estimate the out-of-sample error
[X, y] = getRandomPoints(numPointsPerEpoch, mfunc, bfunc);
% Add intercept term
X = [ones(numPointsPerEpoch, 1) X];
% Cross-entropy error measure (X*theta is the same as (theta'*X')')
Eout = (1 / numPointsPerEpoch) * sum(log(1 + exp(-y .* (X * theta))));
end
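One Octave pitfall worth checking in the stopping test: a vector used as a while- (or if-) condition counts as true only when all of its elements are true, so an element-wise comparison like `abs(theta - theta_last) > 0.01` can end the loop as soon as a single component of the weight change dips below the threshold. A standalone toy check:

```octave
% Toy illustration of vector conditions vs. a scalar norm test
v = [0.5; 0.005; 0.5];    % one component is already "converged"
all(abs(v) > 0.01)        % -> 0: a loop conditioned on the vector would exit
norm(v) > 0.01            % -> 1: the scalar norm test keeps iterating
```

This is why a scalar criterion such as `norm(theta - theta_last) > 0.01` is the usual choice for convergence checks.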
I run this 100 times and average N and Eout to get the requested answers.
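For reference, the averaging step can be sketched as below. The learning rate 0.01, the 100 points per epoch, and the `mfunc`/`bfunc` placeholder values are assumptions here, not values from the assignment; substitute whatever you actually use.

```octave
% Hypothetical driver: average N and Eout over 100 independent runs
runs = 100;
mfunc = 1;  bfunc = 0;    % placeholder target-line parameters
Ns = zeros(runs, 1);
Eouts = zeros(runs, 1);
for r = 1:runs
  [Ns(r), ~, Eouts(r)] = trainLogisticRegression(0.01, mfunc, bfunc, 100);
end
printf('mean N = %.1f, mean Eout = %.4f\n', mean(Ns), mean(Eouts));
```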
Nonetheless, I'm missing something that I just can't quite pin down.
Any help is appreciated.