Thread: Hw5 Q8 E_out
  #1  
05-05-2013, 06:55 PM
marek
Member
 
Join Date: Apr 2013
Posts: 31
Hw5 Q8 E_out

I am struggling to understand how to calculate E_{out} in this question. I have two competing theories, which I will describe below. Any help is greatly appreciated.

Once the algorithm terminates, I have w^{(t)}. I now generate a new set of data points \{X_i\}_{i=1}^M and use my original target function to generate the corresponding Y_i = f(X_i).
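In case my setup is unclear, here is a rough Python/NumPy sketch of how I build this evaluation set (assuming the usual setup for this problem of inputs uniform in [-1,1]^2 plus a bias coordinate, and a target f returning +/-1; the function name is just mine):

import numpy as np

def make_test_set(f, M, rng):
    # Draw M fresh inputs uniformly from [-1, 1]^2, prepend the bias
    # coordinate x0 = 1, and label each point with the target f.
    pts = rng.uniform(-1.0, 1.0, size=(M, 2))
    X = np.hstack([np.ones((M, 1)), pts])
    Y = np.array([f(x) for x in pts])   # Y_i = f(X_i) in {-1, +1}
    return X, Y

So for example X_test, Y_test = make_test_set(f, 1000, np.random.default_rng()).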

Case 1. Just use the same cross entropy error calculation but on this new data set.

E_{out} = \frac{1}{M} \sum_{i=1}^M \ln (1+e^{-Y_i w^\top X_i})
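In code, Case 1 would be something like this (again NumPy; assuming X already carries the bias coordinate and Y is +/-1):

import numpy as np

def cross_entropy_eout(w, X, Y):
    # Case 1: average cross-entropy error on the fresh sample,
    # (1/M) * sum_i ln(1 + exp(-Y_i * w^T X_i)).
    return np.mean(np.log1p(np.exp(-Y * (X @ w))))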

Case 2. Directly calculate the expected output of our hypothesis function and compare to Y_i.

g(X_i) = +1 with probability \theta (w^\top X_i) = \frac{1}{1+e^{-w^\top X_i}}

Ultimately this gives us the probability that our hypothesis aligns with Y:

P(Y_i | X_i) = \theta(Y_i w^\top X_i)

In the lectures/book, we would multiply these probabilities to get the "likelihood" that the data was generated by this hypothesis. However, it seems that averaging the complementary probabilities 1 - P(Y_i | X_i) should give the expected error on this sample.

E_{out} = \frac{1}{M} \sum_{i=1}^{M} (1-P(Y_i | X_i))
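And Case 2 would be roughly (same assumptions as above; the function name is mine):

import numpy as np

def expected_disagreement_eout(w, X, Y):
    # Case 2: average probability that the probabilistic hypothesis
    # disagrees with the target label, (1/M) * sum_i (1 - theta(Y_i * w^T X_i)).
    p_correct = 1.0 / (1.0 + np.exp(-Y * (X @ w)))   # theta(Y_i w^T X_i)
    return np.mean(1.0 - p_correct)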

It feels as though the first approach is the correct one, but I struggle because the second approach makes intuitive sense, since that is how I historically would have calculated E_{out}. To make matters worse, the two approaches closely approximate different answers in the question!