LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 6 (http://book.caltech.edu/bookforum/forumdisplay.php?f=135)
-   -   backpropagation at the final layer (http://book.caltech.edu/bookforum/showthread.php?t=1042)

jianpan 08-17-2012 05:56 PM

backpropagation at the final layer
 
Since \delta_1^{(L)}=\frac{\partial e(w)}{\partial s_1^{(L)}}, with e(w)=(x_1^{(L)}-y_n)^2 and x_1^{(L)}=\theta(s_1^{(L)}), I got \delta_1^{(L)}=2(x_1^{(L)}-y_n)(1-\theta^2(s_1^{(L)})). Is this correct? I found a Python program on Wikipedia about backpropagation that uses \delta_1^{(L)}=(x_1^{(L)}-y_n)(1-\theta^2(s_1^{(L)})), and I couldn't figure out where the 2 was dropped. Can someone confirm my derivation is correct? Thanks.
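
A quick numerical check of the derivation above (a minimal sketch, assuming \theta = \tanh as in the course; the values s = 0.3 and y_n = 1.0 are arbitrary examples, not from the post):

    import math

    # Arbitrary example values for the final-layer signal and the target.
    s = 0.3        # s_1^(L)
    y_n = 1.0      # target output

    def e(s):
        """Squared error e(w) = (x_1^(L) - y_n)^2 with x_1^(L) = tanh(s)."""
        x = math.tanh(s)
        return (x - y_n) ** 2

    # Analytic delta from the derivation: 2 (x - y_n) (1 - tanh^2(s)).
    x = math.tanh(s)
    delta_analytic = 2 * (x - y_n) * (1 - x ** 2)

    # Finite-difference estimate of d e / d s for comparison.
    h = 1e-6
    delta_numeric = (e(s + h) - e(s - h)) / (2 * h)

    print(delta_analytic, delta_numeric)  # the two values should agree closely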

yaser 08-17-2012 08:12 PM

Re: backpropagation at the final layer
 
Quote:

Originally Posted by jianpan (Post 4151)
Since \delta_1^{(L)}=\frac{\partial e(w)}{\partial s_1^{(L)}}, with e(w)=(x_1^{(L)}-y_n)^2 and x_1^{(L)}=\theta(s_1^{(L)}), I got \delta_1^{(L)}=2(x_1^{(L)}-y_n)(1-\theta^2(s_1^{(L)})). Is this correct? I found a Python program on Wikipedia about backpropagation that uses \delta_1^{(L)}=(x_1^{(L)}-y_n)(1-\theta^2(s_1^{(L)})), and I couldn't figure out where the 2 was dropped. Can someone confirm my derivation is correct? Thanks.

Some people define the error with a factor of \frac{1}{2} in it, in anticipation of the differentiation, hence the discrepancy.
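
To see the two conventions side by side, a small sketch (again assuming \theta = \tanh, with arbitrary example values) comparing the delta from e(w) = (x_1^{(L)}-y_n)^2 with the delta from the \frac{1}{2}(x_1^{(L)}-y_n)^2 variant:

    import math

    s, y_n = 0.3, 1.0
    x = math.tanh(s)            # x_1^(L) = theta(s_1^(L))
    theta_prime = 1 - x ** 2    # derivative of tanh

    # Convention used in the post: e = (x - y_n)^2  ->  delta carries the factor of 2.
    delta_book = 2 * (x - y_n) * theta_prime

    # Convention with the 1/2 factor: e = (1/2)(x - y_n)^2  ->  the 2 cancels.
    delta_half = (x - y_n) * theta_prime

    print(delta_book / delta_half)  # exactly 2.0: the conventions differ only by this constant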

jianpan 08-18-2012 12:05 AM

Re: backpropagation at the final layer
 
Thanks for the clarification, Professor.

