05-30-2013, 08:25 AM
Elroch
Re: Gradient Descent on complex parameters (weights)

Originally Posted by Kais_M
Thank you for the quick reply. I am using a real error measure, the sum of squared errors, but it is a function of complex parameters. When deriving the equations for the error and the update rule for gradient descent, you reach a point (unless I'm making the same mistake every time) where you have to compute the derivative with respect to a complex parameter. I do not have any intuition for that. It seems Dr. Yaser is saying that you have to look at a complex parameter as a 2D vector of real numbers and compute the derivative with respect to that vector; this is why the number of parameters doubles. Is this an "engineering" solution, or is it really mathematically correct? There seems to be more to this than meets the eye.
Don't worry, it's just as simple as it appears. For this purpose, a complex parameter is simply two real parameters, since there is no multiplication by complex numbers involved.

z = x + iy \implies dz = dx + i\,dy

\frac{\partial}{\partial z} = \left( \frac{\partial}{\partial x}, \frac{\partial}{\partial y} \right)
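To make this concrete, here is a small sketch (my own illustration, not from the course materials) of gradient descent on a sum-of-squared-errors cost E(w) = ||Xw - y||^2 with a complex weight vector. Each complex parameter w_k = u_k + i v_k is treated as two real parameters, exactly as above, and we descend on (dE/du_k, dE/dv_k). The data X, y and the dimensions are made up for the demonstration:

```python
import numpy as np

# Synthetic complex least-squares problem (hypothetical example).
rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))
w_true = rng.standard_normal(d) + 1j * rng.standard_normal(d)
y = X @ w_true

u = np.zeros(d)                      # real parts of the parameters
v = np.zeros(d)                      # imaginary parts of the parameters

# Step size chosen small enough for stability (the Hessian of E in the
# real coordinates (u, v) is governed by the eigenvalues of 2 X^H X).
eta = 0.25 / np.linalg.norm(X.conj().T @ X, 2)

for _ in range(2000):
    w = u + 1j * v
    r = X @ w - y                    # residual vector Xw - y
    g = X.conj().T @ r               # X^H r
    u -= eta * 2 * g.real            # dE/du = 2 Re(X^H r)
    v -= eta * 2 * g.imag            # dE/dv = 2 Im(X^H r)

print(np.allclose(u + 1j * v, w_true))
```

Notice that the two real updates recombine into the single complex update w ← w − 2η X^H r, which is one way to see that treating a complex parameter as a 2D vector of reals is mathematically sound, not just an engineering shortcut.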