05-30-2013, 12:34 PM
Elroch
Re: Gradient Descent on complex parameters (weights)

Originally Posted by Kais_M
Actually, there is multiplication of complex numbers: one complex number is a parameter we are trying to optimize, the other is the data. The data is represented in the Fourier domain, which is why it is complex. When taking the derivative with respect to the complex parameter and propagating it inside the formula for the sum of squared errors, you eventually have to take the derivative of (complex parameter times complex data) with respect to the complex parameter. For example, the complex parameters could be values of a transfer function, and the complex data the Fourier transform of a real signal.
Sorry, I think my last post just confused the issue.

If you have a function f : \mathbb{R}^2 \rightarrow \mathbb{R}, (x, y) \mapsto f(x, y), you know everything about it regardless of whether you think of (x, y) as the complex number x + iy or not.

Specifically, you know the value of the function at any one point relative to any other, and hence where its minimum is. So you can choose to forget it was ever a complex function, treat it as a real function of two real variables, and carry out the optimisation you want. That is enough, right?
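To make that concrete, here is a minimal sketch of the idea (all the names, the single-parameter model, and the synthetic data below are illustrative, not from the thread): minimise the sum of squared errors E(w) = sum_k |w X_k - Y_k|^2 for one complex parameter w and complex (Fourier-domain) data X, Y, by treating w = wr + i*wi as two ordinary real parameters and running plain real-valued gradient descent on (wr, wi).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(8) + 1j * rng.standard_normal(8)  # complex "data"
w_true = 2.0 - 0.5j                                       # parameter to recover
Y = w_true * X                                            # targets

def error(wr, wi):
    """Sum of squared errors: a real-valued function of two real variables."""
    r = (wr + 1j * wi) * X - Y
    return float(np.sum(np.abs(r) ** 2))

# With r = (wr + i*wi) X - Y, differentiating E wrt the two real
# coordinates gives dE/dwr = 2 Re(sum conj(X) r) and
# dE/dwi = 2 Im(sum conj(X) r), so one complex accumulator suffices.
wr, wi = 0.0, 0.0   # start from w = 0
lr = 0.01
for _ in range(500):
    r = (wr + 1j * wi) * X - Y
    g = np.sum(np.conj(X) * r)
    wr -= lr * 2 * g.real
    wi -= lr * 2 * g.imag

# (wr, wi) now approximates (Re(w_true), Im(w_true))
```

Note that no notion of a "complex derivative" is needed anywhere: E is just a real function on the (wr, wi) plane, which is exactly the point being made above.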