#1
The problem asks us to prove that the optimal column vector points in the opposite direction of the inverse of the Hessian times the gradient, i.e. $-(\nabla^2 E(u,v))^{-1}\nabla E(u,v)$.
But didn't the chapter prove that the optimal direction to move in is the opposite of the gradient?
#2
Hope this helps.
__________________
When one teaches, two learn.
#3
Could you possibly restate Problem 3.17(b) for me? I don't quite understand what the question is asking. How does it relate to the gradient descent algorithm for logistic regression in the textbook?
#4
In gradient descent we studied a similar problem: find the direction to move that decreases the error the most for a given step size. That direction was the negative gradient of $E(u,v)$.
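(In case it helps to see that earlier argument spelled out, here is a sketch in my own notation, with $\mathbf{g} = \nabla E$ and step size $\eta$; the fixed-length constraint below is how I'm phrasing it, not a quote from the book.)

Code:
% First-order approximation of the change in error,
% minimized over steps of fixed length eta:
\Delta E \;\approx\; \mathbf{g}^{\mathrm T}\Delta\mathbf{w},
\qquad \|\Delta\mathbf{w}\| = \eta
% Cauchy-Schwarz gives g^T dw >= -||g|| eta, with equality
% exactly when dw points opposite to g, so
\Delta\mathbf{w}^{*} \;=\; -\,\eta\,\mathbf{g}/\|\mathbf{g}\|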
Part (a) defined a function $\hat{E}_2(\Delta u, \Delta v)$, the second-order Taylor approximation of $E(u,v)$ around the current point. Part (b) asks the analogous question for $\hat{E}_2$: over all vectors $(\Delta u, \Delta v)^{\mathrm T}$, show that the one minimizing $\hat{E}_2$ is $-(\nabla^2 E(u,v))^{-1}\nabla E(u,v)$, the Newton direction. The two answers differ because the gradient descent problem fixes the step size, while here you minimize the quadratic approximation over all of $(\Delta u, \Delta v)$ with no constraint. If you choose to use the gradient hint, the gradient of $\hat{E}_2$ with respect to $(\Delta u, \Delta v)$ is $\nabla E(u,v) + \nabla^2 E(u,v)\,(\Delta u, \Delta v)^{\mathrm T}$; setting it to zero gives the result.
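To spell the minimization out (a sketch in my own notation: $\Delta\mathbf{w} = (\Delta u, \Delta v)^{\mathrm T}$, $\mathbf{g} = \nabla E(u,v)$, $H = \nabla^2 E(u,v)$, and I'm assuming $H$ is positive definite so the stationary point is the minimum):

Code:
\hat{E}_2(\Delta\mathbf{w})
  \;=\; E(u,v) \;+\; \mathbf{g}^{\mathrm T}\Delta\mathbf{w}
  \;+\; \tfrac{1}{2}\,\Delta\mathbf{w}^{\mathrm T}H\,\Delta\mathbf{w}
% Set the gradient with respect to Delta w to zero:
\nabla_{\Delta\mathbf{w}}\hat{E}_2
  \;=\; \mathbf{g} + H\,\Delta\mathbf{w} \;=\; \mathbf{0}
\quad\Longrightarrow\quad
\Delta\mathbf{w}^{*} \;=\; -\,H^{-1}\mathbf{g}

If you want to sanity-check this numerically, here is a quick Python sketch; the surface E below is made up for illustration (it is not the E(u,v) from the problem), but any twice-differentiable function with a positive definite Hessian behaves the same way:

Code:
import numpy as np

# Hypothetical smooth error surface (NOT the textbook's E(u, v))
def E(w):
    u, v = w
    return np.exp(u) + u**2 + 3*u*v + 5*v**2 - 2*u - 4*v

def grad_E(w):
    u, v = w
    return np.array([np.exp(u) + 2*u + 3*v - 2,
                     3*u + 10*v - 4])

def hess_E(w):
    u, v = w
    return np.array([[np.exp(u) + 2.0, 3.0],
                     [3.0, 10.0]])  # positive definite everywhere

def E2_hat(w, dw):
    # Second-order Taylor approximation of E around w
    return E(w) + grad_E(w) @ dw + 0.5 * dw @ hess_E(w) @ dw

w = np.array([0.5, -0.5])
newton = -np.linalg.solve(hess_E(w), grad_E(w))   # -H^{-1} g

# The Newton direction should beat random perturbations of itself
rng = np.random.default_rng(0)
best_other = min(E2_hat(w, newton + 0.1 * rng.standard_normal(2))
                 for _ in range(1000))
print(E2_hat(w, newton) <= best_other)   # prints True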
__________________
Have faith in probability