Dear all, dear Professors,
I am trying to implement David MacKay's method of setting l2 regularizer in neural nets without need of cross-validation. It is briefly described in Geoffrey Hinton's lecture 5 and 6 of week 9, and in original papers (you can find all on the net): MacKay - A Practical Bayesian Framework for Backprop Networks MacKay - The Evidence Framework Applied to Classification Networks Approximation for regression problem is ![]() where ![]() ![]() The question is there similar approximation for classification problem? MacKay's 'The Evidence Framework Applied to Classification Networks', p.3 postulates the two frameworks are identical with only exception that ![]() ![]() ![]() ![]() ![]() Thank you in advance for your help! |