Is the expression for the BPR loss function right in the code? #27
Hi, the function […] Hope this helps,
Thanks very much Alberto. Actually I made a mistake in my first message: what I wanted to ask about is the loss derivative in the training process, which is just what you mentioned in the update function. However, the lossDerivative is not the same as the highlighted part in this picture: http://photo27.hexun.com/p/2019/0201/632421/b_vip_11118BF2E45A1C8033DC6DA3576F7A2C.jpg
I wonder whether we should use this expression:
1.0 / (1.0 + exp(scoreDifference))
or the expression from the paper:
exp(-scoreDifference) / (1.0 + exp(-scoreDifference))
to calculate the loss derivative?
Those two expressions are in fact equal :)
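(Writing x for scoreDifference, the equivalence is one line of algebra: multiply the numerator and denominator by e^(-x).)

```latex
\frac{1}{1 + e^{x}}
  = \frac{e^{-x}}{e^{-x}\,(1 + e^{x})}
  = \frac{e^{-x}}{e^{-x} + 1}
  = \frac{e^{-x}}{1 + e^{-x}}
```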
Got it, thanks very much Alberto.
As for the loss function and updating the latent vectors (parameters), I find that the implementation in the code is different from the paper.

In the code:

```cpp
Double BPREngine::loss(const Double scoreDifference) const {
  // -ln(sigmoid(x)) = ln(1 + e^{-x})
  return log(1.0 + exp(-scoreDifference));
}
```
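As a side note, here is a minimal standalone sanity check, not code from this repository: `loss` mirrors the quoted `BPREngine::loss`, and `lossDerivativeA`/`lossDerivativeB` are hypothetical names for the two expressions discussed above, so you can verify numerically that they line up.

```cpp
#include <cmath>
#include <cstdio>

// Loss as implemented in the code: -ln(sigmoid(x)) = ln(1 + e^{-x}).
double loss(double scoreDifference) {
  return std::log(1.0 + std::exp(-scoreDifference));
}

// Negative of d(loss)/dx, written the way it came up here: 1 / (1 + e^{x}).
double lossDerivativeA(double scoreDifference) {
  return 1.0 / (1.0 + std::exp(scoreDifference));
}

// The same quantity, written the way the paper has it: e^{-x} / (1 + e^{-x}).
double lossDerivativeB(double scoreDifference) {
  const double e = std::exp(-scoreDifference);
  return e / (1.0 + e);
}

int main() {
  const double xs[] = {-5.0, -1.0, 0.0, 1.0, 5.0};
  for (double x : xs) {
    std::printf("x=%5.1f  loss=%.6f  A=%.6f  B=%.6f\n",
                x, loss(x), lossDerivativeA(x), lossDerivativeB(x));
  }
  return 0;
}
```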
In the paper, the update rule (pasted from the picture) is:
http://photo27.hexun.com/p/2019/0201/632421/b_vip_11118BF2E45A1C8033DC6DA3576F7A2C.jpg

Θ ← Θ + α · ( e^(-x̂uij) / (1 + e^(-x̂uij)) · ∂x̂uij/∂Θ + λΘ · Θ )
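If it helps, here is a rough sketch of one LearnBPR SGD step for a matrix-factorization parameterization under that rule. All names (`wu`, `hi`, `hj`, `alpha`, `lambda`) are hypothetical, not taken from this repository, and the regularization is applied as shrinkage (with a minus sign), since that is the direction that maximizes the regularized objective.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

void bprSgdStep(std::vector<double>& wu,  // user u's latent vector
                std::vector<double>& hi,  // positive item i's latent vector
                std::vector<double>& hj,  // negative item j's latent vector
                double alpha,             // learning rate
                double lambda) {          // regularization strength
  // x_uij = <w_u, h_i> - <w_u, h_j>
  double xuij = 0.0;
  for (std::size_t k = 0; k < wu.size(); ++k) {
    xuij += wu[k] * (hi[k] - hj[k]);
  }

  // The multiplier from the paper: e^{-x_uij} / (1 + e^{-x_uij}).
  const double e = std::exp(-xuij);
  const double mult = e / (1.0 + e);

  // Gradients of x_uij: d/dw_u = h_i - h_j, d/dh_i = w_u, d/dh_j = -w_u.
  for (std::size_t k = 0; k < wu.size(); ++k) {
    const double wuk = wu[k];  // keep the pre-update value for h_i, h_j
    wu[k] += alpha * (mult * (hi[k] - hj[k]) - lambda * wuk);
    hi[k] += alpha * (mult * wuk - lambda * hi[k]);
    hj[k] += alpha * (-mult * wuk - lambda * hj[k]);
  }
}
```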