I have been reading the 63-page paper "Training Deep and Recurrent Networks with Hessian-Free Optimization". The key trait of Hessian-free optimization is that it keeps the benefits of Hessian-based (second-order) training while never forming or storing the full Hessian, which saves most of the Hessian memory and lets each step make much more progress than SGD-style methods.
Since uis-rnn is described as an RNN-based algorithm, I would like to suggest adding an optional Hessian-free training mode so that tasks of a comparable scale can be trained on top of the existing uis-rnn prior. The core ingredient is the Hessian-vector (curvature-vector) product, sketched below.
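As a rough illustration only (not part of uis-rnn and not taken from the paper's exact pseudocode), a Hessian-vector product can be computed with two backward passes, so the full Hessian is never materialized. The function name and arguments below are placeholders for this sketch, assuming a PyTorch model like uis-rnn:

```python
import torch

def hessian_vector_product(loss, params, vec):
    """Compute H @ vec without materializing H (Pearlmutter-style double backprop).

    loss   -- scalar loss tensor with its autograd graph still attached
    params -- list of parameter tensors
    vec    -- list of tensors with the same shapes as params
    """
    # First backward pass: gradients, keeping the graph so we can differentiate again.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    # Dot product grad . vec; a second backward pass of this scalar yields H @ vec.
    dot = sum((g * v).sum() for g, v in zip(grads, vec))
    return torch.autograd.grad(dot, params, retain_graph=True)
```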
As a concrete example, take the Preconditioned Conjugate Gradient (PCG) algorithm described around page 9 of "Training Deep and Recurrent Networks with Hessian-Free Optimization", which brings together the generalized Gauss-Newton matrix, damping, and preconditioning. Comparing Gauss-Newton with RNN training, both work with sequences of residuals and their evaluation, which is why the method seems like a natural fit here; a sketch of the PCG inner loop follows.
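Below is a minimal, generic PCG sketch, assuming the curvature-matrix-vector product (for example a Gauss-Newton-vector product built on top of the HVP sketch above) is supplied by the caller and that the parameters are flattened into a single 1-D tensor. It omits the paper's specific damping schedule and CG termination heuristics; all names are hypothetical:

```python
import torch

def pcg(apply_curvature, grad, damping=1e-2, precond_diag=None,
        max_iter=50, tol=1e-10):
    """Preconditioned CG for (G + damping * I) p = -grad.

    apply_curvature(v) -- returns G @ v (e.g. a Gauss-Newton-vector product)
    precond_diag       -- diagonal preconditioner, same shape as grad (identity if None)
    """
    x = torch.zeros_like(grad)
    r = -grad.clone()                                  # residual r = b - A x with x = 0, b = -grad
    z = r / precond_diag if precond_diag is not None else r.clone()
    p = z.clone()
    rz = torch.dot(r, z)
    for _ in range(max_iter):
        Ap = apply_curvature(p) + damping * p          # damped curvature-vector product
        alpha = rz / torch.dot(p, Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if torch.dot(r, r) < tol:                      # simple residual-norm stopping rule
            break
        z = r / precond_diag if precond_diag is not None else r.clone()
        rz_new = torch.dot(r, z)
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

In a Hessian-free step, the returned direction would then be applied to the flattened parameters, with the damping constant adapted between steps (the paper discusses Levenberg-Marquardt-style adjustment for this).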
To summarize, this is a feature request for Hessian-free support on the existing uis-rnn testing setup. The report is not tied to any particular release or downloaded version; it is only a suggestion to add Hessian-free optimization as an option for large-scale datasets with long residual sequences.