While doing model hyperparameter optimization, why is the seed being tuned? #500
cahuja1992 started this conversation in General
Replies: 1 comment
Hey @cahuja1992, if you want, you can tune the random seed to escape bad local optima, since changing the seed changes the parameter initialization. It also gives you the possibility to bootstrap the model's initialization by fixing different random seeds.
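To illustrate the idea, here is a minimal hypothetical sketch (not from the discussion itself): the seed is treated as just another hyperparameter, so each candidate value yields a different weight initialization, and the one with the best validation score is kept. The model, dataset, and candidate seeds are arbitrary choices for demonstration.

```python
# Hypothetical sketch: searching over random seeds like any other
# hyperparameter, so different weight initializations are explored.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy dataset and a fixed train/validation split.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0
)

results = {}
for seed in [0, 1, 2, 3, 4]:  # candidate seeds, chosen arbitrarily
    # Each seed gives a different weight initialization; a bad one may
    # leave the optimizer stuck in a poor local optimum.
    model = MLPClassifier(
        hidden_layer_sizes=(16,), max_iter=300, random_state=seed
    )
    model.fit(X_tr, y_tr)
    results[seed] = model.score(X_val, y_val)

best_seed = max(results, key=results.get)
print(f"best seed: {best_seed}, val accuracy: {results[best_seed]:.3f}")
```

Evaluating several seeds this way also gives a spread of scores, which hints at how sensitive the model is to initialization.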
I can see that for most of the models, the seed is being tuned as well. Can anyone help me understand why that is the case?