Not clear how cross_validation is being implemented - requires more documentation #701
Replies: 3 comments
-
Hey @anonymouspapi, here is a description of cross-validation:
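As a quick illustration, a minimal sketch of a typical call, assuming the standard NeuralForecast API; the model, horizon, and window settings below are purely illustrative, not a recommendation:

```python
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF

# One model, monthly data; horizon, input_size and max_steps are illustrative.
nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=50)], freq='M')

# Rolling temporal cross-validation: 3 evaluation windows, each shifted by 12 steps.
cv_df = nf.cross_validation(df=AirPassengersDF, n_windows=3, step_size=12)

# The returned frame has one row per series/timestamp in each test window,
# a 'cutoff' column marking the end of the training data for that window,
# the observed 'y', and one prediction column per model.
print(cv_df.columns.tolist())  # e.g. ['unique_id', 'ds', 'cutoff', 'y', 'NHITS']
```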
-
Great, thanks a lot. I had already double-checked this and cross-referenced it with the core class, but the implementation was a bit odd. It's not very clear, and this is a pretty significant step for modelling purposes. I will look at the Contribution issues and Documentation issues and see if I can add anything to support it or contribute more examples. Thanks for your great work!
-
I did not understand it either! What is the difference between nf.fit and nf.cross_validation? It is important, and the docs are too difficult to understand.
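For concreteness, here is a rough sketch of how I currently read the two calls; this is my assumption based on the docs, with made-up parameter values, so please correct me if it's wrong:

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=50)], freq='M')

# fit + predict: train on the full history, then forecast h steps
# past the last observed timestamp (no held-out data to score against).
nf.fit(df=AirPassengersDF)
future_df = nf.predict()

# cross_validation: evaluate over rolling windows at the end of the series,
# predicting the h steps after each cutoff, so every forecast can be
# compared against an observed 'y' value in the returned dataframe.
cv_df = nf.cross_validation(df=AirPassengersDF, n_windows=3, step_size=12)
```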
-
Hi, this is very important: it is not clear how you are implementing the cross-validation strategy. Predictions from the K models in the rolling K-fold temporal cross-validation are merged, but neither the documentation nor the code makes clear exactly how the final dataframe is produced, since folds may cover overlapping periods.
For example, assume we have 5 folds: it is not clear which of the 5 models is used for the final predictions. With hourly data, we may end up with multiple overlapping model predictions for both in-sample and out-of-sample periods.
It would be great if the documentation provided a clear diagram, or something similar, specifying exactly what is done for each fold.
Edit: For example, is it fair to assume that each unique 'cutoff' value is treated as a separate fold, so that each model's performance is computed on that fold independently?
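If that assumption holds (each unique 'cutoff' is one fold), per-fold evaluation could be done along these lines. This is a sketch of my interpretation, not confirmed library behavior, and the helper below is plain pandas rather than part of NeuralForecast:

```python
import pandas as pd

# cv_df is the frame returned by nf.cross_validation(...), with columns
# like ['unique_id', 'ds', 'cutoff', 'y', 'NHITS'].
def mae_per_fold(cv_df: pd.DataFrame, model_col: str = 'NHITS') -> pd.Series:
    # Treat each unique cutoff as a separate fold and score it independently,
    # so overlapping forecast periods from different folds are never mixed.
    errors = (cv_df[model_col] - cv_df['y']).abs()
    return errors.groupby(cv_df['cutoff']).mean()

# Example usage (assuming cv_df exists):
# print(mae_per_fold(cv_df))
```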