I am given a dataset that includes thousands of binary time series, each with a hundred time points. Here is a toy example of this data:
ID t1 t2 t3 t4 ... t100
A  1  0  0  0  ...  1
B  1  0  1  0  ...  0
C  1  1  0  0  ...  0
D  1  1  1  1  ...  0
E  1  0  1  1  ...  1
The goal is to predict for each ID the next 100, 200, and 300 time points.
A limitation is that only transformer-based time series models should be used.
Given the goal and the limitation, I assume that autoregressive transformer-based models are the only way to produce predictions over such long horizons (given the limited input of 100 time points).
Which transformer-based time series model architecture in neuralforecast would best fit this problem?
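To make the setup concrete, here is a minimal sketch of how I currently wire the data into neuralforecast. PatchTST is only a placeholder transformer model here (not the answer I am looking for), and the zero-padding flag and the 0.5 threshold for mapping continuous forecasts back to binary values are my own assumptions:

```python
import numpy as np
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import PatchTST  # placeholder transformer model

# Toy wide-format data: 5 IDs x 100 binary time points, as in the example above.
rng = np.random.default_rng(0)
wide = pd.DataFrame(
    rng.integers(0, 2, size=(5, 100)),
    index=list("ABCDE"),
    columns=[f"t{i}" for i in range(1, 101)],
)

# neuralforecast expects long format with columns unique_id / ds / y,
# so I assign arbitrary daily timestamps to the 100 time points.
time_index = pd.date_range("2024-01-01", periods=100, freq="D")
long_df = pd.concat(
    [
        pd.DataFrame({"unique_id": uid, "ds": time_index, "y": row.to_numpy()})
        for uid, row in wide.iterrows()
    ],
    ignore_index=True,
)

# Direct multi-step forecast of the next 100 points; h=200 / h=300 would be
# trained the same way (or the 100-step model could be applied recursively).
# start_padding_enabled zero-pads the series at the start, since each series
# has only 100 points, fewer than input_size + h (availability of this flag
# depends on the neuralforecast version).
model = PatchTST(h=100, input_size=100, max_steps=500, start_padding_enabled=True)
nf = NeuralForecast(models=[model], freq="D")
nf.fit(df=long_df)
forecasts = nf.predict()

# The model outputs continuous values; thresholding at 0.5 is my own assumption
# for mapping them back to binary predictions.
forecasts["PatchTST_binary"] = (forecasts["PatchTST"] >= 0.5).astype(int)
```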
Thanks for any advice.