multi stock data training #20
I think you need to change the parameter named "STOCK_DIM", which is the total number of stocks in the portfolio, in the Env train, validation, and trade files.
I don't think so. In this repo, all stocks in STOCK_DIM form a single portfolio; in other words, after each prediction the agent must take an action (sell, buy, or do nothing) on every stock according to the prediction result. But if I have STOCK_DIM=3000, I can't buy all of them; it would require a lot of money.
From my point of view, buy and sell actions are ranked according to the amount to buy at each time step. So you are right: if you have 3000 stocks in your action space, and could therefore potentially buy every one of the 3000, it can happen that not every stock is actually bought, because the buy function runs in a for loop and the actions are sorted by the amount to buy. Maybe you should increase the initial account balance from 100,000 to a higher amount, or lower the HMAX parameter to control the rescaling of the actions, because PPO/A2C and similar algorithms rely on a Gaussian distribution and their actions are scaled to [-1, 1].
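To illustrate the mechanism described above, here is a minimal sketch of how [-1, 1] policy outputs could be rescaled by HMAX into share counts and then ordered by magnitude before buys/sells are attempted. This is a hypothetical helper for illustration, not the repo's exact code; the function name `rescale_and_order` and the return layout are assumptions.

```python
import numpy as np

HMAX = 100  # max shares traded per stock per step (repo's default, per the comment above)

def rescale_and_order(actions, hmax=HMAX):
    """Rescale raw [-1, 1] policy outputs to share counts and rank them.

    Hypothetical sketch: multiplying by `hmax` converts the Gaussian
    policy output into a trade size, and sorting by value gives the
    order in which sells (most negative first) and buys (most positive
    first) would be attempted inside the environment's for loop.
    """
    actions = np.asarray(actions, dtype=float) * hmax  # rescale to share counts
    order = np.argsort(actions)                        # ascending: most-negative first
    sells = [int(i) for i in order if actions[i] < 0]
    buys = [int(i) for i in order[::-1] if actions[i] > 0]
    return actions, sells, buys

# Example: stock 1 is sold hardest, stocks 0 and 2 are bought in that order.
acts, sells, buys = rescale_and_order([0.5, -1.0, 0.1])
```

With a large STOCK_DIM, the buys late in this ordering can fail simply because the account balance runs out, which is the behavior described above.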
@emjay032 thank you. I'm trying another method: make every stock its own episode in the environment, so when one episode is done, the next stock starts another. I'm working on it now.
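The "one stock per episode" idea above could be sketched as an environment that cycles to a different stock's price series on each reset, so a single agent sees every stock over many episodes. The class name and state layout here are hypothetical, assuming a minimal gym-style `reset`/`step` interface:

```python
class OneStockPerEpisodeEnv:
    """Hypothetical sketch: each episode plays out one stock's series."""

    def __init__(self, stock_series):
        self.stock_series = stock_series  # list of per-stock price arrays
        self.cursor = -1                  # index of the current stock

    def reset(self):
        # Advance to the next stock, wrapping around to the first.
        self.cursor = (self.cursor + 1) % len(self.stock_series)
        self.prices = self.stock_series[self.cursor]
        self.t = 0
        return self.prices[self.t]

    def step(self, action):
        self.t += 1
        done = self.t >= len(self.prices) - 1
        reward = 0.0  # placeholder; a real env would compute P&L here
        return self.prices[self.t], reward, done, {}
```

Training a single model against this environment keeps one set of parameters while episodes rotate through all stocks.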
From what I understand, it takes 30 stocks' data to train a model. If there are many more stocks (> 30), I want to use all of them to train the model. How can I do that?
For example, first I train on 30 stocks' data, then call the reset() function, read the next 30 stocks' data, and continue training the model parameters. The key point is that I do not reinitialize the model.
Will this work?
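The proposal above can be sketched as a single model trained over successive 30-stock chunks, where the environment's data changes between chunks but the model's parameters persist. `DummyModel` below is a stand-in for a PPO/A2C agent; with stable-baselines3 one would instead rebuild the env for the next chunk and call `model.set_env(new_env)` followed by `model.learn(..., reset_num_timesteps=False)`.

```python
def chunked(stocks, size=30):
    """Yield consecutive chunks of at most `size` stocks."""
    for i in range(0, len(stocks), size):
        yield stocks[i:i + size]

class DummyModel:
    """Stand-in for an RL agent; its state persists across chunks."""
    def __init__(self):
        self.updates = 0
    def learn(self, data):
        self.updates += len(data)  # a real agent would run gradient updates

model = DummyModel()  # initialized ONCE, outside the loop
all_stocks = [f"TICKER_{i}" for i in range(75)]
for chunk in chunked(all_stocks, size=30):
    # Same model object every iteration: parameters carry over,
    # only the data (and, in the real repo, the environment) changes.
    model.learn(chunk)
```

Whether this helps depends on whether a policy learned on one 30-stock universe transfers to the next, but mechanically the loop does what the question describes.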