Working on setup and train functionality #327
base: main
Conversation
Hi @reed0824, here are some comments on your latest draft. In addition to the itemized recommendations, I suggest that future PRs come from a branch of your forked repo, rather than your main branch. To create a new branch called branchname in the command line, you can use git checkout -b branchname.
def __init__(
    self,
    **kwargs,
)
Probably can remove this __init__ method. setup will take care of all the initializations.
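In other words, something like the following sketch (the setup body is elided here; self.meta just mirrors the attribute you use later):

    class XGB:
        # No __init__ needed: Python supplies a default one, and setup()
        # creates every attribute the other methods use (e.g. self.meta).
        def setup(self, **kwargs):
            ...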
#
def summary(self, **kwargs):

model = xgb.train(
Remember to use self.model here, in addition to any other places where you'd like to set an attribute (variable accessible by all the different methods in the XGB class).
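For example, the two halves of that pattern, sketched as fragments (the get_score call is only an illustration of reading the attribute back):

    # inside train(): store the fitted Booster on the instance, not in a local
    self.model = xgb.train(self.meta["params"], dtrain)

    # inside summary() or any other method: the attribute is reachable via self
    print(self.model.get_score(importance_type="weight"))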
def setup(
    self,
    'max_depth': 6,
    'min_child_weight': 1,
    'eta': .1,
    'subsample': 0.7,
    'colsample_bytree': 0.7,
    'objective': 'binary:logistic',
    'eval_metric': 'auc',
    **kwargs,
Nice, I think this is a good way to initialize the model parameters.
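For reference, one way this can look inside setup once the defaults and the keyword overrides are merged (a sketch, not your exact diff; it stores the result under self.meta["params"] to match the call further down):

    def setup(self, **kwargs):
        # defaults, overridable per call through keyword arguments
        params = {
            'max_depth': 6,
            'min_child_weight': 1,
            'eta': 0.1,
            'subsample': 0.7,
            'colsample_bytree': 0.7,
            'objective': 'binary:logistic',
            'eval_metric': 'auc',
        }
        params.update(kwargs)
        self.meta = {"params": params}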
self.model.compile(
    params=self.meta["params"],
    train=self.dtrain,
    test=self.dtest,
)
I don't see compile in the XGBoost notebook. Should it be included here, or at all?
Possibly not; I copied the same format as nn.py here, and model.compile does not yet exist for xgb.
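For reference, plain xgboost has no compile step at all; training goes straight from a parameter dict and a DMatrix to a Booster (variable names below are placeholders):

    import xgboost as xgb

    # xgb.train() returns the fitted Booster directly; there is nothing to compile
    dtrain = xgb.DMatrix(X_train, label=y_train)
    booster = xgb.train(params, dtrain, num_boost_round=100)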
def train(self, X_train, **kwargs):
    dtrain = xgb.DMatrix(X_train, label=y_train)
This is a good place to add something like self.model = xgb.train(...).
    return dtrain
def test(self, X_test, **kwargs):
I'm not sure if we need a separate test method. Perhaps it's best for now to define dtest like you do for dtrain within the train method, then pass it in to the call to xgb.train(...), e.g. using evals=[(dtest, "Test")]. You'd need to add Xtest as another argument to the train method.
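Something along these lines, as a sketch (y_test is also needed so the eval DMatrix has labels for the 'auc' metric; argument names are placeholders):

    def train(self, X_train, y_train, X_test, y_test, **kwargs):
        dtrain = xgb.DMatrix(X_train, label=y_train)
        dtest = xgb.DMatrix(X_test, label=y_test)
        # evals makes xgboost report the eval_metric on the held-out set
        # after every boosting round
        self.model = xgb.train(
            self.meta["params"],
            dtrain,
            evals=[(dtest, "Test")],
            **kwargs,
        )
        return self.model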
I've updated the setup() function, and added train/call/summary/test to the class. Any comments are welcome if you have time, Brian.
I have temporarily removed the plotting functionality.