
Default optimizer in Scholar.Linear.LogisticRegression #300

Closed
krstopro opened this issue Sep 12, 2024 · 3 comments

Comments

@krstopro
Member

It seems there is an issue with the training of Scholar.Linear.LogisticRegression; see here. I believe this is because SGD is used as the default optimizer. Setting `optimizer: :adam` in fit/3 solves the problem. Perhaps we should change the default optimizer?
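For reference, a rough sketch of what overriding the optimizer in fit/3 might look like. The tensors and `num_classes: 2` below are made-up for illustration, and it is assumed that `:adam` is accepted as an optimizer atom by the `:optimizer` option:

```elixir
# Illustrative data only; assumes Scholar and Nx are available and that
# fit/3 accepts an :optimizer option (e.g. :adam instead of the default SGD).
x = Nx.tensor([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y = Nx.tensor([0, 0, 1, 1])

model =
  Scholar.Linear.LogisticRegression.fit(x, y,
    num_classes: 2,
    optimizer: :adam
  )

Scholar.Linear.LogisticRegression.predict(model, x)
```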

@josevalim
Contributor

Sounds good to me.

@krstopro
Member Author

krstopro commented Sep 12, 2024

The issue happens to be more complex than the default optimizer. Using Adam instead of SGD seems to have worked with the binary backend, but not with EXLA. However, I am not able to investigate further, as I cannot install EXLA on my M1 Mac for some reason. Is this expected?
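For anyone trying to reproduce the backend discrepancy, a rough sketch of switching between the pure-Elixir backend and EXLA, assuming `:exla` has already been added as a dependency in mix.exs:

```elixir
# Run the same fit/3 call under each backend to compare results.

# Pure-Elixir backend (Nx default, no native dependencies):
Nx.global_default_backend(Nx.BinaryBackend)

# EXLA backend (assumes :exla is listed in mix.exs deps; alternatively,
# set `config :nx, default_backend: EXLA.Backend` in config/config.exs):
Nx.global_default_backend(EXLA.Backend)
```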

@krstopro
Member Author

Closing this issue, as the problem doesn't seem to be in the optimizer. See #301.
