
[Feature] Log metrics to visualizer on test run #1450

Open
InakiRaba91 opened this issue Dec 3, 2023 · 0 comments

Comments


InakiRaba91 commented Dec 3, 2023

What is the feature?

The metrics computed during a test run to evaluate a model do not appear to be logged to the visualizer (e.g., MLflow). They are passed to the logger hook here, but there is no call to the visualizer as there is in the train/val loops.

I've submitted a tentative PR with the proposed change. Let me know if I'm missing something obvious and this is already covered. Thanks!
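To illustrate the gap, here is a minimal sketch of the proposed behavior: after a test epoch, the hook forwards the computed metrics to the visualizer, mirroring what already happens in train/val. The `StubVisualizer` and `StubLoggerHook` classes and their method names are illustrative stand-ins, not the library's actual API.

```python
class StubVisualizer:
    """Minimal stand-in for a visualizer backend (e.g., an MLflow writer)."""

    def __init__(self):
        self.logged = []

    def add_scalars(self, scalar_dict, step=0):
        # A real backend would push these values to MLflow/TensorBoard/etc.
        self.logged.append((step, dict(scalar_dict)))


class StubLoggerHook:
    """Sketch of a logger hook that also notifies the visualizer on test."""

    def __init__(self, visualizer):
        self.visualizer = visualizer

    def after_test_epoch(self, metrics, step=0):
        # Existing behavior: metrics reach the logger. Proposed addition:
        # mirror the train/val path by also sending them to the visualizer.
        self.visualizer.add_scalars(metrics, step=step)


visualizer = StubVisualizer()
hook = StubLoggerHook(visualizer)
hook.after_test_epoch({"test/accuracy": 0.91, "test/loss": 0.23}, step=1)
print(visualizer.logged)
```

With this change, test metrics would show up in the visualizer backend alongside the train/val curves instead of only appearing in the plain-text logs.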
