[Feature Request] Support plugin for loss (in pytorch backend) #4189

Open
ChiahsinChu opened this issue Oct 7, 2024 · 0 comments

Comments

@ChiahsinChu
Contributor

Summary

Currently, the loss function module is selected by `deepmd.pt.train.training.get_loss` based on the loss `type` keyword. As a result, a user cannot implement a custom loss module in a plugin without modifying the code of the deepmd-kit package itself. I wonder whether it would be possible to add a plugin hook for loss modules, as has already been done for the descriptors, etc.

Detailed Description

See above.
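
For concreteness, here is a rough, self-contained sketch of the kind of registration hook I have in mind. None of the names below (`TaskLoss`, `register`, `get_class_by_type`, `my_custom_loss`) are claimed to be existing deepmd-kit API; they only imitate the style of the plugin mechanism used for descriptors.

```python
# Hypothetical sketch of a pluggable loss registry; the names are illustrative
# only and do not reflect the current deepmd-kit code.
from typing import Callable, Dict, Type

import torch


class TaskLoss(torch.nn.Module):
    """Hypothetical base class for pluggable loss modules."""

    # registry mapping the loss "type" keyword to a loss class
    _registry: Dict[str, Type["TaskLoss"]] = {}

    @classmethod
    def register(cls, key: str) -> Callable[[Type["TaskLoss"]], Type["TaskLoss"]]:
        """Decorator that registers a loss class under a type keyword."""

        def decorator(loss_cls: Type["TaskLoss"]) -> Type["TaskLoss"]:
            cls._registry[key] = loss_cls
            return loss_cls

        return decorator

    @classmethod
    def get_class_by_type(cls, key: str) -> Type["TaskLoss"]:
        """Look up a registered loss class; raise for unknown keywords."""
        try:
            return cls._registry[key]
        except KeyError as e:
            raise ValueError(f"Unknown loss type: {key}") from e


# A third-party plugin package could then register its own loss at import
# time, without touching any code inside deepmd-kit:
@TaskLoss.register("my_custom_loss")
class MyCustomLoss(TaskLoss):
    def forward(self, model_pred: torch.Tensor, label: torch.Tensor) -> torch.Tensor:
        # toy example: plain mean-squared error
        return torch.mean((model_pred - label) ** 2)


# ...and get_loss would reduce to a registry lookup instead of a hard-coded
# if/elif chain over the known loss types:
def get_loss(loss_params: dict) -> TaskLoss:
    loss_type = loss_params.pop("type", "ener")
    return TaskLoss.get_class_by_type(loss_type)(**loss_params)
```

With such a hook, the training code would only need the registry lookup, and any loss type reachable from an installed plugin (e.g. `"type": "my_custom_loss"` in the input JSON) would work the same way as the built-in ones.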

Further Information, Files, and Links

No response
