This repository has been archived by the owner on Sep 29, 2023. It is now read-only.
Create a more standard training loop interface for pretraining #8
Labels: enhancement (New feature or request)
Currently, `clmbr_train_model` hides the more familiar training-loop structure from users. In most demos and APIs, the boilerplate follows the pattern outlined at https://github.com/PyTorchLightning/pytorch-lightning.
In that pattern, specific details around the loss are configured in the model class, and the trainer class handles concerns like progress bars, choice of optimizer, etc.
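To make the requested separation of concerns concrete, here is a minimal, dependency-free sketch of that Lightning-style division of labor: the model owns the loss (`training_step`) and the optimizer choice (`configure_optimizers`), while a generic trainer owns the loop itself. The class and method names mirror the Lightning API, but everything below is a toy stand-in for illustration, not real Lightning or CLMBR code.

```python
class ToyOptimizer:
    """Stand-in for a torch.optim optimizer: nudges a scalar weight."""
    def __init__(self, model, lr=0.1):
        self.model, self.lr = model, lr

    def step(self, grad):
        self.model.weight -= self.lr * grad


class ToyModel:
    """Plays the role of a LightningModule: owns loss and optimizer choice."""
    def __init__(self):
        self.weight = 0.0

    def training_step(self, batch):
        # Squared-error loss and its gradient for one (input, target) pair.
        x, y = batch
        pred = self.weight * x
        loss = (pred - y) ** 2
        grad = 2 * (pred - y) * x
        return loss, grad

    def configure_optimizers(self):
        return ToyOptimizer(self)


class ToyTrainer:
    """Plays the role of pl.Trainer: the generic loop lives here."""
    def __init__(self, max_epochs=50):
        self.max_epochs = max_epochs

    def fit(self, model, dataloader):
        optimizer = model.configure_optimizers()
        for _ in range(self.max_epochs):
            for batch in dataloader:
                loss, grad = model.training_step(batch)
                optimizer.step(grad)
        return model


# Learn y = 2x from three examples.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
model = ToyTrainer(max_epochs=50).fit(ToyModel(), data)
print(round(model.weight, 2))  # converges toward 2.0
```

A refactor along these lines would let `clmbr_train_model` become a thin wrapper around `Trainer.fit(model, dataloader)`, keeping the current one-call convenience while exposing the standard loop underneath.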
What is the lift required to provide a demo and refactor to support this type of workflow?