Modify finetune function #69
Conversation
os.path.splitext(os.path.basename(label_path))[0] + "_skel.mrc",
)

store_tomogram(filename=out_file, tomogram=ske)
As raised in #82, this will lose the information of the header.
Can we load the tomogram before calling skeletonization and pass it instead of the label path? load_tomogram gives you a Tomogram instance (class Tomogram), which could then be passed to store_tomogram instead of ske.
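A minimal sketch of the suggested flow, assuming the project's load_tomogram/store_tomogram helpers and a skeletonization function that operates on a plain array; the import path, out_folder, and the skeletonization signature are illustrative assumptions, not the exact API:

import os
from membrain_seg.segmentation.dataloading.data_utils import load_tomogram, store_tomogram  # assumed import path

tomo = load_tomogram(label_path)    # Tomogram instance, keeps the original MRC header
ske = skeletonization(tomo.data)    # assumed to accept the raw label array
tomo.data = ske                     # write the skeleton back into the Tomogram

out_file = os.path.join(
    out_folder,  # hypothetical output directory
    os.path.splitext(os.path.basename(label_path))[0] + "_skel.mrc",
)
store_tomogram(filename=out_file, tomogram=tomo)  # header is carried over from tomo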
def on_epoch_start(self, trainer, pl_module):
    current_lr = trainer.optimizers[0].param_groups[0]["lr"]
    print(f"Epoch {trainer.current_epoch}: Learning Rate = {current_lr}")

print_lr_cb = PrintLearningRate()
Do we need this callback in the standard function for the user?
I think no
@@ -266,3 +272,112 @@ def forward(
    # Normalize loss
    loss = loss / sum(self.weights)
    return loss


class PrintLearningRate(Callback):
Okay to keep this here, but I think it is not necessary to have it actively used in the fine-tune functionality. I guess this is only useful for debugging.
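If the learning-rate printout is wanted while debugging, the callback could be attached to the Trainer explicitly rather than being enabled inside the standard fine-tune function. A minimal sketch of that opt-in pattern (the flag name and Trainer arguments are illustrative assumptions):

from pytorch_lightning import Trainer

# Hypothetical opt-in flag; the standard fine-tune call would leave this off.
use_lr_printout = True

callbacks = []
if use_lr_printout:
    callbacks.append(PrintLearningRate())  # debugging-only learning-rate printout

trainer = Trainer(max_epochs=100, callbacks=callbacks)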
* add skeletonization code
* Second commit
* Second commit
* Second commit
* Second commit
* Third commit
* Third commit
* Fourth commit
* Fourth commit
* Fix data type warning and absolute value error
* Add finetune function
* Modify finetune function
* Add Fine-tuning.md file
---------
Co-authored-by: Hanyi11 <[email protected]>
Co-authored-by: Hanyi Zhang <[email protected]>
Co-authored-by: Hanyi Zhang <[email protected]>
Co-authored-by: Hanyi Zhang <[email protected]>
Main Changes:
* tomo.data is now passed into skeletonization, and the resulting skeleton replaces the original tomo.data for saving.
* Removed the PrintLearningRate callback from the fine-tuning function.