From adf9b1a88983bd065b7c0dbfd25f0b313bbaf2ba Mon Sep 17 00:00:00 2001
From: Brad Windsor
Date: Tue, 3 Oct 2023 13:59:58 -0400
Subject: [PATCH] Remove `get_lr()` from logs which refers to nonexistent
 function

`get_lr()` is called as part of this function, but the function is not
declared anywhere in the script. This change removes this portion of the
code since it is unnecessary.
---
 chapters/en/chapter7/6.mdx | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/chapters/en/chapter7/6.mdx b/chapters/en/chapter7/6.mdx
index 7a498a863..6c6418c75 100644
--- a/chapters/en/chapter7/6.mdx
+++ b/chapters/en/chapter7/6.mdx
@@ -870,7 +870,6 @@ for epoch in range(num_train_epochs):
         if step % 100 == 0:
             accelerator.print(
                 {
-                    "lr": get_lr(),
                     "samples": step * samples_per_step,
                     "steps": completed_steps,
                     "loss/train": loss.item() * gradient_accumulation_steps,
@@ -912,4 +911,4 @@ And that's it -- you now have your own custom training loop for causal language

-{/if}
\ No newline at end of file
+{/if}
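
For context, if the learning-rate entry were to be kept in the logs instead of removed, the missing helper could be defined by reading the current rate from the optimizer's parameter groups, which PyTorch exposes as `optimizer.param_groups`. This is only a sketch of one possible `get_lr()` definition, not part of the patch or the course script:

```python
import torch

# Hypothetical stand-in for the training setup; the course script uses
# its own model and AdamW optimizer.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4)


def get_lr():
    # Each entry in param_groups is a dict holding the hyperparameters
    # for that group, including the current "lr" value.
    return optimizer.param_groups[0]["lr"]


print(get_lr())  # 0.0005
```

With such a helper in scope, the removed `"lr": get_lr()` log entry would work; absent it, removing the entry, as this patch does, is the simpler fix.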