Freezing partial weights after n epochs #18572
Unanswered
johnathanchiu asked this question in DDP / multi-GPU / multi-node
Replies: 0
I want to be able to freeze a portion of my model after n epochs of training. I used the on_train_epoch_end callback to set those parameters to requires_grad=False. The issue I now encounter is that, immediately afterward, training hits a DDP unused-parameters error. Is there a correct way to freeze a portion of the model after n epochs while allowing training to continue?
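A minimal sketch of the setup described above, assuming a hypothetical LightningModule with an `encoder` submodule to freeze; the callback name and the `freeze_after_epochs` argument are illustrative, not from the original post. Passing `find_unused_parameters=True` to the DDP strategy is one commonly suggested workaround for the error, since the frozen parameters stop receiving gradients mid-training, at the cost of some per-step overhead:

```python
import pytorch_lightning as pl
from pytorch_lightning.strategies import DDPStrategy


class FreezeAfterNEpochs(pl.Callback):
    """Freeze a submodule's parameters once `freeze_after_epochs` epochs have run."""

    def __init__(self, freeze_after_epochs: int):
        self.freeze_after_epochs = freeze_after_epochs

    def on_train_epoch_end(self, trainer, pl_module):
        if trainer.current_epoch + 1 == self.freeze_after_epochs:
            # `encoder` is a placeholder for whichever submodule is being frozen.
            for p in pl_module.encoder.parameters():
                p.requires_grad = False


# Default DDP raises an error when some parameters produce no gradients;
# telling the strategy to expect unused parameters avoids the crash.
trainer = pl.Trainer(
    max_epochs=20,
    strategy=DDPStrategy(find_unused_parameters=True),
    callbacks=[FreezeAfterNEpochs(freeze_after_epochs=5)],
)
```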