
Freezing layers by name for fine-tuning in PyTorch

When fine-tuning, it is common to freeze most of a pretrained model and train only a few layers. model.named_parameters() makes it easy to select those layers by name.
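To see what the names look like, you can simply print them. The sketch below uses torchvision's resnet18 as a stand-in, since the article does not show how its own model is built; parameter names mirror the module hierarchy.

import torchvision.models as models

model = models.resnet18()  # Stand-in model for illustration; substitute your own network

# Names follow the module hierarchy, e.g. 'conv1.weight', 'layer1.0.conv1.weight', 'fc.bias'
for name, param in model.named_parameters():
  print(name, tuple(param.shape))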

 
 
# ######################################## Freeze some layers to fine-tune the model ########################
for name, param in model.named_parameters():  # Iterate over all parameters in the model together with their names
  if 'out' in name or 'merge' in name or 'before_regress' in name:  # Each keyword must be tested against name separately; `'out' or 'merge' in name` would always be truthy
    continue  # Leave parameters whose names match one of the keywords trainable
  param.requires_grad = False  # Freeze every other parameter so no gradient is computed for it
# #############################################################################################################
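To confirm the freeze worked as intended, a minimal check (reusing the model variable from above) is to list which parameters still require gradients:

trainable = [name for name, param in model.named_parameters() if param.requires_grad]
frozen = [name for name, param in model.named_parameters() if not param.requires_grad]
print(f'trainable: {len(trainable)}, frozen: {len(frozen)}')
print(trainable)  # Should only list names containing 'out', 'merge' or 'before_regress'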
 
 
# torch.optim.SGD is assumed here; momentum and weight_decay match its signature
optimizer = torch.optim.SGD(filter(lambda p: p.requires_grad, model.parameters()),
                            lr=opt.learning_rate * args.world_size,
                            momentum=0.9, weight_decay=5e-4)
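Filtering on p.requires_grad means the frozen parameters are never handed to the optimizer at all: they receive no updates, and SGD allocates no momentum buffers for them, so the frozen layers behave as a fixed feature extractor.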

That is all for this article on freezing layers by name for fine-tuning in PyTorch. I hope it gives you a useful reference, and I appreciate your continued support.