The basic idea is that every model has a method model.children() which returns its top-level layers. Within each layer there are parameters (or weights), which can be obtained by calling .parameters() on any child (i.e. layer). Every parameter has an attribute called requires_grad, which is True by default. True means the parameter will receive gradients during backpropagation, so to freeze a layer you set requires_grad to False for all of that layer's parameters. This can be done like this -
from torchvision import models

# Load a ResNet-50 pre-trained on ImageNet
model_ft = models.resnet50(pretrained=True)

ct = 0
for child in model_ft.children():
    ct += 1
    if ct < 7:
        # Freeze every parameter of this child (layer)
        for param in child.parameters():
            param.requires_grad = False
 
This freezes the first six of the ten top-level children of ResNet50 (conv1, bn1, relu, maxpool, layer1, and layer2). Hope this helps!
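As a quick sanity check, you can count how many parameters remain trainable and pass only those to the optimizer, so it skips the frozen ones entirely. A minimal sketch (the optimizer choice and learning rate here are just placeholders, not part of the original answer):

import torch.optim as optim

# Collect only the parameters that were not frozen above
trainable = [p for p in model_ft.parameters() if p.requires_grad]
print(f"{len(trainable)} trainable parameter tensors out of "
      f"{len(list(model_ft.parameters()))} total")

# Give the optimizer only the trainable parameters;
# lr=0.001 is an arbitrary placeholder value
optimizer = optim.SGD(trainable, lr=0.001)

Filtering the parameter list this way also avoids the error older PyTorch versions raise when an optimizer is handed parameters with requires_grad=False.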
P.S.: https://discuss.pytorch.org/t/how-the-pytorch-freeze-network-in-some-layers-only-the-rest-of-the-training/7088/2