Thursday, 16 July 2020
How to kill all processes by command name in Ubuntu?
ps aux | grep "python demo.py" | grep -v grep | awk '{print $2}' | xargs kill -9
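An alternative, assuming the procps pkill utility that ships with Ubuntu, is to match the full command line directly instead of piping through grep and awk:
$ pkill -9 -f "python demo.py"
The -f flag matches the pattern against the entire command line, and -9 sends SIGKILL.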
Monday, 6 July 2020
How to transfer a dict between Python 3 and Python 2?
A better way to transfer a dict between Python 2 and Python 3 is to use an .npy file instead of JSON or pickle.
In Python 3:
import numpy as np
np.save('temp.npy', data)
In Python 2:
import numpy as np
data = np.load('temp.npy', allow_pickle=True).item()
(Note that NumPy on the Python 2 side must be upgraded to a recent version.)
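For context, here is a minimal end-to-end sketch; the dict contents are just illustrative, and both interpreters are assumed to have a NumPy version that supports the allow_pickle argument.
# Python 3 side: save the dict
import numpy as np
data = {'name': 'demo', 'scores': [0.1, 0.2, 0.3]}
np.save('temp.npy', data)  # the dict is pickled inside the .npy container

# Python 2 side: load it back
import numpy as np
data = np.load('temp.npy', allow_pickle=True).item()  # .item() unwraps the 0-d object array
print(data['scores'])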
Friday, 3 July 2020
How to quickly find a process in Ubuntu and kill it?
In the terminal:
$ ps axuw | grep <name>
$ ps axuw | grep vim
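Once the PID shows up (it is the second column of the ps output), kill it; the PID 12345 below is just a placeholder:
$ kill 12345        # polite SIGTERM first
$ kill -9 12345     # SIGKILL if the process refuses to exit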
Thursday, 2 July 2020
How to freeze PyTorch network parameters during training?
The basic idea is that every model has a method model.children() which returns its layers. Within each layer there are parameters (or weights), which can be obtained by calling .parameters() on any child (i.e. layer). Every parameter has an attribute called requires_grad, which is True by default. True means the parameter will be updated during backpropagation, so to freeze a layer you need to set requires_grad to False for all parameters of that layer. This can be done like this:
from torchvision import models

# Load a pretrained ResNet-50 and freeze its first six child modules.
model_ft = models.resnet50(pretrained=True)
ct = 0
for child in model_ft.children():
    ct += 1
    if ct < 7:
        for param in child.parameters():
            param.requires_grad = False
This freezes the first 6 of the 10 top-level children of ResNet-50. Hope this helps!
p/s: https://discuss.pytorch.org/t/how-the-pytorch-freeze-network-in-some-layers-only-the-rest-of-the-training/7088/2
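A common follow-up step, not covered in the snippet above, is to hand the optimizer only the parameters that still require gradients; the SGD settings below are placeholder values.

import torch

# Only parameters with requires_grad=True are given to the optimizer,
# so the frozen layers never enter the optimizer state.
optimizer = torch.optim.SGD(
    (p for p in model_ft.parameters() if p.requires_grad),
    lr=0.001, momentum=0.9)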