Pruning useless weights


(Zuzanna Parcheta) #1

Hi everyone! I read some studies about pruning useless weights in translation models, which can reduce model size by about 90% [https://arxiv.org/pdf/1606.09274.pdf]. I would like to extract the model weights and run a study like the one in https://blog.gongzhitaao.org/useless-parameters/, but I don't know how to get the weights. I know how to load the model, but I don't see a property "weights" or anything similar.
I use the following code to inspect the loaded model:

-- print the keys of a table
function dir(obj)
    for k, v in pairs(obj) do print(k) end
end

require('onmt.init')
require('cutorch')
model = torch.load("/home/German/datasets/Lingvanex/EN-ES/sciling-corpus/exp17_12_19/models/_epoch10_3.82.t7")
dir(model) -- dir(model.models)

I'm not an expert in Lua, so please tell me how I can access the weights of each layer.
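For what it's worth, here is a minimal sketch of one way to list the weight tensors, assuming the checkpoint stores the networks in a `models` table (as `dir(model.models)` above suggests) and that each entry is a standard `nn` module exposing `:parameters()`. The names and path here are illustrative, not a confirmed OpenNMT API:

```lua
require('onmt.init')
require('cutorch')

-- hypothetical path; use your own checkpoint file
local checkpoint = torch.load('model_epoch10.t7')

-- assumption: checkpoint.models maps names (e.g. encoder/decoder) to nn modules
for name, net in pairs(checkpoint.models) do
  -- nn modules return their learnable tensors through :parameters()
  local params = net:parameters()
  for i, p in ipairs(params) do
    print(string.format('%s / tensor %d: %s', name, i,
                        table.concat(p:size():totable(), 'x')))
  end
end
```

Each `p` is a plain `torch.Tensor`, so you can histogram its values or count near-zero entries directly for the kind of study in the linked blog post.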

Regards


(Guillaume Klein) #2

Hello,

There are some examples in the tools directory of the repository.


(jean.senellart) #3

Hello @Sasanita,

You can also check here: https://github.com/harvardnlp/seq2seq-attn/commit/58cda6d7db846dca59797b1a283978a8fbd3e22d - I implemented this paper (and the different pruning modes) in seq2seq-attn, which is the parent project of OpenNMT (prune.lua here: https://github.com/harvardnlp/seq2seq-attn/blob/master/prune.lua). There should not be much change needed to adapt it to OpenNMT.
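The core of class-blind magnitude pruning can be sketched in a few lines, assuming `net` is an `nn` module; this is a simplified illustration of the idea, not the actual prune.lua implementation:

```lua
-- Zero out the fraction `rate` of weights with smallest absolute value,
-- across all layers at once ("class-blind" pruning from the paper).
local function prune(net, rate)
  local flat = net:getParameters()        -- flattened view of all weights
  local abs = torch.abs(flat)
  local sorted = torch.sort(abs)          -- magnitudes in ascending order
  local k = math.floor(rate * sorted:size(1))
  if k < 1 then return end
  local threshold = sorted[k]
  flat[torch.le(abs, threshold)] = 0      -- mask out the smallest weights
end
```

Because `getParameters()` returns a flat storage shared with the module's weight tensors, writing zeros into `flat` prunes the network in place; retraining afterwards recovers most of the lost accuracy, as the paper describes.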

Best,
Jean


(Zuzanna Parcheta) #4

Thank you! I will check it!
Regards