OpenNMT Forum

Problem in using extract_embeddings.lua

Hello,
I want to use extract_embeddings.lua with a model I trained in OpenNMT-tf.
I run
th tools/extract_embeddings.lua -model /My_path_to_model/ -output_dir /My_output_path -gpuid 1
but I get this error:

Loading model '/My_path_to_model/'...
/root/torch/bin/luajit: tools/extract_embeddings.lua:39: unable to load the model (/root/torch/share/lua/5.1/torch/File.lua:259: read error: read 0 blocks instead of 1 at /root/torch-distro/pkg/torch/lib/TH/THDiskFile.c:352). If you are extracting a GPU model, it needs to be loaded on the GPU first (set -gpuid > 0)
stack traceback:
[C]: in function 'error'
tools/extract_embeddings.lua:39: in function 'main'
tools/extract_embeddings.lua:81: in main chunk
[C]: in function 'dofile'
/root/torch/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
[C]: at 0x00405d50

But I can't figure out the reason for this error.

P.S.: I use the OpenNMT Docker image, and I pass --gpus all when running the container.

Thank you for your help.
Hadis

Hi,

OpenNMT-lua is not compatible with OpenNMT-tf in any way, so extract_embeddings.lua cannot load an OpenNMT-tf checkpoint.

You could extract the embeddings with a small Python script. Hopefully you can adapt it to your needs:

import tensorflow as tf
import opennmt

# Build the same model architecture that was used for training.
model = opennmt.models.TransformerBase()
model.initialize({
    "source_vocabulary": path_to_source_vocabulary,
    "target_vocabulary": path_to_target_vocabulary
})
model.create_variables()

# Restore the weights from the training checkpoint.
checkpoint = tf.train.Checkpoint(model=model)
checkpoint.restore(path_to_checkpoint)

# The embedding matrices as NumPy arrays of shape [vocab_size, depth].
source_embedding = model.features_inputter.embedding.numpy()
target_embedding = model.labels_inputter.embedding.numpy()
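If you then want the embeddings on disk in a word2vec-style text format, you could dump each array next to its vocabulary with a small helper. This save_embeddings function and its file layout are my own sketch, not part of the OpenNMT-tf API; it assumes the vocabulary file lists one token per line, in the same order as the checkpoint's embedding rows.

```python
def save_embeddings(vocab_path, embedding, output_path):
    """Write a word2vec-style text file: a "count dim" header line,
    then one "token v1 v2 ..." line per vocabulary entry."""
    # Tokens are expected one per line, in checkpoint order.
    with open(vocab_path, encoding="utf-8") as f:
        tokens = [line.rstrip("\n") for line in f]
    with open(output_path, "w", encoding="utf-8") as f:
        f.write("%d %d\n" % (embedding.shape[0], embedding.shape[1]))
        # zip() stops at the shorter sequence, so a size mismatch
        # between vocabulary and matrix won't raise here.
        for token, vector in zip(tokens, embedding):
            f.write(token + " " + " ".join("%.6f" % v for v in vector) + "\n")
```

For example, save_embeddings(path_to_source_vocabulary, source_embedding, "src_embeddings.txt") would write the source embeddings in a format most embedding viewers and gensim-style loaders accept.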

Thank you.
I will do that.