Steps to convert SentencePiece vocab to OpenNMT-py vocab

If we are going to build the vocabulary from scratch for OpenNMT-py, we can use something like this:

# -config: path to your config.yaml file
# -n_sample: use -1 to build the vocabulary on all the segments in the training dataset
# -num_threads: change it to match the number of CPUs to run it faster

onmt_build_vocab -config config.yaml -n_sample -1 -num_threads 2
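The config.yaml referenced by this command can be as minimal as the following sketch. All paths and the corpus name are placeholders, not values from this thread:

```yaml
# Minimal OpenNMT-py vocabulary-building config (placeholder paths)
save_data: run/example          # prefix for generated files
src_vocab: run/example.vocab.src
tgt_vocab: run/example.vocab.tgt
data:
    corpus_1:                   # any corpus name works
        path_src: data/train.src
        path_tgt: data/train.tgt
```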

However, many of us use SentencePiece for sub-wording, which generates both a sub-wording model and a vocabulary list. This vocab file generated by SentencePiece cannot be used directly in OpenNMT-py; if we want to use it, we have to convert it to a version compatible with OpenNMT-py. Note that the new vocab file will be three lines shorter, as the script removes the default tokens in OpenNMT-py, i.e. <unk>, <s>, and </s>.

pip3 install --upgrade OpenNMT-py
wget https://raw.githubusercontent.com/OpenNMT/OpenNMT-py/master/tools/spm_to_vocab.py
cat spm.vocab | python3 spm_to_vocab.py > spm.onmt_vocab
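For intuition, the core of such a conversion is easy to sketch. This is a simplified, hypothetical illustration, not the actual spm_to_vocab.py script: the set of default tokens and the logprob-to-count mapping are assumptions for demonstration.

```python
import math

# Tokens OpenNMT-py provides by default; dropping them is why the
# converted file is three lines shorter (assumption: the real script
# reads these from onmt.constants.DefaultTokens instead).
DEFAULT_TOKENS = {"<unk>", "<s>", "</s>"}

def convert(lines):
    """Turn SentencePiece `token<TAB>logprob` lines into
    OpenNMT-py-style `token<TAB>count` lines."""
    out = []
    for line in lines:
        token, logprob = line.rstrip("\n").split("\t")
        if token in DEFAULT_TOKENS:
            continue
        # Map the log probability to a positive integer pseudo-count
        # (an illustrative choice, not necessarily what the tool does).
        count = max(1, round(math.exp(float(logprob)) * 1_000_000))
        out.append(f"{token}\t{count}")
    return out

# Example: the three default tokens are dropped, the rest are rescaled.
sample = ["<unk>\t0", "<s>\t0", "</s>\t0", "▁the\t-3.0"]
print(convert(sample))
```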

Kind regards,
Yasmin


@ymoslem We cloned and installed OpenNMT-py on Colab:
git clone https://github.com/OpenNMT/OpenNMT-py
python3 /content/OpenNMT-py/setup.py install

Then we executed the following commands:
pip install onmt
cat /content/as_spm_10000.vocab | python3 /content/OpenNMT-py/tools/spm_to_vocab.py > spm.onmt_vocab

We encountered the following error:
Traceback (most recent call last):
  File "/content/OpenNMT-py/tools/spm_to_vocab.py", line 6, in <module>
    from onmt.constants import DefaultTokens
ModuleNotFoundError: No module named 'onmt.constants'

The onmt package installed via pip is not OpenNMT-py. Please try the following:

pip3 install --upgrade OpenNMT-py
wget https://raw.githubusercontent.com/OpenNMT/OpenNMT-py/master/tools/spm_to_vocab.py
cat spm.vocab | python3 spm_to_vocab.py > spm.onmt_vocab
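To check which package actually provides the onmt module in your environment, a small sketch like this can help. It uses only the standard library; the interpretation of the path is an assumption for illustration:

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None

# Both the unrelated PyPI package "onmt" and OpenNMT-py expose a module
# named "onmt", so inspect its file path to see which one is installed.
if has_module("onmt"):
    spec = importlib.util.find_spec("onmt")
    print(spec.origin)  # the path reveals the providing distribution
else:
    print("onmt is not importable; install OpenNMT-py first")
```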

For more about how to run OpenNMT-py, feel free to refer to this tutorial:

Kind regards,
Yasmin