Translate with v3 English-German - Transformer Large

I’m trying to use the pretrained model named in the subject, together with the linked BPE model.

My command is

python translate.py -model /path/to/ende-large-withoutBT.pt -src /path/to/text.en -output /path/to/text.de -verbose -transforms bpe --src_subword_model /path/to/subwords.en_de.bpe --tgt_subword_model /path/to/subwords.en_de.bpe

Now, one sentence to translate is:

I am attempting a very contrived utterance with underused words to experience the subword splitting capabilities of this software

and after BPE segmentation it is full of unknown tokens (the same happens without BPE):

['<unk>', 'am', '<unk>', 'ting', 'a', 'very', '<unk>', 'ved', '<unk>', 'ance', 'with', '<unk>', 'used', 'words', 'to', 'experience', 'the', '<unk>', 'word', '<unk>', 'ting', 'capabilities', 'of', 'this', 'software']
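To see what those <unk> positions actually were before the vocab lookup, I applied the BPE codes directly with subword-nmt. A quick sketch, assuming subwords.en_de.bpe is a standard subword-nmt codes file:

# Apply the BPE codes directly to see the raw pieces before vocab mapping
# (sketch; paths are the same placeholders as in the command above).
from subword_nmt.apply_bpe import BPE

with open("/path/to/subwords.en_de.bpe", encoding="utf-8") as codes:
    bpe = BPE(codes)  # default separator is "@@"

sentence = ("I am attempting a very contrived utterance with underused "
            "words to experience the subword splitting capabilities of this software")
print(bpe.process_line(sentence).split())
# non-final pieces come out marked with "@@", e.g. "attem@@ pting"

The pieces carrying the “@@” marker are exactly the positions that turned into <unk>, while the final pieces without a marker (“ting”, “ved”, “ance”, …) were found in the vocab.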

What am I missing?

I found that the issue is a discrepancy between the subword separators: the BPE transform marks split pieces with “@@”, but in the model’s vocab the pieces use “■”.
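In case it helps others, here is a sketch of the workaround I tried: segmenting with pyonmttok (the OpenNMT Tokenizer) so that the pieces carry the joiner character instead of “@@”. This assumes the codes file is accepted via bpe_model_path and that “aggressive” mode with joiner annotation matches what this particular checkpoint was trained with; the options may need adjusting.

# Sketch: segment with pyonmttok so subword pieces use the joiner
# character ("￭" by default) rather than subword-nmt's "@@" suffix.
import pyonmttok

tokenizer = pyonmttok.Tokenizer(
    "aggressive",                                  # assumed tokenization mode
    bpe_model_path="/path/to/subwords.en_de.bpe",  # same BPE codes as above
    joiner_annotate=True,                          # mark attached pieces with the joiner
)

sentence = ("I am attempting a very contrived utterance with underused "
            "words to experience the subword splitting capabilities of this software")
tokens, _ = tokenizer.tokenize(sentence)
print(tokens)

# detokenize() reverses the joiner annotation on the model output
print(tokenizer.detokenize(tokens))

Alternatively, the onmt_tokenize transform might be usable in place of the bpe transform in translate.py, but I haven’t verified the exact options for this model.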