OpenNMT vs OpenNMT-py: similarities and differences

Hi again,

I have a couple of questions related to OpenNMT-py; excuse me if they are obvious.

a) I assume the previous Lua OpenNMT tools won't be migrated (such as tokenize.lua and detokenize.lua). I have seen that several OpenNMT-py tools come from Moses, but they are not like the Lua versions. I understand that casing or joiner annotation should be done outside the OpenNMT-py package. Is this correct?

b) Training uses a lot more memory compared to the “old” OpenNMT. I don't have exact numbers at hand, but simple models use at least twice as much memory. Is there any way to reduce the memory consumption? Some OpenNMT models that I was able to run were already 6MB, so I will probably not be able to run them on an end-consumer GPU.

c) Since rest_translation_server.lua (the API server for the “old” OpenNMT) will probably also become obsolete, I assume the options we currently have are the Simple OpenNMT-py REST server that Paul has provided (Simple OpenNMT-py REST server) or CTranslate (https://github.com/OpenNMT/CTranslate). I am not sure whether CTranslate only works with the “old” OpenNMT models. Is this correct?

d) I assume OpenNMT source/target files can be used with OpenNMT-py. Are OpenNMT models and OpenNMT-py models compatible? For instance, can I use rest_translation_server.lua with OpenNMT-py models?

e) One of the things I like about OpenNMT-py is the ability to create complex models, but to be honest, I am not sure where to find information on how to modify or enhance the standard model (2 x 500) or create a new one. Does anyone know where to find a suitable model, for instance to translate between two similar languages?

f) Is the OpenNMT-py Docker image current? As far as I know, nvidia-docker offers close to bare-metal performance, so it would be nice to have an up-to-date image.

Thanks in advance!
Have a nice day
Miguel

Hi,

Yes, it should be done outside. The tokenization tools have been migrated to OpenNMT/Tokenizer, which also provides a Python wrapper.
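For reference, here is a minimal sketch of the Python wrapper (this assumes the pyonmttok package built from OpenNMT/Tokenizer; the mode and options are only examples, not recommended settings):

    import pyonmttok

    # tokenize with joiner annotation, then detokenize back
    tokenizer = pyonmttok.Tokenizer("conservative", joiner_annotate=True)
    tokens, _ = tokenizer.tokenize("Hello World!")
    text = tokenizer.detokenize(tokens)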

Try -max_generator_batches 1.
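For example, added to a typical training command (the data and model paths here are just placeholders):

    python train.py -data data/demo -save_model demo-model -max_generator_batches 1

This makes the generator (output softmax) run over smaller shards, which lowers peak memory usage at some cost in speed.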

CTranslate only works with OpenNMT-lua models, so you should use the OpenNMT-py REST server, which should be good enough.
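If it helps, a request to that server looks roughly like this (the host, port, and model id are assumptions that depend on your configuration):

    curl -X POST -H "Content-Type: application/json" \
         -d '[{"src": "Hello world", "id": 100}]' \
         http://localhost:5000/translator/translate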

Text data files are of course compatible, but models are not.

See for example:

http://opennmt.net/OpenNMT-py/FAQ.html#how-do-i-use-the-transformer-model-do-you-support-multi-gpu
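Regarding the standard 2 x 500 model: it simply corresponds to the default train.py options, so you can change the architecture from the command line. As an illustration only (these values are not a recommendation for any particular language pair), a deeper RNN could be trained with:

    python train.py -data data/demo -save_model demo-model -layers 4 -rnn_size 1000

and the FAQ entry above lists the full set of flags for a Transformer configuration (-encoder_type transformer, -decoder_type transformer, etc.).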

I’m not sure about the version that is pushed to Docker Hub. You should probably rebuild one from the Dockerfile, which is pretty simple.
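Something along these lines should work, assuming the Dockerfile sits at the repository root and nvidia-docker is installed (the image tag is arbitrary):

    docker build -t opennmt-py .
    nvidia-docker run --rm -it opennmt-py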