OpenNMT-tf: a new alternative

Since its launch in December 2016, a lot has happened around the OpenNMT initiative: new features, new use cases, new users, a new implementation in PyTorch, etc. Today, we continue this momentum and publish OpenNMT-tf, an experimental TensorFlow alternative to OpenNMT and OpenNMT-py.

OpenNMT-tf already supports many of the features of the existing versions, but thanks to its modular design it also enables several new model variants:

  • mixed word/character embeddings
  • arbitrarily complex encoder architectures (mixing RNNs, CNNs, self-attention, etc.)
  • hybrid encoder-decoder models (e.g. a self-attention encoder with an RNN decoder)
  • multi-source inputs (e.g. source text + a Moses translation for machine translation)

and all of the above can be used simultaneously! The project also makes use of some of the best TensorFlow features.
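To make the modular design above a bit more concrete, here is a rough sketch of how a hybrid model (self-attention encoder feeding an RNN decoder) can be assembled from the building blocks. The class names and hyperparameters below follow a recent OpenNMT-tf Python API and are illustrative only, not the exact code shipped in this release:

```python
import opennmt

# Sketch of a hybrid model definition: a Transformer-style self-attention encoder
# paired with an attentional RNN decoder. Names follow a recent opennmt-tf API;
# embedding sizes and layer counts are made-up examples.
model = opennmt.models.SequenceToSequence(
    source_inputter=opennmt.inputters.WordEmbedder(embedding_size=512),
    target_inputter=opennmt.inputters.WordEmbedder(embedding_size=512),
    encoder=opennmt.encoders.SelfAttentionEncoder(num_layers=6),
    decoder=opennmt.decoders.AttentionalRNNDecoder(num_layers=4, num_units=512),
)
```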

Testing, feedback, and contributions are very welcome in these early stages (rough edges are to be expected!). Thank you!

Also see the OpenNMT website to learn how the three versions are related and what their goals are. All versions will remain supported.


Hi @guillaumekln,

Congrats on the release.

A couple of questions that immediately came to mind:

  1. How does OpenNMT-tf compare to tensor2tensor?
  2. What is your opinion on TensorFlow’s Eager mode with respect to OpenNMT-tf?

Best,
Maksym

Hello @maxdel,

  1. Tensor2Tensor is amazing, but it is a very large project with a scope broader than OpenNMT-tf’s. We will mostly focus on NLP tasks (in particular machine translation) and implement features specific to this domain. Also, OpenNMT-tf has lower-level modularity that enables the features described in the README (mixed encoders, hybrid models, multiple inputs, etc.).
  2. The Eager mode is promising but seems to have a long way to go. It could definitely ease the implementation of dynamic decoding, so we should keep an eye on it.

Thanks.


So where does OpenNMT-tf log by default? TensorBoard needs an input directory to show what is happening…

Your configuration file sets model_dir, the directory that will contain all checkpoints and event files. You can point TensorBoard to that directory or to a parent directory.
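For example, assuming the run configuration contains an entry like the following (the directory name is just an illustration):

```yaml
# Excerpt from the YAML run configuration; run/baseline is a made-up path.
model_dir: run/baseline
```

TensorBoard can then be started with `tensorboard --logdir run/baseline` (or pointed at a parent directory such as `run/`) to visualize the events written there during training.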


That worked. Where are the translations saved? They are printed to the terminal, but I would like to have them in a text file.