Since its launch in December 2016, much has happened around the OpenNMT initiative: new features, new use cases, new users, a new implementation in PyTorch, and more. Today, we continue this momentum and publish OpenNMT-tf, an experimental TensorFlow alternative to OpenNMT and OpenNMT-py.
OpenNMT-tf already supports many features of the existing versions and, thanks to a modular design, enables several new model variants:
- mixed word/character embeddings
- arbitrarily complex encoder architectures (mixing RNN, CNN, self-attention, etc.)
- hybrid encoder-decoder models (e.g. a self-attention encoder with an RNN decoder)
- multi-source inputs (e.g. source text plus a Moses translation for machine translation)
All of the above can be used simultaneously! The project also makes use of some of the best TensorFlow features:
- distributed training
- monitoring with TensorBoard
- inference with TensorFlow Serving
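To give a feel for how a modular design enables the model variants listed above, here is a minimal, framework-free sketch of encoder composition. The class names below are purely illustrative and are not the OpenNMT-tf API; they only show the idea of chaining encoders for hybrid architectures and running them in parallel for multi-source inputs.

```python
# Illustrative sketch only: these classes are hypothetical and do not
# reflect the actual OpenNMT-tf API. They show how small encoder modules
# can compose into the variants listed above.

class Encoder:
    def encode(self, inputs):
        raise NotImplementedError

class RNNEncoder(Encoder):
    def encode(self, inputs):
        # Placeholder for a recurrent pass over the inputs.
        return ["rnn(%s)" % x for x in inputs]

class SelfAttentionEncoder(Encoder):
    def encode(self, inputs):
        # Placeholder for a Transformer-style self-attention pass.
        return ["attn(%s)" % x for x in inputs]

class SequentialEncoder(Encoder):
    """Chains encoders: each one consumes the previous one's output
    (mixed RNN/CNN/self-attention architectures)."""
    def __init__(self, encoders):
        self.encoders = encoders
    def encode(self, inputs):
        for encoder in self.encoders:
            inputs = encoder.encode(inputs)
        return inputs

class ParallelEncoder(Encoder):
    """Runs one encoder per input stream (multi-source models)."""
    def __init__(self, encoders):
        self.encoders = encoders
    def encode(self, sources):
        return [enc.encode(src) for enc, src in zip(self.encoders, sources)]

# A hybrid, multi-source encoder: self-attention over the source text,
# and an RNN followed by self-attention over an auxiliary input.
encoder = ParallelEncoder([
    SelfAttentionEncoder(),
    SequentialEncoder([RNNEncoder(), SelfAttentionEncoder()]),
])
outputs = encoder.encode([["hello", "world"], ["bonjour", "monde"]])
```

Because every module exposes the same `encode` interface, any composition is itself an encoder, which is what makes combinations like the ones above possible.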
Testing, feedback, and contributions are very welcome in these early stages (rough edges are to be expected!). Thank you!
Also see the OpenNMT website to learn how the three versions are related and what their goals are. All versions will remain supported.