C++ Support for BRNN models inference


I have trained an OpenNMT-py translation model with a BRNN encoder and an RNN decoder. However, I noticed that CTranslate2 only supports Transformer models. Is there C++ inference support for BRNN OpenNMT-py models? Could someone please help?



It seems the easiest option is to retrain with a Transformer model and then use CTranslate2.
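For reference, once you have a Transformer checkpoint trained with OpenNMT-py, the conversion uses CTranslate2's bundled converter script; the file names below (`model_step_100000.pt`, `ende_ct2/`) are placeholders for your own paths:

```shell
# Convert an OpenNMT-py Transformer checkpoint to the CTranslate2 format.
# --model_path:  the OpenNMT-py .pt checkpoint (placeholder name here)
# --output_dir:  where the converted model is written
ct2-opennmt-py-converter \
    --model_path model_step_100000.pt \
    --output_dir ende_ct2
```

The converted directory can then be loaded from C++ or Python for inference (e.g. `ctranslate2::Translator` in the C++ API). If the checkpoint uses an unsupported architecture such as a BRNN encoder, the converter is expected to raise an error rather than produce a model.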

Ok, thanks! Do other options like CopyGenerator work with CTranslate2 along with the Transformer encoder and decoder?
And just out of curiosity, are there plans to support other non-Transformer encoder/decoder modules in CTranslate2?

No. The supported architectures are listed here: https://github.com/OpenNMT/CTranslate2#converting-models

At the moment there are no plans, but if an architecture becomes frequently used in production, we will consider supporting it.