Use a pre-trained BERT model for sequence classification

Hello, I am new to OpenNMT and would need some help here.
I want to build a sequence classification model. I already have a BERT model pre-trained for sequence-to-sequence translation. What I would like to know is:

  1. How can I remove its last layer and add a classification head?
  2. How can I re-train the modified model with my data?
  3. How can I get the output embeddings from a specific layer? That way I could build the classifier in another framework.
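For context, steps 1 and 3 can be sketched with Hugging Face `transformers`, assuming the checkpoint can be loaded there. This is a minimal illustration only: it uses a small randomly initialised `BertModel` as a stand-in for a real checkpoint (which you would load with `BertModel.from_pretrained(...)`), and `num_labels = 3` is a hypothetical number of classes.

```python
import torch
from transformers import BertConfig, BertModel

# Stand-in for a real checkpoint: a tiny randomly initialised BERT.
# With a real model: bert = BertModel.from_pretrained("path/to/model")
config = BertConfig(hidden_size=64, num_hidden_layers=4,
                    num_attention_heads=4, intermediate_size=128,
                    vocab_size=1000)
bert = BertModel(config)

num_labels = 3  # hypothetical number of classes
# (1) classification head on top of the encoder, replacing the old output layer
classifier = torch.nn.Linear(config.hidden_size, num_labels)

input_ids = torch.randint(0, config.vocab_size, (2, 16))  # batch of 2, length 16
outputs = bert(input_ids=input_ids, output_hidden_states=True)

# classify from the [CLS] position of the last hidden layer
logits = classifier(outputs.last_hidden_state[:, 0])

# (3) embeddings from a specific layer: hidden_states[0] is the embedding
# layer's output, hidden_states[i] is the output of encoder layer i
layer_3 = outputs.hidden_states[3]
print(tuple(logits.shape), tuple(layer_3.shape))
```

The extracted `layer_3` tensor can then be saved and fed to a classifier in any other framework.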

Is it a task that can be done using OpenNMT? How can I do it?

In case it cannot be done, how can I convert the model to the Hugging Face format?
Thanks in advance!

It’s not supported in the released versions, but there is an existing branch that can make it work.
Check this topic: OpenNMT-py BERT Tutorial

This will probably require some tinkering.

Thanks! This tutorial is basically what I wanted :sunny: