OpenNMT Forum

GPT-2 for french


I would like to do text generation in french using a language model such as GPT-2.

I want to fine-tune it on a specific corpus. My problems are:

(1) There is no good GPT-2 model for French trained from scratch, like the English one, so I cannot use it as a base for fine-tuning.

(2) Some people have tried taking the English base model and fine-tuning it on French. It seems to work technically, but the results are quite poor in my opinion.

I only have a single 2080 Ti for the moment, so I would really appreciate any advice on making this work.

P.S. I am not using [MASK]-based language models like BERT, since they perform poorly at text generation.

Did you make any progress? Are you considering just training from scratch?

Mask-based models like XLM, MASS or mBART do well on machine translation and other tasks when the whole encoder-decoder architecture is fine-tuned. Do you just want to generate free text, or do you have a seq2seq task like machine translation?


Have you tried the Hugging Face transformers library?

BelGPT-2 seems like a good start. You may want to fine-tune it on your corpus if that is necessary in your case.
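A minimal sketch of loading it for French generation with transformers (the hub id `antoiloui/belgpt2` and the prompt are assumptions, check the model card for the exact name):

```python
# Sketch: French text generation with BelGPT-2 (a GPT-2 pre-trained on
# French) via Hugging Face transformers. The hub id below is assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "antoiloui/belgpt2"  # assumed hub id for BelGPT-2
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Hier, j'ai visité"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; top-k/top-p sampling usually reads better
# than greedy decoding for open-ended generation.
outputs = model.generate(
    **inputs,
    max_length=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Since GPT-2 is decoder-only, the decoded output starts with the prompt followed by the sampled continuation; fine-tuning on your corpus would use the same tokenizer and model with a causal language modeling objective.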