GPT-2 benchmark in OpenNMT-TF

Is there a checkpoint for a GPT-2 model trained with OpenNMT-TF, similar to the Transformer EN-DE checkpoint? I am curious about its inference performance.

Thanks

No, there is no pretrained model for GPT-2.
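If you only need rough inference numbers, one option is to train a small model yourself and time generation on it. Below is a minimal sketch, assuming OpenNMT-tf's catalog `GPT2Small` model definition and the Python `Runner` API; the file paths (`data/train.txt`, `data/vocab.txt`, `run/`) are placeholders, and the data configuration keys should be double-checked against the language modeling documentation for your OpenNMT-tf version.

```python
# Sketch: train a GPT-2-style language model from scratch with OpenNMT-tf,
# then time generation to get a rough inference benchmark.
# All paths are placeholders; no pretrained checkpoint is distributed.
import time

import opennmt

config = {
    "model_dir": "run/",  # where checkpoints will be written
    "data": {
        # One tokenized sentence per line; key names assume the
        # language-model inputter configuration.
        "train_features_file": "data/train.txt",
        "vocabulary": "data/vocab.txt",
    },
}

# GPT2Small is the GPT-2 (small) architecture from the model catalog.
model = opennmt.models.GPT2Small()
runner = opennmt.Runner(model, config, auto_config=True)

# Train from scratch (weights are randomly initialized).
runner.train()

# Time generation over a file of prompts.
start = time.time()
runner.infer("data/prompts.txt", predictions_file="run/predictions.txt")
print("inference time: %.1f s" % (time.time() - start))
```

The same can be done from the command line with `onmt-main --model_type GPT2Small --config config.yml --auto_config train` followed by the `infer` run type, if you prefer the CLI over the Python API.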