Training a baseline pointer-generator model with OpenNMT

The tutorial given for summarization covers bottom-up summarization, which includes a pointer-generator (copy) component but also a contextual encoder.
Which part of the code should I look at if I want to build a plain pointer-generator model and train it on CNN/Daily Mail to get a baseline ready?

Are you looking for this file?
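
In case it helps, here is a minimal sketch of how a copy-attention (pointer-generator) baseline is typically launched with OpenNMT-py. The copy mechanism is enabled by `-copy_attn` (with `-dynamic_dict` at preprocessing time); all paths, vocabulary settings, and hyperparameter values below are placeholders, and flag names may differ between OpenNMT-py versions, so check `python train.py -h` for your install:

```shell
# Preprocessing: -dynamic_dict builds the source-copy vocabularies
# the pointer mechanism needs; -share_vocab ties src/tgt vocab,
# which is common for summarization. Paths are placeholders.
python preprocess.py \
    -train_src data/train.src.txt -train_tgt data/train.tgt.txt \
    -valid_src data/valid.src.txt -valid_tgt data/valid.tgt.txt \
    -save_data data/cnndm \
    -dynamic_dict -share_vocab

# Training: -copy_attn turns on the pointer-generator head;
# -reuse_copy_attn reuses the standard attention for copying.
# Hyperparameters here are illustrative, not a tuned recipe.
python train.py \
    -data data/cnndm -save_model models/cnndm_baseline \
    -copy_attn -reuse_copy_attn \
    -encoder_type brnn -global_attention mlp \
    -word_vec_size 128 -rnn_size 512 -layers 1 \
    -train_steps 200000 -batch_size 16
```

At translation/inference time you would then decode with the saved checkpoint as usual; the copy mechanism is baked into the model, so no extra flags are needed beyond pointing at the trained model.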