Beam search using my own transformer model

Hi, is there a way to use the OpenNMT beam search with my own Transformer network implementation? I am using an encoder-decoder architecture in which the input is a sequence of images and the output is the translation. At inference time I currently use greedy decoding: I first encode the source sequence with memory = model.encode(src, src_mask) and then call model.decode(memory, target, src_mask, target_mask) step by step to get the output sequence. I couldn’t find a good beam search implementation that I could adapt to my case. I appreciate any feedback. Thank you.
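
For reference, my greedy loop looks roughly like this (simplified sketch, one sequence at a time; model.generator stands for whatever projection maps the decoder output to vocabulary logits):

```python
import torch

def greedy_decode(model, src, src_mask, max_len, start_symbol, end_symbol):
    # Encode the source image sequence once.
    memory = model.encode(src, src_mask)
    # Start the target with the BOS token (batch size 1).
    ys = torch.full((1, 1), start_symbol, dtype=torch.long, device=src.device)
    for _ in range(max_len - 1):
        # Causal mask so each target position only sees earlier positions.
        tgt_mask = torch.tril(torch.ones(1, ys.size(1), ys.size(1), device=src.device)).bool()
        out = model.decode(memory, ys, src_mask, tgt_mask)
        # model.generator: projection from the last decoder state to vocab logits.
        logits = model.generator(out[:, -1])
        next_word = logits.argmax(dim=-1)
        ys = torch.cat([ys, next_word.view(1, 1)], dim=1)
        if next_word.item() == end_symbol:
            break
    return ys
```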

Hi,
You might want to have a look at this issue.
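In the meantime, here is a minimal beam search sketch built around the encode/decode interface you describe, not the actual OpenNMT translator. It assumes batch size 1, integer start_symbol/end_symbol token ids, and a hypothetical model.generator that projects the last decoder state to vocabulary logits; hypotheses are scored by summed log-probabilities with a simple length normalization at the end.

```python
import torch

def beam_search(model, src, src_mask, max_len, start_symbol, end_symbol, beam_size=5):
    device = src.device
    # Encode the source sequence once and reuse the memory for every hypothesis.
    memory = model.encode(src, src_mask)
    # Each hypothesis is a pair (token sequence, cumulative log-probability).
    beams = [(torch.full((1, 1), start_symbol, dtype=torch.long, device=device), 0.0)]
    for _ in range(max_len - 1):
        candidates = []
        for seq, score in beams:
            # Hypotheses that already produced EOS are carried over unchanged.
            if seq[0, -1].item() == end_symbol:
                candidates.append((seq, score))
                continue
            # Causal mask over the current partial target sequence.
            tgt_mask = torch.tril(torch.ones(1, seq.size(1), seq.size(1), device=device)).bool()
            out = model.decode(memory, seq, src_mask, tgt_mask)
            # model.generator: assumed projection from decoder state to vocab logits.
            log_probs = torch.log_softmax(model.generator(out[:, -1]), dim=-1)
            # Expand each live hypothesis with its beam_size best next tokens.
            top_lp, top_idx = log_probs.topk(beam_size, dim=-1)
            for lp, idx in zip(top_lp[0], top_idx[0]):
                new_seq = torch.cat([seq, idx.view(1, 1)], dim=1)
                candidates.append((new_seq, score + lp.item()))
        # Keep only the beam_size best hypotheses by cumulative log-probability.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_size]
        if all(seq[0, -1].item() == end_symbol for seq, _ in beams):
            break
    # Return the best hypothesis with a simple length normalization.
    best_seq, _ = max(beams, key=lambda c: c[1] / c[0].size(1))
    return best_seq
```

A production implementation would usually batch all live hypotheses into a single decode call per step and add length/coverage penalties during pruning, but a loop like this should be enough to replace a greedy decoder in your setup.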