I’m having the same problem. I have words that appear in the dictionary, but at test time they are translated as <unk>. Could it be because the network is not well trained? Should it have more layers, more neurons, or more epochs?
I had this issue before. If you have a huge vocabulary, the default vocabulary size of 50,004 (the 50,000 most frequent words plus the special tokens) is applied during the preprocessing step, and everything outside that set is dropped from the vocabulary. Because of this, a large fraction of the less frequent words will be predicted as <unk> even during training, regardless of how well the network is trained. Please check if that’s the case.
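To illustrate the mechanism (a minimal sketch, not the toolkit’s actual preprocessing code): when the vocabulary is truncated to the top-N most frequent tokens, any word outside that set is mapped to <unk> in both the training and test data, even if it appears in your raw dictionary file. The special-token names below are assumptions for the sake of the example.

```python
from collections import Counter

def build_vocab(corpus_tokens, max_size):
    """Keep only the max_size most frequent tokens; everything else maps to <unk>."""
    counts = Counter(corpus_tokens)
    # Reserve slots for special tokens, as NMT preprocessors typically do.
    specials = ["<blank>", "<unk>", "<s>", "</s>"]
    most_common = [tok for tok, _ in counts.most_common(max_size - len(specials))]
    return set(specials + most_common)

def encode(sentence_tokens, vocab):
    """Replace out-of-vocabulary tokens with <unk>."""
    return [tok if tok in vocab else "<unk>" for tok in sentence_tokens]

# Toy corpus: "rare_word" occurs only once, so it falls outside a small vocabulary.
corpus = ["the", "cat", "sat", "the", "cat", "ran", "rare_word"]
vocab = build_vocab(corpus, max_size=8)  # analogous to the 50,004 default

print(encode(["the", "rare_word", "ran"], vocab))
# -> ['the', '<unk>', 'ran']  even though 'rare_word' was in the training data
```

If that matches what you’re seeing, increasing the vocabulary size at preprocessing time (in OpenNMT, if that’s the toolkit you’re using, the `-src_vocab_size` and `-tgt_vocab_size` options) and then re-preprocessing and retraining should reduce the <unk> predictions.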