Using EmbeddingsSharingLevel in a dual source transformer

You can ignore this warning. It is related to how the embedding is assigned in the `build` method, which is not exactly the cleanest way to do it, but the embedding weight should have the correct value in the end.


@guillaumekln Hi Guillaume, I ran into an issue when using the architecture above (sharing embeddings between all inputs and the target). I want to use the `replace_unknown_target` functionality by adding it to the params, but the first evaluation run gives me this error. Any thoughts?

```
TypeError: replace_unknown_target is only defined when the source inputter is a WordEmbedder
```
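For reference, this option is normally enabled in the `params` section of the YAML run configuration, along the lines of the following sketch (exact surrounding keys depend on your setup):

```yaml
# Run configuration fragment: enable copying the aligned source token
# in place of <unk> tokens in the decoded output.
params:
  replace_unknown_target: true
```

As the error above indicates, this only works when the source inputter is a single `WordEmbedder`, since the replacement needs an unambiguous source sentence to copy tokens from.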

Yes, this option is not implemented for multi-source inputs.