As I noted, the problem is not adding the features but the tokenization. Aren't the features added after tokenization? If so, the feature functionality depends on the tokenization functionality, and that's why both have to be done in the same place, with the same tool and rules, on the server side.
I realize that this would not fully implement feature functionality for the source text, because the user could not tag the source text however they want -- every source word would have to be marked with the same feature(s). Still, it would be very useful for the sole purpose of enabling domain recognition.
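To make the idea concrete, here is a minimal sketch (not actual OpenNMT code) of what "marking every source word with the same feature" could look like after tokenization. It assumes OpenNMT's convention of attaching word features with the '￨' separator; the function name and the "medical" domain tag are purely illustrative.

```python
# Hypothetical sketch: attach one shared domain feature to every token.
FEATURE_SEP = "\uffe8"  # '￨', the separator OpenNMT uses for word features

def tag_tokens(tokens, domain):
    """Mark each tokenized source word with the same single feature."""
    return [tok + FEATURE_SEP + domain for tok in tokens]

tokens = ["Hello", "world", "."]
print(tag_tokens(tokens, "medical"))
# Every token carries the same feature, e.g. "Hello￨medical"
```

The point is that this step only makes sense once tokenization has already happened, which is why it belongs server-side alongside the tokenizer.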
Anyway, I looked at the code on GitHub and I think this can be implemented easily -- though probably not by me, since I have never written a single line of Lua. I will try to come up with something, though, if my request is not accepted. For starters, can you please verify that the starting point should be a new module implementing the logic of
case.lua, and in particular a function similar to