Update dropout and label smoothing parameters in training

I want to update the dropout and label smoothing parameters during training. Does OpenNMT support this? What should I do?

One of my attempts was to first train a model, then continue training from that model with different parameters, but that didn't work. Below are my commands.
First:
python train.py -data data-10best-shuf-char-only/demo -save_model 10best-shuf-char-only-model \
-layers 6 -rnn_size 512 -word_vec_size 512 -transformer_ff 2048 -heads 8 \
-encoder_type transformer -decoder_type transformer -position_encoding \
-train_steps 100000 -max_generator_batches 2 -dropout 0.1 \
-batch_size 4096 -batch_type tokens -normalization tokens -accum_count 2 \
-optim adam -adam_beta2 0.998 -decay_method noam -warmup_steps 8000 -learning_rate 2 \
-max_grad_norm 0 -param_init 0 -param_init_glorot \
-label_smoothing 0.1 -valid_steps 10000 -save_checkpoint_steps 10000 \
-world_size 4 -gpu_ranks 0 1 2 3 -log_file log.shuf.char.only.txt

Then:
python train.py -data data-10best-shuf-char-only/demo -save_model 10best-shuf-char-only-tune-model \
-layers 6 -rnn_size 512 -word_vec_size 512 -transformer_ff 2048 -heads 8 \
-encoder_type transformer -decoder_type transformer -position_encoding \
-train_steps 200000 -max_generator_batches 2 -dropout 0.0 \
-batch_size 4096 -batch_type tokens -normalization tokens -accum_count 2 \
-optim adam -adam_beta2 0.998 -decay_method noam -warmup_steps 8000 -learning_rate 1 \
-max_grad_norm 0 -param_init 0 -param_init_glorot \
-label_smoothing 0.0 -valid_steps 10000 -save_checkpoint_steps 10000 \
-world_size 4 -gpu_ranks 0 1 2 3 -log_file log.shuf.char.only.tune.txt \
-train_from 10best-shuf-char-only-model_step_100000.pt

We added a dropout update mechanism a while ago; it works like the gradient accumulation scheduler.
You can apply the same logic to label_smoothing quite easily.
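
In recent OpenNMT-py versions that scheduler is driven by paired list-valued flags, analogous to -accum_count / -accum_steps; check python train.py -h for the exact option names in your build. For example, adding the following to your first command would keep dropout at 0.1 from step 0 and switch it to 0.0 at step 100000, within a single run and without -train_from:

-dropout 0.1 0.0 -dropout_steps 0 100000

There is no equivalent flag for label_smoothing out of the box, but the change is small. A hypothetical sketch of the update function, assuming OpenNMT-py's LabelSmoothingLoss, which in the versions I have seen keeps a one_hot buffer, a confidence scalar and an ignore_index (verify against your copy of onmt/utils/loss.py):

def update_label_smoothing(criterion, label_smoothing):
    # Recompute the smoothed target distribution in place, mirroring
    # what LabelSmoothingLoss.__init__ does, just with a new value.
    vocab_size = criterion.one_hot.size(-1)
    criterion.one_hot.fill_(label_smoothing / (vocab_size - 2))
    criterion.one_hot[0, criterion.ignore_index] = 0
    criterion.confidence = 1.0 - label_smoothing

You would then call it from the trainer at the chosen step, the same way the dropout scheduler triggers model.update_dropout.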

As for updating some opts using -train_from, you can have a look at this thread.
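
In short: -train_from restores the options that were saved inside the checkpoint, which is why new values on the command line can be ignored. One workaround is to patch the saved options before resuming. A minimal sketch, assuming the usual OpenNMT-py checkpoint layout where the training options are stored as a Namespace under the "opt" key (which options are re-read from the checkpoint versus the command line varies between versions, so double-check against your source):

import torch

# Load the checkpoint produced by the first run.
ckpt = torch.load("10best-shuf-char-only-model_step_100000.pt",
                  map_location="cpu")

# Overwrite the saved options that -train_from would otherwise restore.
# Match the stored type: newer versions store dropout as a list, e.g. [0.0].
ckpt["opt"].dropout = 0.0
ckpt["opt"].label_smoothing = 0.0

torch.save(ckpt, "10best-shuf-char-only-model_step_100000_patched.pt")

Then pass the patched file to -train_from instead of the original checkpoint.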