Tokenization Command Error

I have picked the preprocessing commands from the documentation:

for l in en de; do for f in data/multi30k/*.$l; do if [[ "$f" != *"test"* ]]; then sed -i "$ d" $f; fi; done; done
for l in en de; do for f in data/multi30k/*.$l; do perl tools/tokenizer.perl -a -no-escape -l $l -q < $f > $f.atok; done; done
python preprocess.py -train_src data/multi30k/train.en.atok -train_tgt data/multi30k/train.de.atok -valid_src data/multi30k/val.en.atok -valid_tgt data/multi30k/val.de.atok -save_data data/multi30k.atok.low -lower

Can you please guide me on how and where to run these commands?
When I run these commands in the Anaconda command prompt, I get the error "l was unexpected at this time".