Fairseq: How to save model output from fairseq-generate?

Created on 14 Apr 2020 · 4 comments · Source: pytorch/fairseq

I just followed the tutorial and got stuck on this command:

fairseq-generate data-bin/iwslt14.tokenized.de-en \
    --path checkpoints/fconv/checkpoint_best.pt \
    --batch-size 128 --beam 5

How can I save the model output on the test part of my data? I spent a solid amount of time but didn't find the answer. I found the --results-path argument, but for some reason it doesn't work well for me and saves the data in a strange format, with lines like H- .... Is there a way to save just the model output (predictions) on particular data?
Sorry if this question is obvious, but I didn't find anything in the docs.

question

All 4 comments

You can just grep what's in your --results-path file to get the output. Otherwise afaik there isn't a way to get just the outputs.

grep ^T output.txt | cut -f2- > target.txt
grep ^H output.txt | cut -f3- > hypotheses.txt
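
For context on why the cut offsets differ: fairseq-generate prints one prefixed line per example, where T- lines carry two tab-separated fields (id, target text) and H- lines carry three (id, score, hypothesis text). A minimal Python sketch of the same extraction, using illustrative lines rather than real fairseq output:

```python
def extract(lines):
    """Split fairseq-generate output into targets and hypotheses.

    T-<id>\t<target text>          -> keep fields 2 onward (cut -f2-)
    H-<id>\t<score>\t<hypothesis>  -> keep fields 3 onward (cut -f3-)
    """
    targets, hypotheses = [], []
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if fields[0].startswith("T-"):
            targets.append("\t".join(fields[1:]))
        elif fields[0].startswith("H-"):
            hypotheses.append("\t".join(fields[2:]))
    return targets, hypotheses

# Illustrative sample mimicking the line format (not actual model output)
sample = [
    "S-0\tein kleiner test",
    "T-0\ta small test",
    "H-0\t-0.23\ta small test",
]
targets, hyps = extract(sample)
print(targets)  # ['a small test']
print(hyps)     # ['a small test']
```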

Yep, @Alex-Fabbri is right!

Thanks for the swift answer

It would be great if there were a way to specify an output file on the command line. Currently I am facing issues because the console I am printing to does not have the necessary fonts.
