Fairseq: does fairseq-train support finetune with "bart_base"

Created on 17 Jul 2020  ·  4 Comments  ·  Source: pytorch/fairseq

❓ Questions and Help

When I set the --arch parameter to "bart_base", I get the following error:

fairseq-train: error: argument --arch/-a: invalid choice: 'bart_base' (choose from 'transformer', 'transformer_iwslt_de_en', 'transformer_wmt_en_de', 'transformer_vaswani_wmt_en_de_big', 'transformer_vaswani_wmt_en_fr_big', 'transformer_wmt_en_de_big', 'transformer_wmt_en_de_big_t2t', 'transformer_align', 'transformer_wmt_en_de_big_align', 'levenshtein_transformer', 'levenshtein_transformer_wmt_en_de', 'levenshtein_transformer_vaswani_wmt_en_de_big', 'levenshtein_transformer_wmt_en_de_big', 'nonautoregressive_transformer', 'nonautoregressive_transformer_wmt_en_de', 'cmlm_transformer', 'cmlm_transformer_wmt_en_de', 'lightconv', 'lightconv_iwslt_de_en', 'lightconv_wmt_en_de', 'lightconv_wmt_en_de_big', 'lightconv_wmt_en_fr_big', 'lightconv_wmt_zh_en_big', 'lightconv_lm', 'lightconv_lm_gbw', 'fconv', 'fconv_iwslt_de_en', 'fconv_wmt_en_ro', 'fconv_wmt_en_de', 'fconv_wmt_en_fr', 'fconv_lm', 'fconv_lm_dauphin_wikitext103', 'fconv_lm_dauphin_gbw', 'lstm', 'lstm_wiseman_iwslt_de_en', 'lstm_luong_wmt_en_de', 'transformer_from_pretrained_xlm', 'masked_lm', 'bert_base', 'bert_large', 'xlm_base', 'iterative_nonautoregressive_transformer', 'iterative_nonautoregressive_transformer_wmt_en_de', 'insertion_transformer', 'wav2vec', 'fconv_self_att', 'fconv_self_att_wp', 'roberta', 'roberta_base', 'roberta_large', 'xlm', 'multilingual_transformer', 'multilingual_transformer_iwslt_de_en', 'transformer_lm', 'transformer_lm_big', 'transformer_lm_baevski_wiki103', 'transformer_lm_wiki103', 'transformer_lm_baevski_gbw', 'transformer_lm_gbw', 'transformer_lm_gpt', 'transformer_lm_gpt2_small', 'transformer_lm_gpt2_medium', 'transformer_lm_gpt2_big', 'bart_large')

Sorting the available architectures confirms that bart_base is not among them:

sorted(a)
['bart_large', 'bert_base', 'bert_large', 'cmlm_transformer', 'cmlm_transformer_wmt_en_de', 'fconv', 'fconv_iwslt_de_en', 'fconv_lm', 'fconv_lm_dauphin_gbw', 'fconv_lm_dauphin_wikitext103', 'fconv_self_att', 'fconv_self_att_wp', 'fconv_wmt_en_de', 'fconv_wmt_en_fr', 'fconv_wmt_en_ro', 'insertion_transformer', 'iterative_nonautoregressive_transformer', 'iterative_nonautoregressive_transformer_wmt_en_de', 'levenshtein_transformer', 'levenshtein_transformer_vaswani_wmt_en_de_big', 'levenshtein_transformer_wmt_en_de', 'levenshtein_transformer_wmt_en_de_big', 'lightconv', 'lightconv_iwslt_de_en', 'lightconv_lm', 'lightconv_lm_gbw', 'lightconv_wmt_en_de', 'lightconv_wmt_en_de_big', 'lightconv_wmt_en_fr_big', 'lightconv_wmt_zh_en_big', 'lstm', 'lstm_luong_wmt_en_de', 'lstm_wiseman_iwslt_de_en', 'masked_lm', 'multilingual_transformer', 'multilingual_transformer_iwslt_de_en', 'nonautoregressive_transformer', 'nonautoregressive_transformer_wmt_en_de', 'roberta', 'roberta_base', 'roberta_large', 'transformer', 'transformer_align', 'transformer_from_pretrained_xlm', 'transformer_iwslt_de_en', 'transformer_lm', 'transformer_lm_baevski_gbw', 'transformer_lm_baevski_wiki103', 'transformer_lm_big', 'transformer_lm_gbw', 'transformer_lm_gpt', 'transformer_lm_gpt2_big', 'transformer_lm_gpt2_medium', 'transformer_lm_gpt2_small', 'transformer_lm_wiki103', 'transformer_vaswani_wmt_en_de_big', 'transformer_vaswani_wmt_en_fr_big', 'transformer_wmt_en_de', 'transformer_wmt_en_de_big', 'transformer_wmt_en_de_big_align', 'transformer_wmt_en_de_big_t2t', 'wav2vec', 'xlm', 'xlm_base']
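For reference, one way to check which architectures the installed copy of fairseq actually registers is sketched below; it assumes fairseq.models exposes ARCH_MODEL_REGISTRY, which recent releases do:

# show the installed fairseq version
pip show fairseq | grep -i version

# list the registered architecture names and check for bart_base
python -c "from fairseq.models import ARCH_MODEL_REGISTRY as a; print(sorted(a)); print('bart_base' in a)"

If the second command prints False, the installed release predates the bart_base architecture.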

question

Most helpful comment

That answer is wrong, or the issue was closed without further checking. @myleott

I found that the newer version of fairseq (installed from source rather than the pip release) does support this.

All 4 comments

There's bert_base and bart_large, but no bart_base.

That answer is wrong, or the issue was closed without further checking. @myleott

I found that the newer version of fairseq (installed from source rather than the pip release) does support this.

Ah, right, the bart_base architecture was added later. Using the latest version should fix it.
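For anyone landing here later: once you are on a version that registers bart_base, fine-tuning looks like the documented bart_large recipe with the architecture name swapped. A minimal sketch follows; the data directory, checkpoint path, and hyper-parameters are placeholders, not taken from this issue:

fairseq-train data-bin \
    --restore-file bart.base/model.pt \
    --arch bart_base --task translation \
    --source-lang source --target-lang target \
    --layernorm-embedding --share-all-embeddings --share-decoder-input-output-embed \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --optimizer adam --lr 3e-05 --max-tokens 2048 \
    --reset-optimizer --reset-dataloader --reset-meters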

How do I install the latest version? I installed fairseq 0.9.0 with pip and still get this error.
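In case it helps, installing from source follows the standard instructions in the fairseq README; the pip release available at the time (0.9.0) does not include bart_base:

git clone https://github.com/pytorch/fairseq
cd fairseq
pip install --editable ./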
