Fairseq: BERT/Roberta MULTITASKING

Created on 16 Jan 2020 · 2 comments · Source: pytorch/fairseq

Hello there!
This is related to some R&D on my behalf.
Is there a way to fine-tune BERT for multiple tasks at the same time? What I mean is: one BERT/RoBERTa model, in a single PyTorch/Keras model, fine-tuned for two or more tasks, say:

  1. Sentiment Analysis
  2. Question Answering
  3. English-to-German Translation

and so on.
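To make the question concrete, the usual pattern is a single shared encoder with one small head per task. The sketch below is a minimal, hypothetical illustration in plain PyTorch (not fairseq's internal multitask branch): a tiny `nn.TransformerEncoder` stands in for a pretrained BERT/RoBERTa body, and all sizes, head names, and the task-routing `forward` argument are assumptions for the example.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """One shared encoder, one head per task (hypothetical sketch).

    In practice the encoder would be a pretrained BERT/RoBERTa body whose
    weights are shared across every task head during fine-tuning.
    """

    def __init__(self, vocab_size=1000, dim=64,
                 num_sentiment_classes=2, num_qa_labels=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Task-specific heads on top of the shared representation.
        self.sentiment_head = nn.Linear(dim, num_sentiment_classes)
        self.qa_head = nn.Linear(dim, num_qa_labels)  # per-token start/end logits

    def forward(self, tokens, task):
        hidden = self.encoder(self.embed(tokens))     # (batch, seq, dim)
        if task == "sentiment":
            return self.sentiment_head(hidden[:, 0])  # pool the first token
        if task == "qa":
            return self.qa_head(hidden)               # logits for every token
        raise ValueError(f"unknown task: {task}")

model = MultiTaskModel()
tokens = torch.randint(0, 1000, (2, 16))        # batch of 2, seq length 16
sent_logits = model(tokens, "sentiment")        # shape (2, 2)
qa_logits = model(tokens, "qa")                 # shape (2, 16, 2)
```

A typical training loop then alternates (or interleaves) batches from the different tasks, computing each task's loss against its own head while gradients flow into the shared encoder. Translation would additionally need a decoder, which is why it is usually the hardest task to bolt onto an encoder-only model like BERT.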
enhancement

All 2 comments

We have some multitasking support in an internal branch. I can work on cleaning it up and releasing it.

@myleott any progress on that?
