Hello there!
This relates to some R&D work I'm doing.
Is there a way to fine-tune BERT for multiple tasks at the same time? What I mean is: a single BERT/RoBERTa model, in one PyTorch/Keras model, fine-tuned for two or more tasks at once.
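For illustration, one common pattern for this is a shared encoder with one output head per task, trained with a joint loss. Below is a minimal PyTorch sketch of that idea; note it is an assumption-laden example, not fairseq's internal implementation: a small randomly initialized Transformer stands in for the pretrained BERT/RoBERTa encoder, and all class/task names are hypothetical.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """One shared encoder, one classification head per task (sketch)."""

    def __init__(self, vocab_size=1000, hidden=64, num_labels_a=2, num_labels_b=5):
        super().__init__()
        # Stand-in for a pretrained BERT/RoBERTa encoder.
        self.embed = nn.Embedding(vocab_size, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Task-specific heads share the encoder above.
        self.head_a = nn.Linear(hidden, num_labels_a)
        self.head_b = nn.Linear(hidden, num_labels_b)

    def forward(self, input_ids, task):
        h = self.encoder(self.embed(input_ids))
        pooled = h[:, 0]  # first-token pooling, like BERT's [CLS]
        return self.head_a(pooled) if task == "a" else self.head_b(pooled)

def joint_step(model, optimizer, batch_a, batch_b):
    """One training step that sums the per-task cross-entropy losses."""
    (ids_a, labels_a), (ids_b, labels_b) = batch_a, batch_b
    loss = nn.functional.cross_entropy(model(ids_a, "a"), labels_a)
    loss = loss + nn.functional.cross_entropy(model(ids_b, "b"), labels_b)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Instead of summing the losses every step, you can also alternate batches between tasks; either way, gradients from both tasks update the shared encoder while each head only sees its own task.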
We have some multitasking support in an internal branch. I can work on cleaning it up and releasing it.
@myleott any progress on that?