Flair: Trainable layers for transformers

Created on 22 Oct 2019 · 4 comments · Source: flairNLP/flair

Is it possible to make the layers of BERT and other models trainable?

question


All 4 comments

Hi @isarth,

fine-tuning a BERT model is currently not possible in Flair — only a "feature-based" approach (as it is called in the BERT paper) is supported.

But with the latest version of Transformers it is possible to fine-tune a model on a NER dataset.

After fine-tuning, you could pass that model into Flair if you want :)

Hi @isarth,

There's an open-source library called AdaptNLP (https://github.com/Novetta/adaptnlp), built on top of Flair, that lets you fine-tune language models like BERT with a ULMFiT-style approach.

The fine-tuner provides straightforward built-in layer-freezing methods to make fine-tuning the language model more convenient.

The library is still in early beta, but it aims to work seamlessly with Flair. Feel free to open issues or requests describing what you're trying to do!
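For illustration, the layer-freezing idea works like the following sketch in plain PyTorch (this is a generic example, not AdaptNLP's actual API; the class and function names are made up for the demo):

```python
import torch.nn as nn

# Toy stand-in for a transformer encoder: a simple stack of layers.
class TinyEncoder(nn.Module):
    def __init__(self, depth: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(8, 8) for _ in range(depth))

def freeze_all_but_last(model: nn.Module, n_trainable: int) -> None:
    """Freeze every layer except the last `n_trainable` ones."""
    layers = list(model.layers)
    frozen = layers[:-n_trainable] if n_trainable else layers
    for layer in frozen:
        for p in layer.parameters():
            p.requires_grad = False

model = TinyEncoder()
freeze_all_but_last(model, n_trainable=1)

# Only the last layer's parameters remain trainable.
print([name for name, p in model.named_parameters() if p.requires_grad])
# -> ['layers.3.weight', 'layers.3.bias']
```

Gradually unfreezing layers from the top down is the core of the ULMFiT recipe the comment refers to.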

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

As of Flair 0.5, transformer embeddings are fine-tunable through Flair and are the recommended way to train text classifiers. So closing this issue, but feel free to reopen with further comments.

