Transformers: run_ner.py with DistilBERT

Created on 16 Oct 2019 · 7 comments · Source: huggingface/transformers

โ“ Questions & Help


I wish to use the DistilBERT model for NER via run_ner.py, but I am not sure whether the script supports it directly. Any suggestions on that end would be great.
Also, what values should the parameters --model_type and --model_name_or_path take for DistilBERT?
The other parameters, as I understand them, would stay the same.
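
For concreteness, I am imagining an invocation along these lines (just a sketch; I have not verified that run_ner.py actually accepts distilbert as a --model_type):

--model_type distilbert --model_name_or_path distilbert-base-uncased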

All 7 comments

I have the same issue. Did you end up using DistilBERT?

I'm not sure how well it will perform. Casing is an important feature in many NER tasks, and the available DistilBERT checkpoint is uncased. So I would say it _could_ work, but YMMV. For reference: https://stackoverflow.com/questions/56384231/case-sensitive-entity-recognition
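
A quick illustration of what an uncased checkpoint does to the case signal (a minimal sketch, assuming the distilbert-base-uncased and bert-base-cased checkpoints can be downloaded):

```python
from transformers import AutoTokenizer

# Uncased tokenizers lowercase everything, so the capitalization cue
# that helps distinguish "Apple" (ORG) from "apple" (fruit) is lost
# before the model ever sees the input.
uncased = AutoTokenizer.from_pretrained("distilbert-base-uncased")
cased = AutoTokenizer.from_pretrained("bert-base-cased")

sentence = "Apple hired Tim Cook."
print(uncased.tokenize(sentence))  # roughly: ['apple', 'hired', 'tim', 'cook', '.']
print(cased.tokenize(sentence))    # roughly: ['Apple', 'hired', 'Tim', 'Cook', '.']
```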

RoBERTa is cased, so you can try DistilRoBERTa, released today by @VictorSanh:

--model_name_or_path distilroberta-base

You'll probably need to adapt run_ner.py (PR welcome)
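
Roughly, adapting it would mean extending the script's MODEL_CLASSES mapping, something like the sketch below. This assumes a RobertaForTokenClassification head analogous to BertForTokenClassification; if the library doesn't ship one yet, writing it would be part of the PR:

```python
from transformers import (
    BertConfig, BertForTokenClassification, BertTokenizer,
    RobertaConfig, RobertaForTokenClassification, RobertaTokenizer,
)

# Sketch of an extended MODEL_CLASSES for run_ner.py. The "roberta" entry
# is the hypothetical addition; DistilRoBERTa reuses the RoBERTa
# architecture and tokenizer (just with fewer layers), so it would be
# loaded through the same classes.
MODEL_CLASSES = {
    "bert": (BertConfig, BertForTokenClassification, BertTokenizer),
    "roberta": (RobertaConfig, RobertaForTokenClassification, RobertaTokenizer),
}
```

You'd then pass --model_type roberta together with the checkpoint name above.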

Actually working on that today so I'll let you know how it goes.

This is actually really cool. I was looking today at the models and had no idea DistilRoBERTa was released today. Awesome @VictorSanh!

@amankedia I think this issue is resolved? If so, we can close it :)

Solved indeed. Thanks everyone for contributing!
