Flair: How to set evaluation metric during training?

Created on 30 Jul 2020 · 6 comments · Source: flairNLP/flair

I was doing text classification and wanted to use the macro F1 score as the evaluation metric. How do I set this up during training?

In #333, @alanakbik suggested setting it like this:
trainer.train('./', evaluation_metric=EvaluationMetric.MACRO_F1_SCORE, max_epochs=2)
However, this doesn't work in 0.5.1; I get an unexpected keyword argument error.

question wontfix

Most helpful comment

Hello all, yes we took out the option to pass a different metric some time back. I think there were some errors. We want to put this feature back in, though I am not sure when we can get around to doing this. Hopefully soon!

All 6 comments

Maybe it was available in an older version, because I'm not able to find the above metric in the trainer class.
Check this out: https://github.com/flairNLP/flair/blob/master/flair/trainers/trainer.py#L62

I guess the experts might give a better answer. Cheers!

Thanks @nightlessbaron for the response!

Hi, any follow-up to this would be great. I'm also trying to find the metric class to set evaluation metrics, but I note that the train function no longer accepts the evaluation metric enum.

Hi @andrewlaikh, until this gets sorted out, you could edit the source code here to change the evaluation metric to something other than micro_f1_score:

https://github.com/flairNLP/flair/blob/44aac4fb6821787ade513d4b0d93ac7040f8fd95/flair/models/text_classification_model.py#L365
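Until the option returns, here is a minimal pure-Python sketch of what macro F1 computes: the F1 score of each class, averaged with equal weight per class (unlike micro F1, which pools all predictions). The `macro_f1` helper name is mine, not part of Flair; you could apply something like this to label strings collected from a Flair model's predictions.

```python
def macro_f1(y_true, y_pred):
    """Average per-class F1 scores, weighting every class equally."""
    classes = set(y_true) | set(y_pred)
    scores = []
    for c in classes:
        # Count true positives, false positives, and false negatives for class c.
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        denom = 2 * tp + fp + fn
        # F1 = 2*TP / (2*TP + FP + FN); define F1 = 0 for an empty class.
        scores.append(2 * tp / denom if denom else 0.0)
    return sum(scores) / len(scores)

y_true = ["pos", "pos", "neg", "neg", "neu"]
y_pred = ["pos", "neg", "neg", "neg", "neu"]
print(macro_f1(y_true, y_pred))  # 0.8222... = (2/3 + 4/5 + 1) / 3
```

Because every class contributes equally, macro F1 is more sensitive to minority-class performance than micro F1, which is often why people ask for it in imbalanced text classification.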

Hello all, yes we took out the option to pass a different metric some time back. I think there were some errors. We want to put this feature back in, though I am not sure when we can get around to doing this. Hopefully soon!

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
