Flair: Introducing Attentional BiLSTM for Text Classification

Created on 6 Feb 2019 · 11 comments · Source: flairNLP/flair

As per my understanding, Flair by default uses a standard BiLSTM model for text classification.

Introducing an attentional BiLSTM might improve this.

An easy solution might be to introduce a parameter named attention in the TextClassifier class that decides whether or not to use attention.
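To illustrate the idea, here is a minimal PyTorch sketch of what such an attention flag could look like: a BiLSTM encoder whose outputs are pooled either by additive attention or by the final hidden states. The class and parameter names (AttentionalBiLSTM, attention) are hypothetical and do not reflect Flair's actual TextClassifier implementation.

```python
import torch
import torch.nn as nn


class AttentionalBiLSTM(nn.Module):
    """Hypothetical sketch: BiLSTM with optional attention pooling.

    Not Flair's actual TextClassifier; names are illustrative only.
    """

    def __init__(self, embedding_dim: int, hidden_dim: int,
                 num_classes: int, attention: bool = True):
        super().__init__()
        self.attention = attention
        self.bilstm = nn.LSTM(embedding_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # scores each timestep's BiLSTM output with a learned vector
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.decoder = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, embedded: torch.Tensor) -> torch.Tensor:
        # embedded: (batch, seq_len, embedding_dim)
        outputs, (h_n, _) = self.bilstm(embedded)  # (batch, seq, 2*hidden)
        if self.attention:
            # softmax over the sequence dimension gives per-token weights
            weights = torch.softmax(self.attn_score(outputs), dim=1)
            pooled = (weights * outputs).sum(dim=1)  # (batch, 2*hidden)
        else:
            # fall back to the concatenated final forward/backward states
            pooled = torch.cat([h_n[0], h_n[1]], dim=1)
        return self.decoder(pooled)
```

With this design, the same class covers both the plain and the attentional variant, so downstream training code would not need to change.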

wontfix

All 11 comments

If this issue is a go, I would love to take up the challenge

That's a great idea - we'd very much appreciate it if you added this! :)

@alanakbik , I am on it

@53X excited to see this implementation

Here is an interesting approach implemented also in PyTorch that could be useful:
https://github.com/hantek/SelfAttentiveSentEmbed#third-party-implementations
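The linked repo implements the structured self-attentive sentence embedding of Lin et al. (2017), which replaces a single attention vector with multiple attention "hops". A rough sketch of that pooling step, with dimension names following the paper rather than any Flair code:

```python
import torch
import torch.nn as nn


class SelfAttentivePooling(nn.Module):
    """Sketch of multi-hop self-attention pooling (Lin et al. 2017).

    Illustrative only; parameter names (d_a, hops) follow the paper.
    """

    def __init__(self, hidden_dim: int, d_a: int = 64, hops: int = 4):
        super().__init__()
        self.ws1 = nn.Linear(hidden_dim, d_a, bias=False)
        self.ws2 = nn.Linear(d_a, hops, bias=False)

    def forward(self, H: torch.Tensor) -> torch.Tensor:
        # H: (batch, seq_len, hidden_dim) BiLSTM outputs
        # A = softmax(Ws2 tanh(Ws1 H)): one attention distribution per hop,
        # normalized over the sequence dimension
        A = torch.softmax(self.ws2(torch.tanh(self.ws1(H))), dim=1)
        # M = A^T H: a (hops x hidden_dim) sentence embedding matrix
        return A.transpose(1, 2) @ H
```

The resulting matrix (one row per hop) would then be flattened and fed to the classifier head.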

@heukirne, we could introduce this as well, as another improved version of the TextClassifier.
BTW I will make this PR in another 2 days... I was out :)

@alanakbik, can this be merged if it's working fine now?

Ok, I've just put in a PR #582 to merge the branch into master. Could you double-check that everything looks OK from your side in the PR?

This looks OK @alanakbik.

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

Was this feature removed from later versions?

