Flair: Are there any details on the structure and type of pre-trained model for sentiment analysis?

Created on 4 Mar 2019 · 6 comments · Source: flairNLP/flair

When running the TextClassifier I receive the following warning:

D:\Anaconda35_64\lib\site-packages\torch\serialization.py:538: DeprecationWarning: Call to deprecated class DocumentLSTMEmbeddings. (The functionality of this class is moved to 'DocumentRNNEmbeddings') -- Deprecated since version 0.4.
result = unpickler.load()

Code:

import flair
from flair.models import TextClassifier
from flair.data import Sentence
from tqdm import tqdm
tqdm.pandas()

classifier = TextClassifier.load('en-sentiment')

def f_sent(x):
    sentence = Sentence(x)
    classifier.predict(sentence)
    # return the score of the first predicted label
    for label in sentence.labels:
        return label.score

tweet_text['flair_sent'] = tweet_text['text'].progress_apply(f_sent)

I was looking through the documentation but could not find what type of neural network the pre-trained model uses (e.g. LSTM, BiLSTM, GRU, etc.). Is there any more detailed information about the pre-trained model (the IMDB one)?

question

All 6 comments

Hi @seangrant82 - we trained the 'en-sentiment' model for the previous version, so it uses the now-deprecated DocumentLSTMEmbeddings, with a stack of two WordEmbeddings ('en-crawl' and 'en-wiki') and the two standard FlairEmbeddings ('news-forward' and 'news-backward'). However, we did not do much hyperparameter selection here, so there may be better embedding combinations for this task!
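Based on the description above, a similar model could be assembled with the current API, where DocumentRNNEmbeddings with `rnn_type='LSTM'` replaces the deprecated DocumentLSTMEmbeddings. This is only a sketch: the embedding names come from the comment above, while the hidden size and the label dictionary are placeholders you would take from your own corpus.

```python
from flair.embeddings import WordEmbeddings, FlairEmbeddings, DocumentRNNEmbeddings

# stack of classic word embeddings plus contextual Flair embeddings,
# as described for the 'en-sentiment' model
embeddings = [
    WordEmbeddings('en-crawl'),
    WordEmbeddings('en-wiki'),
    FlairEmbeddings('news-forward'),
    FlairEmbeddings('news-backward'),
]

# DocumentRNNEmbeddings with rnn_type='LSTM' is the successor of the
# deprecated DocumentLSTMEmbeddings used by the pre-trained model
document_embeddings = DocumentRNNEmbeddings(embeddings, rnn_type='LSTM')

# to train your own classifier, you would then pass a label dictionary
# built from your corpus (placeholder, not shown here):
# classifier = TextClassifier(document_embeddings, label_dictionary=label_dictionary)
```

Note that constructing these embeddings downloads the underlying models on first use, which can take a while.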

`2020-01-21 08:36:45,028 loading file /home/ritika/.flair/models/imdb-v0.4.pt

AttributeError Traceback (most recent call last)
in
----> 1 classifier = TextClassifier.load('en-sentiment')
2 def f_sent(x):
3 sentence = Sentence(x)
4 classifier.predict(sentence)
5 for label in sentence.labels:

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/flair/nn.py in load(cls, model)
84 state = torch.load(f, map_location=flair.device)
85
---> 86 model = cls._init_model_with_state_dict(state)
87
88 model.eval()

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/flair/models/text_classification_model.py in _init_model_with_state_dict(state)
105 document_embeddings=state["document_embeddings"],
106 label_dictionary=state["label_dictionary"],
--> 107 multi_label=state["multi_label"],
108 )
109

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/flair/models/text_classification_model.py in __init__(self, document_embeddings, label_dictionary, multi_label, multi_label_threshold)
73
74 # auto-spawn on GPU if available
---> 75 self.to(flair.device)
76
77 def _init_weights(self):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in to(self, args, *kwargs)
423 return t.to(device, dtype if t.is_floating_point() else None, non_blocking)
424
--> 425 return self._apply(convert)
426
427 def register_backward_hook(self, hook):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in _apply(self, fn)
199 def _apply(self, fn):
200 for module in self.children():
--> 201 module._apply(fn)
202
203 def compute_should_use_set_data(tensor, tensor_applied):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in _apply(self, fn)
199 def _apply(self, fn):
200 for module in self.children():
--> 201 module._apply(fn)
202
203 def compute_should_use_set_data(tensor, tensor_applied):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in _apply(self, fn)
199 def _apply(self, fn):
200 for module in self.children():
--> 201 module._apply(fn)
202
203 def compute_should_use_set_data(tensor, tensor_applied):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in _apply(self, fn)
199 def _apply(self, fn):
200 for module in self.children():
--> 201 module._apply(fn)
202
203 def compute_should_use_set_data(tensor, tensor_applied):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in _apply(self, fn)
199 def _apply(self, fn):
200 for module in self.children():
--> 201 module._apply(fn)
202
203 def compute_should_use_set_data(tensor, tensor_applied):

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/rnn.py in _apply(self, fn)
135 # Note: be v. careful before removing this, as 3rd party device types
136 # likely rely on this behavior to properly .to() modules like LSTM.
--> 137 self._flat_weights = [getattr(self, weight) for weight in self._flat_weights_names]
138
139 # Flattens params (on CUDA)

~/anaconda3/envs/pycvenv/lib/python3.6/site-packages/torch/nn/modules/module.py in __getattr__(self, name)
574 return modules[name]
575 raise AttributeError("'{}' object has no attribute '{}'".format(
--> 576 type(self).__name__, name))
577
578 def __setattr__(self, name, value):

AttributeError: 'LSTM' object has no attribute '_flat_weights_names'`

Running the above commands gave me the error shown above. I have searched for this issue but didn't find anything useful in detail. Can someone please help!

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

I ran the following code on Colab (just installed flair and loaded the en-sentiment model):

!pip3 install flair
import flair
flair_sentiment = flair.models.TextClassifier.load('en-sentiment')

After installing, uninstalling, and reinstalling, it gave me the following error:

ValueError Traceback (most recent call last)
in ()
1 get_ipython().system('pip3 install flair')
2 import flair
----> 3 flair_sentiment = flair.models.TextClassifier.load('en-sentiment')

6 frames
/usr/local/lib/python3.6/dist-packages/flair/embeddings.py in (.0)
3000 def _apply(self, fn):
3001 major, minor, build, *_ = (int(info)
-> 3002 for info in torch.__version__.split('.'))
3003
3004 # fixed RNN change format for torch 1.4.0

ValueError: invalid literal for int() with base 10: '0+cu101'

I don't understand; I didn't even get a chance to do anything.
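The traceback shows the cause: flair's compatibility check splits `torch.__version__` on dots and converts each field to `int`, which fails on builds like `'1.5.0+cu101'` that carry a local version suffix. A more robust parse, sketched here as a standalone helper (not flair's actual fix), would strip the suffix first:

```python
import re

def parse_torch_version(version: str) -> tuple:
    """Split a version string like '1.5.0+cu101' into integer parts,
    dropping any local suffix such as '+cu101'."""
    public = version.split('+', 1)[0]          # '1.5.0+cu101' -> '1.5.0'
    parts = []
    for field in public.split('.'):
        match = re.match(r'\d+', field)        # keep only the leading digits
        parts.append(int(match.group()) if match else 0)
    while len(parts) < 3:                      # pad short versions like '2.0'
        parts.append(0)
    return tuple(parts[:3])

print(parse_torch_version('1.5.0+cu101'))  # (1, 5, 0)
print(parse_torch_version('1.4.0'))        # (1, 4, 0)
```

The `major, minor, build` comparison in flair/embeddings.py would then work for any PEP 440 local version identifier.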

We are releasing a new version that fixes this error, probably tomorrow.

New version of Flair just released!
