Flair: can't assign a list to a torch.cuda.LongTensor

Created on 12 Feb 2019 · 8 comments · Source: flairNLP/flair

I recently ran into the following issue while training a sequence tagger. Does anyone know why this is happening? I am using the code from the tutorial.

[Screenshot of the error traceback: can't assign a list to a torch.cuda.LongTensor]

Thanks!

Label: bug

All 8 comments

Hi @gccome, which version of Flair are you using, and which embeddings? Could you paste the code that triggers this error?
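
If you are not sure which version you have installed, pip will show it:

    pip show flair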

I've seen this error while playing around with the ELMo Transformer embeddings yesterday. It worked before #459, so @gccome could you try going back to f4a5033a56c0182e73890b9b7247e1e249ad8282? I think the error message won't appear there.

I wasn't able to reproduce it on another (public) dataset, but I'm still looking for a step-by-step way to reproduce it.
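
For reference, going back to that commit would look something like this, assuming Flair is installed from a local clone of the repository (the editable install is just one way to do it):

    git clone https://github.com/flairNLP/flair.git
    cd flair
    git checkout f4a5033a56c0182e73890b9b7247e1e249ad8282
    pip install -e .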

Hi @alanakbik I was using the most up-to-date Flair code from this repo. For embeddings, I tried various combinations, but even the simplest setup (only CharacterEmbeddings) didn't work.

Hi @stefan-it Thanks for the tip. I went back to #450 and the error disappeared. But does that mean I won't be able to use the new release? Below is my code:

    # Imports (module paths as of Flair 0.4; logger is a standard logging.Logger)
    import logging

    from flair.data_fetcher import NLPTaskDataFetcher
    from flair.embeddings import CharacterEmbeddings, StackedEmbeddings, WordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer
    from torch.optim import Adam

    logger = logging.getLogger(__name__)

    # 1. load dataset
    corpus = NLPTaskDataFetcher.load_column_corpus(data_folder=data_dir,
                                                   column_format=column_format,
                                                   train_file=train_filename,
                                                   dev_file=val_filename,
                                                   test_file=test_filename)
    logger.info("Corpus info - {}".format(str(corpus)))

    # 2. what tag do we want to predict?
    tag_type = 'ner'

    # 3. make the tag dictionary from the corpus
    tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
    logger.info("Tag info - {}".format(str(tag_dictionary.idx2item)))

    # 4. initialize embeddings
    embedding_types = [
        CharacterEmbeddings()
    ]

    # These are my customized embeddings, already converted to Gensim format
    if custom_embedding_paths is not None:
        custom_embeddings = [WordEmbeddings(custom_embedding_path) for custom_embedding_path in custom_embedding_paths]
        embedding_types += custom_embeddings

    embeddings = StackedEmbeddings(embeddings=embedding_types)

    # 5. initialize sequence tagger

    tagger = SequenceTagger(hidden_size=hidden_size,
                            embeddings=embeddings,
                            tag_dictionary=tag_dictionary,
                            tag_type=tag_type,
                            use_crf=use_crf,
                            use_rnn=True,
                            rnn_layers=rnn_layers,
                            dropout=0.0,
                            word_dropout=0.05,
                            locked_dropout=0.5
                            )

    # 6. initialize trainer

    trainer = ModelTrainer(tagger, corpus, optimizer=Adam)

    # 7. start training
    training_hist = trainer.train(resource_dir,
                                  learning_rate=learning_rate,
                                  mini_batch_size=mini_batch_size,
                                  eval_mini_batch_size=eval_mini_batch_size,
                                  max_epochs=max_epochs,
                                  anneal_factor=0.5,
                                  patience=patience,
                                  anneal_against_train_loss=False,
                                  train_with_dev=False,
                                  monitor_train=True,
                                  embeddings_in_memory=True,
                                  checkpoint=False,
                                  save_final_model=True,
                                  anneal_with_restarts=False,
                                  test_mode=False,
                                  param_selection_mode=False)

Thanks! Could it be that the error comes from the CharacterEmbeddings class? What if you try a setup without character embeddings?
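
One quick way to isolate it is to embed a single sentence with each embedding class on its own. A minimal sketch, assuming a Flair 0.4-style setup ('glove' is just an example and downloads on first use):

    from flair.data import Sentence
    from flair.embeddings import CharacterEmbeddings, WordEmbeddings

    # Embed one sentence with each embedding type separately; if only
    # CharacterEmbeddings raises the LongTensor error, the problem is in the
    # character-level lookup rather than in the tagger or the corpus.
    sentence = Sentence('I love Berlin')
    for embedding in [WordEmbeddings('glove'), CharacterEmbeddings()]:
        embedding.embed(sentence)
        print(type(embedding).__name__, 'embedded without error')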

@gccome should be fixed now on master - could you check?
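
In case it helps with testing, the current master can be installed straight from GitHub, for example:

    pip install --upgrade git+https://github.com/flairNLP/flair.git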

Let me check and update you soon.

Hi @alanakbik , thanks for the quick fix! The error disappeared.

Great, thanks for reporting this!
