Transformers: Attempted relative import with no known parent package

Created on 26 Jun 2020 · 14 Comments · Source: huggingface/transformers

🐛 Bug

Information

Model I am using: Bertabs (https://github.com/huggingface/transformers/tree/master/examples/seq2seq/bertabs)

Language I am using the model on: English

The problem arises when using:
  • [x] the official example scripts: (give details below)

The task I am working on is:

  • [x] my own task or dataset: (give details below)

To reproduce

Hi, I did as the README says, but the following error is thrown when I want to start training via:

python run_summarization.py --documents_dir "data/stories" --summaries_output_dir "out" --no_cuda false --batch_size 4 --min_length 50 --max_length 200 --beam_size 5 --alpha 0.95 --block_trigram true

File "run_summarization.py", line 15, in <module> from .utils_summarization import ( ImportError: attempted relative import with no known parent package

I am using Win10, and an Anaconda Env.

Steps to reproduce the behavior:

  1. Install a new Anaconda Env with torch
  2. Do as the Readme says
  3. Run the command above, either from a script file or directly in the console.

Expected behavior

Training starts, or an error message appears that I can resolve.

Environment info

Using Windows 10, and an Anaconda Env.

Thank you

bertabs

All 14 comments

Changing line 15 in run_summarization.py from
from .utils_summarization import (
to
from utils_summarization import (
fixed that error.
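
The underlying reason is that Python only allows relative imports inside a package, and a script executed directly (python run_summarization.py) has no parent package. A minimal sketch of the failure mode, using hypothetical file and function names rather than the real example layout:

    # my_dir/run_me.py, executed directly with `python run_me.py`
    # (hypothetical layout: my_dir/ contains run_me.py and helpers.py, no __init__.py)

    # from .helpers import do_work  # ImportError: attempted relative import with no known parent package
    from helpers import do_work     # works: my_dir/ is on sys.path, so the absolute import resolves

    if __name__ == "__main__":
        do_work()

The same reasoning applies to run_summarization.py, which is meant to be run as a standalone script from the bertabs directory.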

@sshleifer - can you take a look here maybe?

I have a similar error. When I changed from .utils_summarization import ( to from utils_summarization import (, I got another error:
/content/transformers/examples/seq2seq/bertabs
Traceback (most recent call last):
  File "run_summarization.py", line 12, in <module>
    from modeling_bertabs import BertAbs, build_predictor
  File "/content/transformers/examples/seq2seq/bertabs/modeling_bertabs.py", line 30, in <module>
    from configuration_bertabs import BertAbsConfig
  File "/content/transformers/examples/seq2seq/bertabs/configuration_bertabs.py", line 19, in <module>
    from transformers import PretrainedConfig
ModuleNotFoundError: No module named 'transformers'

Can you help me please?

@Hildweig did you install the dependencies with pip install -r requirements.txt?

Also, to make the example run, I had to change the model used in run_summarization.py to
remi/bertabs-finetuned-extractive-abstractive-summarization
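
For reference, a hedged sketch of what that model swap amounts to; the surrounding code is paraphrased rather than copied from run_summarization.py, and the use of from_pretrained on the local BertAbs class is an assumption based on the traceback above:

    # Hedged sketch, not a verbatim diff of run_summarization.py.
    from modeling_bertabs import BertAbs  # local module in examples/seq2seq/bertabs

    # Point the checkpoint at the fully qualified hub id mentioned above (assumed call):
    model = BertAbs.from_pretrained("remi/bertabs-finetuned-extractive-abstractive-summarization")
    model.eval()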

I did install the requirements, thank you! Changing the model in run_summarization.py fixed the issue. However, the output is very bad and makes no sense; this is what I got for a good text:

the u.s. is on the way to meet with the u.s.. the u.s. has pledged to help ease ease ease the situation. the u.n. is the most most most likely to provide provide provide a detailed detailed detailed detail. the u..s. is due to the u.s. , and and the u.s. will provide provide some areas to help

Did you get a good summary?

Why are you guys using bertabs?
It seems to not work very well, according to #3647 .

@MichaelJanz would you be willing to contribute a PR with your changes?

@sshleifer what do you suggest using for abstractive summarization?

@Hildweig It depends on what your goal is.

For getting good scores on the cnndm dataset, I'd recommend sshleifer/distilbart-cnn-12-6 and the finetuning in examples/seq2seq/finetune.py.

For your own data, start with a cnn checkpoint if you want 3 sentence summaries and an xsum checkpoint if you want 1 sentence summaries.

For running experiments to see how summarization finetuning works, you can start from bart-large, but these experiments are slow.

To make a useful open source contribution, you could try modifying the hyperparameters in ./train_distilbart_cnn.sh to try to get a high score. Bonus points if you use --logger wandb_shared.

Also, I recently updated the setup instructions and script; they're now on master here. There are some tips in there that try to cover common cases.

Speed/Rouge tradeoffs of different bart variants are detailed here.
There is also a tweet describing the distilbart project at a high level.
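
For anyone who wants to try the suggested checkpoint quickly, a rough sketch using the standard transformers summarization pipeline; the generation lengths are illustrative, not tuned values from this thread:

    # Rough sketch: run the suggested distilbart checkpoint through the summarization pipeline.
    from transformers import pipeline

    summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

    article = "Put the document you want to summarize here."
    result = summarizer(article, min_length=40, max_length=150, do_sample=False)
    print(result[0]["summary_text"])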

@Hildweig I tested different input data and, judged qualitatively, it performed well on a random book review: https://bookpage.com/reviews/25272-deb-caletti-girl-unframed-ya#.Xvmf-CgzaUm

The result was:

_sydney has to spend the summer with her mother , the once - famous movie star , lila shore , at her sumptuous mansion in san francisco 's exclusive sea cliff neighborhood. sydney has a lot of freedom to explore the city on her own ,
which is how she meets nicco and begins a relationship that will unexpectedly change all of their lives forever. in the book , sydney 's story appears to suggest that
the book itself is her testimony about the lead - up to a terrible crime. as a woman in the world , it often means being looked at but not seen by it , she says_

However, using larger texts such as a Sherlock Holmes novel did not work well, again without any metrics considered.

@sshleifer
Sure, I can create a PR.
Also, thank you for the hint; I am grateful for any advice I can get for my master's thesis.

May I ask for your advice on my goal?
I want to build a system that can create summaries of book reviews for Instagram posts in German. I am thinking of using German BERT (or similar) and fine-tuning it on a dataset I still have to collect. Do you have any advice you would like to share?

However, the generated text does not look abstractive, but rather extractive.

PR created in #5355.

@sshleifer thank you! My goal is to fine-tune it on the DUC dataset.
@MichaelJanz in my case, I have to summarize huge documents (17 pages or so).

Hi @sshleifer and @MichaelJanz,

This seems to be a problem of Python package structure. I am getting a similar error, but with the token-classification/run_ner.py file.

File "examples/token-classification/run_ner.py", line 42, in
from modeling_auto import AutoModelForTokenClassification
File "transformers/src/transformers/modeling_auto.py", line 22, in
from .configuration_auto import (
ImportError: attempted relative import with no known parent package

I have not installed the transformers library using pip because I want to use the local code (cloned from the transformers repository). After reading various Stack Overflow suggestions (https://stackoverflow.com/questions/16981921/relative-imports-in-python-3 and https://napuzba.com/a/import-error-relative-no-parent), I believe that when I import the transformers package locally from my own directory, Python is not able to read and load transformers as a package.

I am using Python 3.7.

Can you please suggest how to use transformers as a package from the local code?

Thanks...
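
Not stated in the thread, but the usual way to work from a clone is an editable install (pip install -e . from the repository root), which registers the package without copying it. A quick sys.path workaround, assuming the clone lives at /path/to/transformers, looks roughly like this:

    # Workaround sketch: make a local transformers clone importable without pip-installing it.
    # Adjust the path to wherever you cloned the repository; the package lives under src/.
    import sys

    sys.path.insert(0, "/path/to/transformers/src")

    from transformers import PretrainedConfig  # now resolves against the local clone
    print(PretrainedConfig.__module__)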
