Transformers: Should be able to turn off logging

Created on 27 Feb 2020 · 11 comments · Source: huggingface/transformers

🚀 Feature request

When running a simple pipeline, I want to suppress output like:

Downloading: 100%|██████████| 230/230 [00:00<00:00, 136kB/s]
convert squad examples to features: 100%|██████████| 1/1 [00:00<00:00, 241.08it/s]
add example index and unique id: 100%|██████████| 1/1 [00:00<00:00, 7037.42it/s]

Motivation

Your contribution

My code is pretty straightforward:

    args = parse_args()
    with open(args.text_path, "r") as f:
        context = f.read()
    # print(context)

    tokenizer = AutoTokenizer.from_pretrained(args.model)
    model = AutoModelForQuestionAnswering.from_pretrained(args.model)

    qa = pipeline('question-answering',
                  model='distilbert-base-uncased-distilled-squad', tokenizer='bert-base-cased')
    response = qa(context=context,
                  question=args.question)

    print(response['answer'])
wontfix

All 11 comments

Any progress on this? Has anyone found a way to disable this logging?

The issue appears to be tqdm. A work-around is to disable it before importing transformers:

import tqdm

# Replace tqdm with a no-op wrapper that just returns the iterable unchanged.
def nop(it, *a, **k):
    return it
tqdm.tqdm = nop

import transformers
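Alternatively, instead of replacing `tqdm.tqdm` with a no-op, you can keep the class intact and just default every bar to disabled. This is a sketch of a known monkey-patch; it assumes tqdm's `disable` keyword argument:

```python
from functools import partialmethod

from tqdm import tqdm

# Default every tqdm instance to disable=True, so downstream libraries
# that construct bars internally get silent ones.
tqdm.__init__ = partialmethod(tqdm.__init__, disable=True)
```

A caller can still opt back in for a specific loop with `tqdm(iterable, disable=False)`, since an explicit keyword overrides the partialmethod default.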

I agree that it's a valid requirement; we'll look into it.

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

Wish this would stay open.

Logging can be really annoying in applications. There should be some way to turn it off.
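Until there is an official switch, a minimal stopgap is to raise the level of the `transformers` logger tree directly. This is a sketch; it assumes the library names its loggers by module (via `logging.getLogger(__name__)`), so they all sit under the `transformers` namespace:

```python
import logging

# Child loggers such as "transformers.pipelines" have no level of their
# own (NOTSET), so they inherit this effective level from the parent.
logging.getLogger("transformers").setLevel(logging.ERROR)
```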

Here you go:

# To control logging level for various modules used in the application:
import logging
import re
def set_global_logging_level(level=logging.ERROR, prefices=[""]):
    """
    Override the logging level of modules whose logger name starts with one of the given prefixes.
    It needs to be invoked after the modules have been imported, so that their loggers exist.

    Args:
        level: desired level, e.g. logging.INFO. Optional. Default is logging.ERROR.
        prefices: list of one or more str prefixes to match (e.g. ["transformers", "torch"]). Optional.
            Default is `[""]`, which matches all active loggers.
            The match is a case-sensitive `module_name.startswith(prefix)`.
    """
    prefix_re = re.compile(fr'^(?:{"|".join(prefices)})')
    for name in logging.root.manager.loggerDict:
        if re.match(prefix_re, name):
            logging.getLogger(name).setLevel(level)

Usage:

  1. Override all module-specific loggers to a desired level (except for whatever got logged while the modules were being imported):

     import everything, you, need
     import logging
     set_global_logging_level(logging.ERROR)

  2. In the case of transformers you most likely need to call it as:

     import transformers, torch, ...
     import logging
     set_global_logging_level(logging.ERROR, ["transformers", "nlp", "torch", "tensorflow", "tensorboard", "wandb"])

Add or remove modules as needed.

To disable logging globally, place this at the beginning of the script:

import logging
logging.disable(logging.INFO) # disable INFO and DEBUG logging everywhere
# or 
# logging.disable(logging.WARNING) # disable WARNING, INFO and DEBUG logging everywhere

If desired, set_global_logging_level could be expanded into a context manager too.
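Such a scoped variant could be sketched with `contextlib`; the name `global_logging_level` and the restore-on-exit behavior are illustrative choices, not part of the library:

```python
import logging
import re
from contextlib import contextmanager

@contextmanager
def global_logging_level(level=logging.ERROR, prefices=[""]):
    """Temporarily override matching loggers, restoring their previous levels on exit."""
    prefix_re = re.compile(fr'^(?:{"|".join(prefices)})')
    previous = {}  # logger name -> level in effect before entering
    # list() so converting placeholders to loggers doesn't disturb iteration
    for name in list(logging.root.manager.loggerDict):
        if prefix_re.match(name):
            logger = logging.getLogger(name)
            previous[name] = logger.level
            logger.setLevel(level)
    try:
        yield
    finally:
        for name, level_before in previous.items():
            logging.getLogger(name).setLevel(level_before)
```

Usage would then be `with global_logging_level(logging.ERROR, ["transformers"]): ...`, with the noisy calls inside the block and normal verbosity back afterwards.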

Will that kill tqdm? I want to keep tqdm!

> Will that kill tqdm? I want to keep tqdm!

set_global_logging_level(logging.ERROR, ["transformers", "nlp", "torch", "tensorflow", "tensorboard", "wandb"])

from tqdm import tqdm
for i in tqdm(range(10000)): x = i**i

works just fine

and so does disabling all:

set_global_logging_level(logging.ERROR)

from tqdm import tqdm
for i in tqdm(range(10000)): x = i**i

or in the case of the "total logging silence" setting:

import logging
logging.disable(logging.INFO) # disable INFO and DEBUG logging everywhere

from tqdm import tqdm
for i in tqdm(range(10000)): x = i**i

works too.

I don't think tqdm uses logging.

PR with the proposed code, plus the ability to do this during pytest: https://github.com/huggingface/transformers/pull/6816

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
