LightGBM: Suppress LightGBM Warning

Created on 30 Dec 2017 · 26 Comments · Source: microsoft/LightGBM

The following line is repeated throughout training, once per iteration, and it doesn't seem to be emitted through Python's standard [warnings](https://docs.python.org/3/library/warnings.html) module:

[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

How can I suppress this warning? It is generated here. Also, if possible, can you tell me what this warning means?

bug

Most helpful comment

for sklearn interface, you can set verbose=-1 when defining the model (not in fit).
for lgb.train interface, you can set verbose=-1 in param dict.

All 26 comments

It means one of the following:

  1. num_leaves is too large; you can set it to a smaller value
  2. min_data is too large
  3. your data is hard to fit

Thanks. How do you suppress these warnings while still reporting the validation metrics via verbose_eval?

for sklearn interface, you can set verbose=-1 when defining the model (not in fit).
for lgb.train interface, you can set verbose=-1 in param dict.

@guolinke What about lgb.cv? Can I suppress this warning in lgb.cv?
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf

I think it can be used in cv as well.

For cv there is no parameter like "verbose"; only "verbose_eval" is there, and setting it to -1 doesn't solve the problem.

I temporarily solved it by decreasing the num_leaves parameter.

set it in param dict, not the function arguments.

setting 'verbose' or 'verbosity' to -1 in the param dict solves this problem for lgb.train,
but it does not help for lgb.cv, or for lgb.train with continued training (i.e. with init_model)

ping @StrikerRUS

@guolinke I confirm that with init_model setting verbose=-1 doesn't work. However, in cv verbose=-1 works well for me.

UPD:
It seems that the problem persists only when both init_model and valid_sets are specified:

Logs are shown:

import lightgbm as lgb
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split

X, y = load_boston(True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
params = {'verbose': -1}

lgb_train = lgb.Dataset(X_train, y_train, params=params, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params=params, free_raw_data=False)
init_gbm = lgb.train(params, lgb_eval)
gbm = lgb.train(params, lgb_train,
                valid_sets=lgb_eval,
                verbose_eval=False,
                init_model=init_gbm)

No logs:

import lightgbm as lgb
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split

X, y = load_boston(True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
params = {'verbose': -1}

lgb_train = lgb.Dataset(X_train, y_train, params=params, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params=params, free_raw_data=False)
init_gbm = lgb.train(params, lgb_eval)
gbm = lgb.train(params, lgb_train,
#                 valid_sets=lgb_eval,
                verbose_eval=False,
                init_model=init_gbm)

Also, the instance passed to valid_sets matters (here it is the same as the training set); no logs:

import lightgbm as lgb
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split

X, y = load_boston(True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
params = {'verbose': -1}

lgb_train = lgb.Dataset(X_train, y_train, params=params, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params=params, free_raw_data=False)
init_gbm = lgb.train(params, lgb_eval)
gbm = lgb.train(params, lgb_train,
                valid_sets=lgb_train,  # <-------
                verbose_eval=False,
                init_model=init_gbm)

@StrikerRUS
What logs are shown?
Including the training information?

I think we need to pass verbose parameter when creating _InnerPredictor:
https://github.com/Microsoft/LightGBM/blob/master/python-package/lightgbm/engine.py#L113 .

@guolinke

what logs are shown?

Only warnings:

[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
... (the same warning repeated for every iteration)

In the example init_model is a Booster, so this line?
https://github.com/Microsoft/LightGBM/blob/c86fe61b7e1fa15d0cdd8e69beef269660ec88c8/python-package/lightgbm/engine.py#L115

Hi there, I copy the content of issue #1486 as requested by @StrikerRUS

Environment info

Operating System: Linux and windows (can't test on Apple)
C++/Python/R version: latest version (I believe Kaggle uses the latest master branch), occurs also on 2.1.0

Reproducible examples

https://www.kaggle.com/ogrellier/lighgbm-with-selected-features

The problem only occurs with lgb.train (LGBMClassifier does not exhibit the same issue) and only if the eval_sets argument is provided.


Let me know if you need any further info.
Thanks, Olivier

Just to recap:

  • The sklearn API works correctly
  • lgb.train has the problem if eval_sets is provided, regardless of verbose, verbosity, or verbose_eval

Hope this helps.

[LightGBM] [Warning] boosting is set=gbdt, boosting_type=gbdt will be ignored. Current value: boosting=gbdt
[LightGBM] [Warning] num_threads is set=4, nthread=-1 will be ignored. Current value: num_threads=4

With the sklearn API I still get the above types of messages during fit(), even if verbose=False is passed to fit() and verbose=-1 is passed in the params dict to the constructor.

Note this is even though I am not passing (ever) boosting_type or nthread.

But if I do model.get_params() I see those two extra parameters listed there even though I never passed them. So I assume the sklearn API is adding them and then LightGBM complains.

@pseudotensor These parameters are regular arguments of the constructor, you should use them instead of aliases in params.

Yes, thanks. It's just a bit odd that these main sklearn ones are listed as aliases.

I would really appreciate it if everyone here could test the fix proposed in #1628 (verbose branch) and report here whether it helped in your case.

@goldentom42 Speaking specifically about your case, passing params={'verbose': -1} in the Dataset constructor (UPD: and removing silent=True) should help even without the fix.

Sorry, but I'm not sure I understand your statement: should I remove silent=True in the lgb.Dataset() call and set verbose=-1 when calling lgb.train?

@goldentom42 Please see the example:

lgb_train = lgb.Dataset(X_train, y_train, params={'verbose': -1}, free_raw_data=False)
lgb_eval = lgb.Dataset(X_test, y_test, params={'verbose': -1}, free_raw_data=False)
gbm = lgb.train({'verbose': -1}, lgb_train,
                valid_sets=lgb_eval,
                verbose_eval=False)

verbose=-1 in both Dataset constructor and train function.

Hi @StrikerRUS, I tested LightGBM on Kaggle (they would normally have the latest version) and I don't see the warnings anymore with 'verbose': -1 in params.

On LightGBM 2.1.2, setting verbose to -1 in both the Dataset and the LightGBM params makes the warnings disappear.

Hope this helps.

@goldentom42 Thanks for your reply! It seems that Kaggle uses the latest available release on PyPI, not the master branch:
https://github.com/Kaggle/docker-python/blob/cd1e6ac7d076775af2a5bfbcb65bfd98ea6629de/Dockerfile#L78

We are going to merge the fix into the master branch and close this issue. So, the latest code should work fine with either the silent=True argument or the 'verbose': -1 parameter.
However, feel free to report any cases left uncovered by the fix.

I still get
[LightGBM] [Warning] num_threads is set=1, n_jobs=-1 will be ignored. Current value: num_threads=1
because of conflicting options. Not sure why certain parameters override other parameters and warn you about it.

@joseortiz3 We use alphabetical order to override the parameters; therefore, num_threads will override n_jobs.

Thanks for the messages in this issue.

In short, can we take it that the model still trains correctly even though such a message is printed?

verbose=-1. I also tried everything mentioned above and everything I found in the docs. It's too annoying.
