Operating System: Ubuntu 16.04
CPU: Intel
Python version: Python 2.7
Could you tell me what this message means?
[LightGBM] [Info] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Trained a tree with leaves=58 and max_depth=14
"param": {
"task":"train",
"boosting":"gbdt",
"application":"binary",
"num_leaves":63,
"min_data_in_leaf":1000,
"learning_rate":0.1,
"feature_fraction":1.0,
"bagging_freq":50,
"bagging_fraction":0.8
}
@zyoohv
It means that tree learning stops in the current iteration because no further split with positive gain can be found.
I think this is caused by "min_data_in_leaf": 1000; you can set it to a smaller value.
@guolinke
Thank you very much. I changed it according to your recommendation, and now it works! :)
@zyoohv
This is not a bug, it is a feature.
The output message warns the user that the parameters may be wrong, or that the dataset is hard to learn.
And how can I disable this 'feature', if that's possible? :)
You can silence the warning by setting verbose:
For the sklearn interface, set verbose=-1 when defining the model (not in fit).
For the lgb.train interface, set verbose=-1 in the param dict.
I also ran into this problem, but could not fix it with the solution referenced in issue #640.
@xuwiliam it should work if you are on the latest master branch.
If it still doesn't work, please provide reproduction code with randomly generated input data.
Hi there, the problem still occurs even with the latest master.
Using lgb.train with verbose set to -1, you don't get the warning if and only if you don't provide an argument for valid_sets. If valid_sets is provided, the warning is displayed no matter what value you set verbose or verbosity to.
Hope this helps in finding the issue.
Setting lgb.LGBMRegressor(verbose=-1) works!