I'm trying to implement the approach described in this paper, which suggests doing time series regression by first fitting a simple AR model, and then training a GBM to correct for the AR model's errors.
This strikes me as a good idea because a tree-based model needs many splits to approximate a linear function.
If the loss function is L2 or L1, this is straightforward: fit the AR model first, then train a regressor on the AR model's residuals.
Unfortunately, this isn't so simple with a loss like Poisson. There, I would need the true target value, the AR model's prediction, and the LightGBM regressor's prediction for the AR model's error all at once, which seems difficult because a custom loss must have the signature `objective(y_true, y_pred) -> grad, hess`.
I'd like to know if it's possible to do this with a custom loss function in the Python API.
Is it possible to store additional information in `y_true`? I've thought of a possible hack: store the AR model's predictions in the `weight` argument of `Dataset`, but it looks like I'm not allowed to access those within the custom loss function.
Thanks so much!
Refer to `init_score`.
Oh my god, I feel like such an idiot for having missed this. Thank you!
@John-Curcio not at all!
It's a well-written issue and I expect other people will find it from search engines in the future 😄
Thanks for using LightGBM!