Xgboost: predict after cross-validation using xgboost [question]

Created on 1 Nov 2014 · 3 comments · Source: dmlc/xgboost

This is my first trial with xgboost (very fast!), but I'm a little confused.
In fact, I trained a model using xgb.cv as follows:
xgbmodel <- xgb.cv(params = param, data = trainingdata, nrounds = 100, nfold = 5, showsd = TRUE, metrics = 'logloss')
Now I want to predict on my test set, but xgbmodel seems to be a logical value (TRUE in this case).
How can I predict after cv? Should I use xgb.train instead?
HR

Most helpful comment

Yes, xgb.cv does not return a model but the CV history of the process, since in cross-validation we train n models to evaluate the result.

A typical use case of CV is parameter selection: use xgb.cv to find good parameters, then use xgb.train to fit the model on the entire dataset.
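A minimal sketch of that workflow in R, reusing the names from the question (`param`, `trainingdata`; `testdata` is an assumed xgb.DMatrix for the test set — note that the exact column name of the CV log may vary across xgboost versions):

```r
library(xgboost)

# 1) Cross-validation only evaluates a parameter setting;
#    the return value is the CV history, not a fitted model.
cv <- xgb.cv(params = param, data = trainingdata, nrounds = 100,
             nfold = 5, showsd = TRUE, metrics = "logloss")

# 2) Pick the number of rounds with the best held-out logloss
#    (in recent versions the history is in cv$evaluation_log).
best_n <- which.min(cv$evaluation_log$test_logloss_mean)

# 3) Retrain on the full training set, then predict on the test set.
model <- xgb.train(params = param, data = trainingdata, nrounds = best_n)
preds <- predict(model, testdata)
```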

All 3 comments


OK, it's clearer now.

Hi,

There is a parameter prediction=TRUE in xgb.cv, which returns the predictions on the CV folds. But it is not clear from the documentation for which nround the predictions are returned.
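For reference, in later versions of the R package `prediction = TRUE` makes xgb.cv return the out-of-fold predictions in a `$pred` element; as I understand it (an assumption worth checking against your version's docs), they come from the last round trained, or the best iteration when early stopping is used:

```r
library(xgboost)

cv <- xgb.cv(params = param, data = trainingdata, nrounds = 100,
             nfold = 5, metrics = "logloss",
             prediction = TRUE)   # keep out-of-fold predictions

# One prediction per training row, produced by the model of the fold
# in which that row was held out. These correspond to the final
# boosting round (or best_iteration if early stopping was enabled).
oof_preds <- cv$pred
```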
