dask_ml.xgboost.XGBRegressor

class dask_ml.xgboost.XGBRegressor(*, objective: Optional[Union[str, Callable[[numpy.ndarray, numpy.ndarray], Tuple[numpy.ndarray, numpy.ndarray]]]] = 'reg:squarederror', **kwargs: Any)
Attributes
best_iteration
best_ntree_limit
best_score
coef_

Coefficients property.

feature_importances_

Feature importances property, return depends on importance_type parameter.

intercept_

Intercept (bias) property.

n_features_in_

Number of features seen during fit.

Methods

apply(X[, ntree_limit, iteration_range])

Return the predicted leaf of every tree for each sample.

evals_result()

Return the evaluation results.

fit(X[, y, eval_set, sample_weight, ...])

Fit the gradient boosting model.

get_booster()

Get the underlying xgboost Booster of this model.

get_num_boosting_rounds()

Get the number of xgboost boosting rounds.

get_params([deep])

Get parameters.

get_xgb_params()

Get xgboost specific parameters.

load_model(fname)

Load the model from a file or bytearray.

predict(X)

Predict with X.

save_model(fname)

Save the model to a file.

score(X, y[, sample_weight])

Return the coefficient of determination of the prediction.

set_params(**params)

Set the parameters of this estimator.
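
The `score` method returns the coefficient of determination (R²) of the prediction. As a rough illustration of the quantity being computed (a plain-NumPy sketch of the formula, not the dask-ml implementation, which operates lazily on Dask collections):

```python
import numpy as np

def r2_score_sketch(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = float(np.sum((y_true - y_pred) ** 2))          # residual sum of squares
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))   # total sum of squares
    return 1.0 - ss_res / ss_tot

y = np.array([1.0, 2.0, 3.0, 4.0])
perfect = r2_score_sketch(y, y)                            # exact predictions -> 1.0
baseline = r2_score_sketch(y, np.full_like(y, y.mean()))   # mean predictor -> 0.0
```

A perfect fit scores 1.0; always predicting the mean of `y_true` scores 0.0, and worse-than-mean models can score below zero.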

__init__(*, objective: Optional[Union[str, Callable[[numpy.ndarray, numpy.ndarray], Tuple[numpy.ndarray, numpy.ndarray]]]] = 'reg:squarederror', **kwargs: Any) -> None
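
Per the signature, `objective` accepts either a string such as `'reg:squarederror'` or a callable mapping `(y_true, y_pred)` arrays to a `(gradient, hessian)` pair, following xgboost's custom-objective convention. A minimal sketch of such a callable (the function name is illustrative; this reproduces the squared-error objective by hand):

```python
from typing import Tuple
import numpy as np

def squared_error_objective(
    y_true: np.ndarray, y_pred: np.ndarray
) -> Tuple[np.ndarray, np.ndarray]:
    """Gradient and hessian of 0.5 * (y_pred - y_true)**2 w.r.t. y_pred."""
    grad = y_pred - y_true       # first derivative of the loss
    hess = np.ones_like(y_pred)  # second derivative (constant for squared error)
    return grad, hess

# Passing this callable should behave like objective='reg:squarederror':
# est = XGBRegressor(objective=squared_error_objective)
grad, hess = squared_error_objective(np.array([1.0, 2.0]), np.array([1.5, 1.0]))
```

Returning the per-sample gradient and hessian (rather than the loss itself) is what lets xgboost perform its second-order boosting updates with an arbitrary differentiable objective.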