dask_ml.xgboost.XGBRegressor

class dask_ml.xgboost.XGBRegressor(max_depth=3, learning_rate=0.1, n_estimators=100, verbosity=1, silent=None, objective='reg:linear', booster='gbtree', n_jobs=1, nthread=None, gamma=0, min_child_weight=1, max_delta_step=0, subsample=1, colsample_bytree=1, colsample_bylevel=1, colsample_bynode=1, reg_alpha=0, reg_lambda=1, scale_pos_weight=1, base_score=0.5, random_state=0, seed=None, missing=None, importance_type='gain', **kwargs)
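A minimal usage sketch follows. It assumes a dask.distributed Client is available (dask_ml.xgboost hands the actual training off to the distributed scheduler), uses synthetic dask arrays purely for illustration, and passes hyperparameter values that match the defaults in the signature above.

    import dask.array as da
    from dask.distributed import Client
    from dask_ml.xgboost import XGBRegressor

    # Training is dispatched to the active distributed scheduler,
    # so create (or connect to) a Client before calling fit.
    client = Client()

    # Synthetic regression data as dask arrays (illustrative only).
    X = da.random.random((10000, 20), chunks=(1000, 20))
    y = da.random.random(10000, chunks=1000)

    # The values shown match the defaults in the signature above.
    est = XGBRegressor(max_depth=3, learning_rate=0.1, n_estimators=100)
    est.fit(X, y)

    # predict returns a lazy dask collection; compute() materializes it.
    preds = est.predict(X)
    print(preds[:5].compute())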
Attributes

coef_
    Coefficients property.

feature_importances_
    Feature importances property.

intercept_
    Intercept (bias) property.
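A hedged sketch of reading these attributes on a fitted estimator (continuing the example above). The caveats in the comments reflect the underlying xgboost scikit-learn API, where feature_importances_ assumes a tree booster and coef_ / intercept_ are defined only for booster='gblinear'.

    # feature_importances_ is exposed for tree boosters (the default 'gbtree').
    importances = est.feature_importances_
    print(importances.shape)  # one importance value per feature

    # coef_ and intercept_ are defined only for the linear booster;
    # fitting a separate gblinear model here is purely illustrative.
    linear_est = XGBRegressor(booster='gblinear')
    linear_est.fit(X, y)
    print(linear_est.coef_, linear_est.intercept_)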

Methods

apply(X[, ntree_limit])
    Return the predicted leaf index of every tree for each sample.

evals_result()
    Return the evaluation results.

fit(X[, y, eval_set, sample_weight, ...])
    Fit the gradient boosting model.

get_booster()
    Get the underlying xgboost Booster of this model.

get_num_boosting_rounds()
    Get the number of xgboost boosting rounds.

get_params([deep])
    Get parameters for this estimator.

get_xgb_params()
    Get the xgboost-specific parameters.

load_model(fname)
    Load the model from a file.

predict(X)
    Predict with data.

save_model(fname)
    Save the model to a file.

score(X, y[, sample_weight])
    Return the coefficient of determination of the prediction.

set_params(**params)
    Set the parameters of this estimator.

__init__(max_depth=3, learning_rate=0.1, n_estimators=100, verbosity=1, silent=None, objective='reg:linear', booster='gbtree', n_jobs=1, nthread=None, gamma=0, min_child_weight=1, max_delta_step=0, subsample=1, colsample_bytree=1, colsample_bylevel=1, colsample_bynode=1, reg_alpha=0, reg_lambda=1, scale_pos_weight=1, base_score=0.5, random_state=0, seed=None, missing=None, importance_type='gain', **kwargs)
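
A closing persistence sketch, assuming the fitted estimator from the example above; the filename is illustrative, and save_model / load_model operate on the underlying xgboost Booster in its native format.

    # Persist the trained booster to disk (the path is illustrative).
    est.save_model("xgb_regressor.model")

    # load_model repopulates the wrapped Booster on a fresh estimator;
    # in older xgboost releases this restores the booster itself rather
    # than the scikit-learn constructor arguments.
    restored = XGBRegressor()
    restored.load_model("xgb_regressor.model")

    # The underlying xgboost Booster is reachable via get_booster().
    booster = restored.get_booster()
    print(restored.get_num_boosting_rounds())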