Like NGBoost, but faster, and in the XGBoost scikit-learn API.

Installation

    $ pip install xgboost-distribution

Usage

XGBDistribution follows the XGBoost scikit-learn API, with an additional keyword argument specifying the distribution (see the documentation for a full list of available distributions):

    from sklearn.datasets import load_boston
    from sklearn.model_selection import train_test_split

    from xgboost_distribution import XGBDistribution

    data = load_boston()
    X, y = data.data, data.target
    X_train, X_test, y_train, y_test = train_test_split(X, y)

    model = XGBDistribution(
        distribution="normal",
        n_estimators=500,
        early_stopping_rounds=10,
    )
    model.fit(X_train, y_train, eval_set=[(X_test, y_test)])

After fitting, we can predict the parameters of the distribution:

    preds = model.predict(X_test)

Note that this returns a namedtuple of numpy arrays, one for each parameter of the distribution (we use the scipy stats naming conventions for the parameters, see e.g. scipy.stats.norm for the normal distribution). One way to consume these arrays is sketched at the end of this section.

XGBDistribution follows the method shown in the NGBoost library, using natural gradients to estimate the parameters of the distribution; a standalone illustration of the natural-gradient computation is also sketched below.

Below, we show a performance comparison of XGBDistribution with the NGBoost NGBRegressor, using the Boston Housing dataset, estimating normal distributions. We note that while the performance of the two models is essentially identical (measured by the negative log-likelihood of a normal distribution and the RMSE), XGBDistribution is 30x faster (timed on both fit and predict steps). Please see the experiments page in the documentation for detailed results across various datasets.

XGBDistribution offers the full set of XGBoost features available in the XGBoost scikit-learn API, allowing, for example, probabilistic regression with monotone constraints.
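To make the predicted parameters concrete, here is a minimal sketch of how the returned arrays might be used, continuing from the usage example above. It assumes the preds, y_test, and loc/scale names from that example and the scipy naming noted there; the 95% interval is an arbitrary choice for illustration:

    import numpy as np
    from scipy import stats

    # preds.loc and preds.scale are numpy arrays, one entry per test sample,
    # following the scipy.stats naming conventions noted above.
    dist = stats.norm(loc=preds.loc, scale=preds.scale)

    # Mean negative log-likelihood of the held-out targets.
    nll = -dist.logpdf(y_test).mean()

    # Symmetric 95% prediction interval for each test sample.
    lower, upper = dist.ppf(0.025), dist.ppf(0.975)

    # The predicted mean doubles as a point prediction, e.g. for RMSE.
    rmse = np.sqrt(np.mean((y_test - preds.loc) ** 2))

This is not part of the library's API; it only shows that the parameter arrays slot directly into scipy.stats distributions.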
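The natural-gradient idea mentioned above can also be illustrated in isolation. The sketch below is not XGBDistribution's internal code; it is a hand-written example, assuming a normal distribution parameterised by (mu, log_sigma), of how the ordinary gradient of the negative log-likelihood is rescaled by the inverse Fisher information matrix to give the natural gradient:

    import numpy as np

    def natural_gradient_normal(y, mu, log_sigma):
        """Illustrative natural gradient of the normal NLL w.r.t. (mu, log_sigma).

        NLL = 0.5 * log(2 * pi) + log_sigma + (y - mu) ** 2 / (2 * sigma ** 2)
        """
        sigma2 = np.exp(2 * log_sigma)

        # Ordinary gradients of the NLL.
        grad_mu = -(y - mu) / sigma2
        grad_log_sigma = 1.0 - (y - mu) ** 2 / sigma2

        # The Fisher information for (mu, log_sigma) is diag(1 / sigma**2, 2),
        # so preconditioning by its inverse simply rescales each component.
        nat_grad_mu = sigma2 * grad_mu          # = -(y - mu)
        nat_grad_log_sigma = 0.5 * grad_log_sigma

        return nat_grad_mu, nat_grad_log_sigma

In a gradient-boosting setting, values like these play the role of the pseudo-residuals that each boosting round fits, which is what makes the distribution parameters learnable with standard tree boosting.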
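The 30x figure quoted above is the library's reported result, not something reproduced here, but a timing comparison along those lines could be set up as in the following sketch. It assumes the ngboost package is installed and reuses the train/test split from the usage example; NGBRegressor settings other than n_estimators are left at their defaults:

    import time

    from ngboost import NGBRegressor

    def time_fit_predict(model, X_train, y_train, X_test):
        """Return elapsed seconds for one fit followed by one predict."""
        start = time.perf_counter()
        model.fit(X_train, y_train)
        model.predict(X_test)
        return time.perf_counter() - start

    xgbd_seconds = time_fit_predict(
        XGBDistribution(distribution="normal", n_estimators=500),
        X_train, y_train, X_test,
    )
    ngb_seconds = time_fit_predict(
        NGBRegressor(n_estimators=500),
        X_train, y_train, X_test,
    )

    print(f"XGBDistribution: {xgbd_seconds:.2f}s  NGBRegressor: {ngb_seconds:.2f}s")

Measured times will depend on hardware and library versions.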