pyGPGO.surrogates.BoostedTrees module

class pyGPGO.surrogates.BoostedTrees.BoostedTrees(q1=0.16, q2=0.84, **params)[source]

Bases: object

Gradient boosted trees as a surrogate model for Bayesian Optimization. Uses quantile regression for an estimate of the ‘posterior’ variance: in practice, the std is computed as half the distance between the predicted q2 and q1 quantiles, (q2 - q1) / 2. Relies on sklearn.ensemble.GradientBoostingRegressor.

Parameters:
  • q1 (float) – First quantile.
  • q2 (float) – Second quantile.
  • params – Extra keyword arguments passed to GradientBoostingRegressor.
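
A minimal usage sketch, assuming the standard pyGPGO workflow from the project README (GPGO driver plus Acquisition); the objective function, parameter bounds and extra keyword arguments below are purely illustrative:

    import numpy as np
    from pyGPGO.acquisition import Acquisition
    from pyGPGO.surrogates.BoostedTrees import BoostedTrees
    from pyGPGO.GPGO import GPGO

    def f(x):
        # Illustrative 1-D objective to maximise.
        return -((x - 2) ** 2)

    # Extra keyword arguments are forwarded to GradientBoostingRegressor.
    bt = BoostedTrees(q1=0.16, q2=0.84, n_estimators=100)
    acq = Acquisition(mode='ExpectedImprovement')
    param = {'x': ('cont', [0, 5])}

    gpgo = GPGO(bt, acq, f, param)
    gpgo.run(max_iter=10)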
__init__(q1=0.16, q2=0.84, **params)[source]

Gradient boosted trees as a surrogate model for Bayesian Optimization. Uses quantile regression for an estimate of the ‘posterior’ variance: in practice, the std is computed as half the distance between the predicted q2 and q1 quantiles, (q2 - q1) / 2. Relies on sklearn.ensemble.GradientBoostingRegressor.

Parameters:
  • q1 (float) – First quantile.
  • q2 (float) – Second quantile.
  • params – Extra keyword arguments passed to GradientBoostingRegressor.
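
The default quantiles are presumably chosen so that, under a Gaussian assumption, the predicted q1 and q2 values sit roughly one standard deviation below and above the mean, which is what makes (q2 - q1) / 2 a std estimate. A quick check of that correspondence (scipy assumed available):

    from scipy.stats import norm

    # Standard normal CDF at -1 and +1 standard deviations.
    print(norm.cdf(-1.0))  # ~0.1587, close to the default q1 = 0.16
    print(norm.cdf(+1.0))  # ~0.8413, close to the default q2 = 0.84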
fit(X, y)[source]

Fit a GBM model to data X and targets y.

Parameters:
  • X (array-like) – Input values.
  • y (array-like) – Target values.
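
The source is not reproduced here, but a fit of this kind can be sketched as three GradientBoostingRegressor models: one for the predictive mean and one per quantile (loss='quantile' with alpha set to q1 and q2). The data and hyperparameters below are illustrative:

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 5, size=(30, 1))
    y = np.sin(X).ravel()

    q1, q2 = 0.16, 0.84
    params = {'n_estimators': 100}

    # One model for the mean, two quantile models for the spread.
    mod_mean = GradientBoostingRegressor(**params).fit(X, y)
    mod_lo = GradientBoostingRegressor(loss='quantile', alpha=q1, **params).fit(X, y)
    mod_hi = GradientBoostingRegressor(loss='quantile', alpha=q2, **params).fit(X, y)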
predict(Xstar, return_std=True, eps=1e-06)[source]

Predicts ‘posterior’ mean and standard deviation for the GBM model.

Parameters:
  • Xstar (array-like) – Input values.
  • return_std (bool, optional) – Whether to return posterior std estimates. Default is True.
  • eps (float, optional) – Floor value used in place of negative std estimates. Default is 1e-6.
Returns:

  • array-like – Posterior predicted mean.
  • array-like – Posterior predicted std.
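
Continuing the sketch under fit() above, the mean prediction comes from the mean model and the std from half the distance between the two quantile predictions, floored at eps so it never goes negative:

    import numpy as np

    Xstar = np.linspace(0, 5, 100).reshape(-1, 1)
    eps = 1e-6

    ymean = mod_mean.predict(Xstar)
    # (q2 - q1) / 2 in prediction space, floored at eps.
    ystd = np.maximum((mod_hi.predict(Xstar) - mod_lo.predict(Xstar)) / 2, eps)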

update(xnew, ynew)[source]

Updates the internal GBM model with observations xnew and targets ynew.

Parameters:
  • xnew (array-like) – New observations.
  • ynew (array-like) – New targets.
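
A straightforward way to implement such an update is to append the new observations to the stored training data and refit, since boosted trees have no cheap incremental update. A hedged sketch, assuming fit() keeps the training data as surrogate.X and surrogate.y:

    import numpy as np

    def update_sketch(surrogate, xnew, ynew):
        # Concatenate the new observations with the stored history and refit.
        X = np.concatenate((surrogate.X, xnew), axis=0)
        y = np.concatenate((surrogate.y, ynew))
        surrogate.fit(X, y)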