https://github.com/simonblanke/surrogate-models
A collection of surrogate models for sequential model-based optimization techniques
- Host: GitHub
- URL: https://github.com/simonblanke/surrogate-models
- Owner: SimonBlanke
- License: mit
- Created: 2021-02-25T09:05:48.000Z (about 5 years ago)
- Default Branch: main
- Last Pushed: 2021-05-03T18:52:38.000Z (about 5 years ago)
- Last Synced: 2025-01-12T19:21:14.393Z (over 1 year ago)
- Language: Python
- Size: 4.88 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Surrogate Models
A collection of surrogate models (wrapper classes) for sequential model-based optimization techniques, used in Hyperactive and Gradient-Free-Optimizers. Each wrapper exposes the same interface: `fit(X, y)` and `predict(X, return_std=False)`, where the optional standard deviation is what the acquisition function consumes.
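The shared `predict(X, return_std=True)` interface exists so any of these surrogates can be plugged into an acquisition function. As a minimal sketch (not part of this repository), here is an expected-improvement acquisition for minimization that works against any object following that interface; the function name and `xi` parameter are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm


def expected_improvement(surrogate, X_candidates, y_best, xi=0.01):
    """Expected improvement (for minimization) from a surrogate's mean/std."""
    mean, std = surrogate.predict(X_candidates, return_std=True)
    mean, std = np.ravel(mean), np.ravel(std)
    std = np.maximum(std, 1e-12)   # guard against division by zero
    imp = y_best - mean - xi       # improvement over the incumbent best
    z = imp / std
    return imp * norm.cdf(z) + std * norm.pdf(z)
```

The next sample is then chosen as the candidate with the highest expected improvement.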
## Bayesian Optimization Surrogate Models
### GPy
```python
import GPy
import numpy as np


class GPySurrogateModel:
    def __init__(self):
        self.kernel = GPy.kern.RBF(input_dim=1)

    def fit(self, X, y):
        self.m = GPy.models.GPRegression(X, y, self.kernel)
        self.m.optimize(messages=False)

    def predict(self, X, return_std=False):
        # GPy's predict returns the posterior mean and *variance*
        mean, var = self.m.predict(X)
        if return_std:
            return mean, np.sqrt(var)
        return mean
```
### GPflow
```python
import gpflow
import numpy as np


class GPflowSurrogateModel:
    def __init__(self):
        self.kernel = gpflow.kernels.Matern52()

    def fit(self, X, y):
        # GPflow expects float64 inputs
        X = X.astype(np.float64)
        y = y.astype(np.float64)
        self.m = gpflow.models.GPR(data=(X, y), kernel=self.kernel)
        opt = gpflow.optimizers.Scipy()
        opt.minimize(
            self.m.training_loss, self.m.trainable_variables, options=dict(maxiter=100)
        )

    def predict(self, X, return_std=False):
        X = X.astype(np.float64)
        # predict_f returns the posterior mean and *variance* as tensors
        mean, var = self.m.predict_f(X)
        mean = np.array(mean)
        std = np.sqrt(np.array(var))
        if return_std:
            return mean, std
        return mean
```
### Decision Tree Ensemble
```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor as _ExtraTreesRegressor_


def _return_std(X, trees, predictions, min_variance):
    """Compute per-sample standard deviation across a tree ensemble.

    Adapted from:
    https://github.com/scikit-optimize/scikit-optimize/blob/master/skopt/learning/forest.py
    """
    std = np.zeros(len(X))
    for tree in trees:
        if isinstance(tree, np.ndarray):
            tree = tree[0]
        # impurity of the leaf each sample lands in acts as within-leaf variance
        var_tree = tree.tree_.impurity[tree.apply(X)]
        var_tree[var_tree < min_variance] = min_variance
        mean_tree = tree.predict(X)
        # law of total variance: E[var] + E[mean^2] - (E[mean])^2
        std += var_tree + mean_tree ** 2
    std /= len(trees)
    std -= predictions ** 2.0
    std[std < 0.0] = 0.0
    std = std ** 0.5
    return std


class ExtraTreesRegressor(_ExtraTreesRegressor_):
    def __init__(self, min_variance=0.001, **kwargs):
        self.min_variance = min_variance
        super().__init__(**kwargs)

    def fit(self, X, y):
        super().fit(X, np.ravel(y))
        return self

    def predict(self, X, return_std=False):
        mean = super().predict(X)
        if return_std:
            std = _return_std(X, self.estimators_, mean, self.min_variance)
            return mean, std
        return mean
```
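A quick way to sanity-check a tree-ensemble surrogate is to fit it on a toy function and confirm that an uncertainty estimate comes out with sensible shape and sign. The sketch below (not part of this repository) uses plain scikit-learn and the spread of per-tree predictions as a cheap stand-in for the impurity-based estimate computed by `_return_std` above; the data and hyperparameters are illustrative:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

model = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, y)

# spread of per-tree predictions as a rough uncertainty estimate
X_test = np.linspace(-3, 3, 10).reshape(-1, 1)
preds = np.stack([t.predict(X_test) for t in model.estimators_])
mean, std = preds.mean(axis=0), preds.std(axis=0)
print(mean.shape, std.shape)  # (10,) (10,)
```

The wrapper class above follows the same fit/predict pattern but refines the uncertainty with per-leaf impurities, matching scikit-optimize's forest surrogate.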