Is it possible to tune hyperparameters when using ML Engine to train a model locally? The documentation only covers hyperparameter tuning for training in the cloud (by submitting a job) and makes no mention of doing so locally.
Otherwise, is there another commonly used hyperparameter tuning tool that passes command-line arguments to task.py, as in the census estimator tutorial?
https://github.com/GoogleCloudPlatform/cloudml-samples/tree/master/census
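For context, the pattern in the census tutorial is that the trainer's task.py reads hyperparameters from command-line flags, which the tuning service then varies per trial. A simplified sketch of that pattern (the flag names here are illustrative, not the sample's exact ones):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--learning-rate', type=float, default=0.01)
parser.add_argument('--num-layers', type=int, default=2)
args = parser.parse_args()
# args.learning_rate and args.num_layers would then be used to build the model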
You cannot run hyperparameter tuning (the Bayesian-optimization-based tuning that Cloud ML Engine supports) locally, since it is a managed service that Cloud ML Engine offers. There are other ways to perform hyperparameter tuning, e.g. scikit-learn's grid search, but they are far less effective at this task.
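For reference, a minimal grid-search sketch with scikit-learn (the toy dataset and model here are just for illustration; grid search exhaustively tries every combination rather than modeling the objective the way Bayesian optimization does):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data; substitute your own features and labels
X, y = make_classification(n_samples=500, random_state=0)

# Every combination in this grid is trained and cross-validated
param_grid = {'n_estimators': [50, 100, 200], 'max_depth': [3, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_, search.best_score_)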
As Puneith said, hyperparameter tuning cannot run locally in ML Engine.
Scikit-Optimize provides an easy-to-use wrapper that works with any model, including estimators. Just put the code that trains for N epochs into its own function and have it return a metric to minimize: the loss, or 1 - accuracy or 1 - AUROC, since the optimizer minimizes its objective.
from skopt import gp_minimize
from skopt.space import Real, Integer

def train(hyperparam_config):
    # Set from the hyperparameters passed in by the optimizer
    learning_rate = hyperparam_config[0]
    num_layers = hyperparam_config[1]
    # Run training with these values (estimator setup omitted)
    res = estimator.train_and_evaluate(...)
    return res['loss']  # return the metric to minimize

hyperparam_config = [Real(0.0001, 0.01, name="learning_rate"),
                     Integer(3, 10, name="num_layers")]

res = gp_minimize(train, hyperparam_config)
with open('results.txt', 'w') as wf:
    wf.write(str(res))
print(res)
Source:
https://github.com/scikit-optimize/scikit-optimize/blob/master/examples/hyperparameter-optimization.ipynb
Check out Sherpa, an excellent hyperparameter optimization library.
Its documentation says:
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly
There are many hyperparameter optimization libraries out there, but with Sherpa one can also visualize the results.
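A minimal sketch of a Sherpa study loop, following the library's documented API (build_model and train_one_epoch are hypothetical helpers you would supply):

import sherpa

parameters = [sherpa.Continuous('learning_rate', [1e-4, 1e-2], scale='log'),
              sherpa.Discrete('num_layers', [3, 10])]
algorithm = sherpa.algorithms.RandomSearch(max_num_trials=20)
study = sherpa.Study(parameters=parameters, algorithm=algorithm,
                     lower_is_better=True)  # the objective is minimized

for trial in study:
    model = build_model(trial.parameters)  # hypothetical helper
    for epoch in range(10):
        loss = train_one_epoch(model)      # hypothetical helper
        study.add_observation(trial=trial, iteration=epoch, objective=loss)
    study.finalize(trial=trial)

Sherpa's dashboard then lets you visualize the trials as the study runs.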