Is it possible to tune hyperparameters using ML Engine while training the model locally? The documentation only mentions training with hyperparameter tuning in the cloud (submitting a job), and makes no mention of doing so locally.
Otherwise, is there another commonly used hyperparameter tuning approach that passes command-line arguments to task.py, as in the census estimator tutorial?
https://github.com/GoogleCloudPlatform/cloudml-samples/tree/master/census
As Puneith said, hyperparameter tuning cannot run locally in ML Engine.
scikit-optimize provides an easy-to-use wrapper that works with any model, including estimators. Just put the code that runs training for N epochs into its own function, and have that function return a metric to minimize, e.g. 1 − accuracy, 1 − AUROC, or the loss.
Source: https://github.com/scikit-optimize/scikit-optimize/blob/master/examples/hyperparameter-optimization.ipynb
Check out Sherpa, an excellent hyperparameter optimization library.
As its documentation says: there are many hyperparameter optimization libraries out there, but with Sherpa one can also visualize the results.
You cannot perform HPTuning (the Bayesian-optimization-based hyperparameter tuning which Cloud ML Engine supports) locally, since it is a managed service that Cloud ML Engine offers. There are other ways to perform hyperparameter tuning, e.g., scikit-learn's GridSearch, but grid search is far less sample-efficient for this task because it exhaustively evaluates every combination in the grid.
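For completeness, a minimal local grid search with scikit-learn's `GridSearchCV` might look like this (the dataset, model, and parameter grid are illustrative choices, not anything specific to Cloud ML Engine):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Every combination in the grid is trained and cross-validated,
# which is why grid search scales poorly with many hyperparameters.
param_grid = {"C": [0.01, 0.1, 1.0, 10.0]}
search = GridSearchCV(LogisticRegression(max_iter=500), param_grid, cv=3)
search.fit(X, y)

print(search.best_params_, search.best_score_)
```

This runs entirely locally, unlike ML Engine's managed tuning, but the cost grows multiplicatively with each hyperparameter you add to the grid.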