I am searching for a hyperparameter tuning package for code written directly in TensorFlow (not Keras or TFLearn). Could you make some suggestions?
You can try out Ray Tune, a simple library for scaling hyperparameter search. I mainly use it for TensorFlow model training, but it's framework-agnostic: it works seamlessly with PyTorch, Keras, etc. Here's the docs page: ray.readthedocs.io/en/latest/tune.html
You can use it to run distributed versions of state-of-the-art algorithms such as HyperBand or Bayesian Optimization in about 10 lines of code.
As an example, to run 4 parallel evaluations at a time:
You also don't need to change your code if you want to run this script on a cluster.
Disclaimer: I work on this project - let me know if you have any feedback!
I'd like to add one more library to @jdehesa's list, which I have applied in my research, particularly with TensorFlow. It's hyper-engine, Apache 2.0 licensed.
It also implements Gaussian Process Bayesian optimization and some other techniques, like learning curve prediction, which can save a lot of time.
Usually you don't need to have your hyperparameter optimisation logic coupled with the optimised model (unless your hyperparameter optimisation logic is specific to the kind of model that you are training, in which case you would need to tell us a bit more). There are several tools and packages available for the task. Here is a good paper on the topic, and here is a more practical blog post with examples.
Out of these, I have only really (that is, with a real problem) used hyperopt with TensorFlow, and it didn't take too much effort. The API is a bit weird at some points and the documentation is not terribly thorough, but it does work and seems to be under active development, with more optimization algorithms and adaptations (e.g. specifically for neural networks) possibly coming. However, as suggested in the previously linked blog post, Scikit-Optimize is probably as good, and SigOpt looks quite easy to use if it fits you.
I found scikit-optimize very simple to use for Bayesian optimization of hyperparameters, and it works with any TensorFlow API (Estimator, custom Estimator, core, Keras, etc.)
https://stackoverflow.com/a/53582472/2218905
You could use variational inference (Bayesian) to treat the optimization space as a point cloud; that would make hyperparameter tuning much better. TensorFlow Probability would be one approach.
I'm not sure if these are the parameters you want, but since you mentioned TensorFlow hyperparameters, I can suggest some.
Try cloning this repository to get the needed scripts;
then, in the Master folder, open your command prompt and run this line
to get the list of optional arguments.
Source: https://codelabs.developers.google.com/codelabs/tensorflow-for-poets/#6