I want to optimize an algorithm that has several variable parameters as input.
For machine learning tasks, scikit-learn offers hyperparameter optimization with its grid search functionality.
Is there a standard way / library in Python for optimizing hyperparameters that is not limited to machine learning topics?
You might consider scipy's optimize.brute, which is essentially the same, although less constrained in terms of API usage. You just have to define a function that returns a scalar.
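A minimal sketch of `optimize.brute` (the toy objective and the ranges are my own invention, chosen only to illustrate the API):

```python
import numpy as np
from scipy import optimize

# Toy objective: any callable mapping a parameter vector to a scalar will do.
def objective(z):
    x, y = z
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + np.sin(3 * x)

# One slice(start, stop, step) per parameter defines the search grid.
ranges = (slice(-4, 4, 0.25), slice(-4, 4, 0.25))

# finish=optimize.fmin (the default) polishes the best grid point locally.
x0, fval, grid, jout = optimize.brute(
    objective, ranges, full_output=True, finish=optimize.fmin)

print(x0, fval)  # best parameters found and the objective value there
```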
Brute-force functions are not much black magic, and one might often consider rolling an own implementation. The scipy example above has one more interesting feature, though: the `finish` argument (by default `finish=optimize.fmin`), which polishes the best grid point with a local optimizer. I would recommend it for most use cases (in continuous space). But be sure to gain at least a minimal understanding of what it is doing, because there are use cases where you don't want it (when you need discrete-space results, or when function evaluation is slow).
If you are using sklearn, you already have scipy installed (it's a dependency).
Edit: here is a small plot I created (code) to show what `finish` is doing (local optimization) with a 1-d example (not the best example, but easier to plot):

I propose:
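The effect can also be seen numerically; here is a sketch with my own 1-d toy function, comparing `finish=None` to the default local polish:

```python
import numpy as np
from scipy import optimize

def f(x):
    # 1-d toy objective whose true minimum (x = 0.3) lies between grid points
    x = np.atleast_1d(x)
    return (x[0] - 0.3) ** 2

ranges = (slice(-1, 1, 0.5),)  # coarse grid: -1.0, -0.5, 0.0, 0.5

# finish=None: the result is restricted to the grid points themselves.
x_grid = optimize.brute(f, ranges, finish=None)

# Default finish (fmin): the best grid point is refined by a local search.
x_polished = optimize.brute(f, ranges)

print(x_grid, x_polished)
```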
Find detailed documentation here.
You can create a custom pipeline/estimator (see http://scikit-learn.org/dev/developers/contributing.html#rolling-your-own-estimator) with a `score` method to compare the results.
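A minimal sketch of what such an estimator could look like (the class name, parameters, and scoring function are invented for illustration):

```python
import numpy as np
from sklearn.base import BaseEstimator
from sklearn.model_selection import GridSearchCV

class MyAlgorithm(BaseEstimator):
    """Hypothetical wrapper exposing an arbitrary algorithm's parameters."""

    def __init__(self, alpha=1.0, beta=0.0):
        self.alpha = alpha
        self.beta = beta

    def fit(self, X, y=None):
        # Run your algorithm here; for this sketch there is nothing to fit.
        return self

    def score(self, X, y=None):
        # Replace with your own quality measure; higher must mean better.
        return -((self.alpha - 0.3) ** 2 + (self.beta - 0.7) ** 2)

# Dummy data: GridSearchCV only needs something it can split into folds.
X = np.zeros((4, 1))
search = GridSearchCV(
    MyAlgorithm(),
    {"alpha": [0.1, 0.3, 0.5], "beta": [0.5, 0.7]},
    cv=2)
search.fit(X)
print(search.best_params_)
```

Since the default scorer falls back to the estimator's own `score` method, no labels are needed here.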
The ParameterGrid might help you too. It will automatically enumerate all combinations of the hyper-parameter settings.
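A short sketch of `ParameterGrid` on an invented parameter space (the parameter names are placeholders, not tied to any particular algorithm):

```python
from sklearn.model_selection import ParameterGrid

# Hypothetical parameter space for an arbitrary, non-ML algorithm.
grid = ParameterGrid({"step_size": [0.1, 0.5], "iterations": [10, 100, 1000]})

for params in grid:
    print(params)  # each entry is a dict with one combination of settings
```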