Before starting to implement a solution for my problem, I want to be sure I am not "reinventing the wheel" and that I can reuse work someone has done before. So my problem is:
I have made an image matcher using the OpenCV library. The matcher receives a set of image files and tries to find similar images in a database. At the end it returns statistical results in terms of the ROC curve definition (the numbers of True Positive, True Negative, False Positive and False Negative matches). These results vary with the values of the OpenCV algorithm's parameters, of which there are about ten. The goal of adjusting the parameters is to get more True Positive matches and fewer False Positive matches. Since I have to tune roughly ten parameters, a brute-force adjuster would be very slow. By brute force I mean:
    int param1 = 0;
    while (param1 < 50) {
        int param2 = 0;
        while (param2 < 50) {
            int param3 = 0;
            while (param3 < 50) {
                // ... one more nested loop per remaining parameter
                PerformMatching();
                param3 += 2;
            }
            param2++;
        }
        param1 += 5;
    }
What I want to do is choose parameters randomly and then analyze whether the statistical results improve. That analysis would then feed back into the method that randomly generates parameters, so that it picks better parameter sets over time.
So my question is: is there a Java library, or an AI algorithm, that will return a better parameter set based on an evaluation of the True Positive and False Positive counts?
I would appreciate some help, thanks.
There is a range of techniques for algorithm optimization, from the simple grid search in your example to various adaptive algorithms. If you want to learn about the more advanced techniques, I recommend starting by looking at Frank Hutter's research. His PhD thesis contains a great overview of the field.
If you don't want to spend too much time, one big improvement you could make is to generate your parameters randomly instead of on a regular grid. That way, if some parameters turn out to be unimportant, you won't have wasted runs exploring them while the important ones were held fixed.
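You want something like this minimal Python sketch, where perform_matching is a hypothetical stand-in for your PerformMatching() that returns a score to maximize (say, True Positives minus False Positives), and the [0, 50) ranges mirror the loops in the question:

    import random

    N_TRIALS = 1000  # sampling budget; use whatever you can afford

    best_score, best_params = float("-inf"), None
    for _ in range(N_TRIALS):
        # draw every parameter at random instead of stepping through a grid
        params = [random.uniform(0, 50) for _ in range(10)]
        score = perform_matching(params)  # hypothetical: your matcher's evaluation
        if score > best_score:
            best_score, best_params = score, params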
Another advantage of this approach is that you can collect sample points for as long as you want, rather than waiting for a whole grid to be explored. You can do even better with a quasi-random parameter sequence (e.g., a Sobol sequence), which keeps the points evenly spread out. There are libraries that will generate these for you.
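For example, SciPy's scipy.stats.qmc module (SciPy 1.7+) generates Sobol sequences; on the Java side, Apache Commons Math has a SobolSequenceGenerator. A Python sketch, again assuming ten parameters in [0, 50):

    from scipy.stats import qmc

    sampler = qmc.Sobol(d=10, scramble=True)  # one dimension per parameter
    unit_points = sampler.random(n=128)       # n should be a power of two for Sobol
    # stretch each coordinate from [0, 1) to its real range, here [0, 50)
    points = qmc.scale(unit_points, [0] * 10, [50] * 10)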
Once you have generated your points, you can simply choose the best combination, analyze them with graphing tools, or run a mode-finding algorithm such as MeanShift over the well-scoring ones.
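A sketch of the mode-finding step with scikit-learn's MeanShift, where points and scores are placeholders for the parameter vectors and matcher scores collected above:

    import numpy as np
    from sklearn.cluster import MeanShift

    # points: one row of parameters per trial; scores: that trial's score
    points = np.asarray(points)
    scores = np.asarray(scores)

    good = points[scores > np.percentile(scores, 90)]  # keep the top 10% of trials

    ms = MeanShift()            # bandwidth is estimated automatically if omitted
    ms.fit(good)
    print(ms.cluster_centers_)  # modes: promising regions of parameter space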
Hill Climbing
You could try a stochastic optimization algorithm, e.g., Hill Climbing, in which you start with a random solution (as @Don Reba pointed out) and look at the set of neighboring solutions for ones that are better with respect to the cost function. I will use some sample Python code to explain the idea.
Get neighbor solutions
For neighbors in your case, you can use a simple function like the sketch below, which changes one component at a time by ±step and keeps each value inside a [lo, hi] range (the bounds and step are assumptions for the demo):
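    def get_neighbors(solution, step=1, lo=0, hi=5):
        """All solutions reachable by changing one component by +/- step."""
        neighbors = []
        for i in range(len(solution)):
            for delta in (step, -step):
                neighbor = list(solution)    # copy; leave the original intact
                neighbor[i] += delta
                if lo <= neighbor[i] <= hi:  # stay inside the parameter range
                    neighbors.append(neighbor)
        return neighbors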
If you have a current solution of [1,3,4,2,2], by incrementing or decrementing any of the components, you get 10 different neighbors as below:
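With the sketch above, the ten neighbors come out as:

    >>> for n in get_neighbors([1, 3, 4, 2, 2]):
    ...     print(n)
    [2, 3, 4, 2, 2]
    [0, 3, 4, 2, 2]
    [1, 4, 4, 2, 2]
    [1, 2, 4, 2, 2]
    [1, 3, 5, 2, 2]
    [1, 3, 3, 2, 2]
    [1, 3, 4, 3, 2]
    [1, 3, 4, 1, 2]
    [1, 3, 4, 2, 3]
    [1, 3, 4, 2, 1]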
Here we assume every parameter is an integer. You could achieve more granularity by shrinking the step size (e.g., step = 0.05).
Iterations of hill climbing
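A minimal loop for this step, assuming get_neighbors from above and a cost function (defined in the next section) where lower is better:

    import random

    def hill_climbing(cost, n_params=5, lo=0, hi=5):
        """Start from a random solution; greedily move to the best neighbor."""
        current = [random.randint(lo, hi) for _ in range(n_params)]
        while True:
            neighbors = get_neighbors(current, lo=lo, hi=hi)
            best = min(neighbors, key=cost)
            if cost(best) >= cost(current):
                return current  # no neighbor improves: local minimum
            current = best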
Cost function
For completeness, I will use my own cost function (just for demo purposes).
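Any function with a known minimum works for a demo; assume a quadratic bowl whose minimum sits at [0, 5, 0, 5, 0]:

    cost(x) = x1^2 + (x2 - 5)^2 + x3^2 + (x4 - 5)^2 + x5^2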
That is, in Python:
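    def cost(solution):
        """Demo cost (assumed): a quadratic bowl, zero at [0, 5, 0, 5, 0]."""
        x1, x2, x3, x4, x5 = solution
        return x1**2 + (x2 - 5)**2 + x3**2 + (x4 - 5)**2 + x5**2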
Optimization result:
Here is the result. We use a random initial guess of [4, 0, 1, 3, 1]. After 14 steps (an evaluation of 14 * 10 = 140 neighbors), we find the optimal answer of [0, 5, 0, 5, 0], which minimizes the cost. With brute force you would have to evaluate 6^5 = 7776 solutions, and the savings grow quickly as the solution becomes higher-dimensional.
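A driver for the sketches above (output will vary with the random initial guess):

    if __name__ == "__main__":
        solution = hill_climbing(cost)
        print("found", solution, "with cost", cost(solution))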
Note that since this is a stochastic method, the final result is a local minimum (it may coincide with the global minimum, but that is not guaranteed). In practice, however, it is usually good enough.
Related post
A related but more complex question is here: Algorithm for selecting n vectors out of a set while minimizing cost
All of the code above can be found here.