I have a Sudoku solving algorithm that I want to make as fast as possible. To test it, I run the algorithm multiple times and calculate the average execution time. After noticing some strange numbers, I decided to print every individual time and got this result:
Execution Time : 4.257746 ms (#1)
Execution Time : 7.610686 ms (#2)
Execution Time : 6.277609 ms (#3)
Execution Time : 7.595707 ms (#4)
Execution Time : 7.610131 ms (#5)
Execution Time : 5.011104 ms (#6)
Execution Time : 3.970937 ms (#7)
Execution Time : 3.923783 ms (#8)
Execution Time : 4.070238 ms (#9)
Execution Time : 4.765347 ms (#10)
Execution Time : 0.818264 ms (#11)
Execution Time : 0.620216 ms (#12)
Execution Time : 0.679021 ms (#13)
Execution Time : 0.643516 ms (#14)
Execution Time : 0.718408 ms (#15)
Execution Time : 0.744481 ms (#16)
Execution Time : 0.760569 ms (#17)
Execution Time : 0.80384 ms (#18)
Execution Time : 0.75946 ms (#19)
Execution Time : 0.802176 ms (#20)
Execution Time : 66.032508 ms : average = 3.3016254000000003
After 10 to 15 executions (the exact point varies randomly), the performance of the algorithm improves drastically. If I run it several hundred times, it eventually settles at around 0.3 ms. Note that I run the algorithm once before this loop so the JIT can do its thing.
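For reference, here is a minimal sketch of the harness I'm describing; the solve() and reset() bodies are stand-ins for my actual code:

    public class SolverBenchmark {
        static void solve() { /* the actual Sudoku solving happens here */ }
        static void reset() { /* restore the grid to its initial state */ }

        public static void main(String[] args) {
            solve();   // one run up front so the JIT sees the code once
            reset();

            double totalMs = 0;
            final int RUNS = 20;
            for (int i = 1; i <= RUNS; i++) {
                long start = System.nanoTime();
                solve();
                double ms = (System.nanoTime() - start) / 1_000_000.0;
                totalMs += ms;
                System.out.println("Execution Time : " + ms + " ms (#" + i + ")");
                reset();   // every solve starts from the same state
            }
            System.out.println(totalMs + " ms : average = " + (totalMs / RUNS));
        }
    }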
Furthermore, if I make the thread sleep for 2 seconds before running my loop, all my times land around 1 ms (+/- 0.2).
Similarly, if I solve a trivial Sudoku (a grid with 1-9 placed diagonally) some 500 times before my loop, all my times land around 0.3 ms (+/- 0.02).
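In code, the two warm-up variants look roughly like this, placed just before the timed loop (solveTrivialGrid() is a hypothetical stand-in for solving the diagonal grid, and main would need to declare "throws InterruptedException" for the sleep):

    // Variant 1: pause before the timed loop; times then settle near 1 ms.
    Thread.sleep(2000);

    // Variant 2: solve a trivial grid (1-9 on the diagonal) 500 times first;
    // times then settle near 0.3 ms.
    for (int i = 0; i < 500; i++) {
        solveTrivialGrid();   // hypothetical helper for the diagonal grid
    }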
Every solve is identical; all values are reset between runs.
So my question is twofold:
- Why does the solve time improve over consecutive solves?
- Why is there a sudden drop in solve time after 10-15 solves?