This is my first time using OpenCV's haartraining.
Just for practice, I used 35 positive images and 45 negative images.
But when I try to train on this data, it never finishes,
even with the parameters set to extreme values
(min hit rate = 0.001, max false alarm rate = 0.999;
I wouldn't expect it to take long with such extreme values).
What could be wrong with my experiment?
Here is my command and parameters.
$opencv_haartraining -data Training -vec samples.vec -bg negatives.dat -nstages 2 -nsplits 2 -minhitrate 0.001 -maxfalsealarm 0.999 -npos 30 -nneg 40 -w 20 -h 20 -nonsym -mem 512 -mode ALL -minpos 10
And the result.
Data dir name: Training
Vec file name: samples.vec
BG file name: negatives.dat, is a vecfile: no
Num pos: 30
Num neg: 40
Num stages: 2
Num splits: 2 (tree as weak classifier)
Mem: 512 MB
Symmetric: FALSE
Min hit rate: 0.001000
Max false alarm rate: 0.999000
Weight trimming: 0.950000
Equal weights: FALSE
Mode: ALL
Width: 20
Height: 20
Applied boosting algorithm: GAB
Error (valid only for Discrete and Real AdaBoost): misclass
Max number of splits in tree cascade: 0
Min number of positive samples per cluster: 10
Required leaf false alarm rate: 0.998001
Stage 0 loaded
Stage 1 loaded
Stage 2 loaded
Stage 3 loaded
Stage 4 loaded
Tree Classifier
Stage
+---+---+---+---+---+
| 0| 1| 2| 3| 4|
+---+---+---+---+---+
0---1---2---3---4
Number of features used : 125199
Parent node: 4
*** 1 cluster ***
POS: 30 32 0.937500
This is normal. Based on this tutorial and my own experience, it's normal for training to take a couple of days, or even up to a week.
Am I correct in thinking that you are using OpenCV_Haartraining?
If so, it is a deprecated app and you should use opencv_traincascade instead.
See these links for further reading: Training vs TrainCascade and the TrainCascade wiki.
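As a rough sketch, the question's opencv_haartraining command could be translated to opencv_traincascade like this (same data paths, counts, and window size as the question; the 0.95/0.4 rates are placeholder sane values rather than the original extremes, and the buffer sizes mirror the old -mem 512):

```shell
opencv_traincascade -data Training -vec samples.vec -bg negatives.dat \
  -numPos 30 -numNeg 40 -numStages 2 \
  -w 20 -h 20 -featureType HAAR -mode ALL \
  -minHitRate 0.95 -maxFalseAlarmRate 0.4 \
  -precalcValBufSize 512 -precalcIdxBufSize 512
```

Note that opencv_traincascade needs a few more samples in samples.vec than -numPos, since each stage consumes some positives that earlier stages misclassify.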
EDIT:
Also, change your min hit rate and max false alarm rate; I would suggest something like 0.4 & 0.95 to get going.
The reason is that it will take forever, if ever, for the training to hit 0.999 & 0.0001.
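The per-stage targets compound multiplicatively across stages, which is where the "Required leaf false alarm rate: 0.998001" in the log comes from. A quick check of the arithmetic with the question's values:

```shell
# Cascade-level false alarm target = maxfalsealarm ^ nstages.
# With the question's -maxfalsealarm 0.999 and -nstages 2:
awk 'BEGIN { printf "required leaf false alarm: %.6f\n", 0.999^2 }'
# -> required leaf false alarm: 0.998001   (matches the log)

# Likewise, the overall hit rate target = minhitrate ^ nstages:
awk 'BEGIN { printf "overall hit rate target:   %.6f\n", 0.001^2 }'
# -> overall hit rate target:   0.000001
```

So sensible values keep the min hit rate close to 1 per stage (since the product shrinks with every stage) and the max false alarm well below 1 (so each stage actually has to reject something).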