I am building a binary classification model in H2O with Python. My 'y' values are 'ok' and 'bad', and I need the metrics to be computed with 'ok' as the negative class (0) and 'bad' as the positive class (1). However, I do not see any way to set this in H2O. For example, here is the output of the predictions and confusion matrix:
Confusion matrix:

             bad     ok   Error   Rate
    bad     3859    631  0.1405   (631.0/4490.0)
    ok       477   1069  0.3085   (477.0/1546.0)
    Total   4336   1700  0.1836   (1108.0/6036.0)
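For clarity, the Error column is just each row's misclassified count divided by that row's total (the fraction shown in the Rate column), which is easy to check in plain Python:

```python
# Each row: (correctly classified, misclassified) counts from the matrix above.
rows = {
    "bad": (3859, 631),
    "ok": (1069, 477),
}
for label, (correct, wrong) in rows.items():
    total = correct + wrong
    print(label, round(wrong / total, 4))
# -> bad 0.1405
# -> ok 0.3085

# Overall error: all misclassified rows over all rows.
print("Total", round((631 + 477) / (4490 + 1546), 4))
# -> Total 0.1836
```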
    >>> predictions.head(10)
       predict       bad        ok
    0      bad  0.100604  0.899396
    1      bad  0.100604  0.899396
    2      bad  0.112232  0.887768
    3       ok  0.068917  0.931083
    4       ok  0.089706  0.910294
    5       ok  0.089706  0.910294
    6       ok  0.089706  0.910294
    7      bad  0.126182  0.873818
    8      bad  0.126182  0.873818
    9       ok  0.092306  0.907694
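To make the convention I need fully concrete, here is a plain-Python sketch of the mapping I want the metrics computed under (the probabilities are copied from the prediction rows above; the 0.5 cutoff is purely illustrative, not what H2O actually uses for its predict column):

```python
# Desired convention: ok = negative class = 0, bad = positive class = 1.
LABEL_TO_INT = {"ok": 0, "bad": 1}

# p(bad) values copied from the first few prediction rows above.
p_bad = [0.100604, 0.100604, 0.112232, 0.068917]

# Under this convention, a row is predicted positive ("bad") when p(bad)
# exceeds the threshold; 0.5 here is just an illustrative cutoff.
THRESHOLD = 0.5
predicted = ["bad" if p > THRESHOLD else "ok" for p in p_bad]
encoded = [LABEL_TO_INT[label] for label in predicted]
print(predicted)  # ['ok', 'ok', 'ok', 'ok']
print(encoded)    # [0, 0, 0, 0]
```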
H2O seems to decide arbitrarily, apparently by alphabetical order of the labels. If I rename 'bad' to 'sad' (so the labels are 'ok' and 'sad'), here is what I get:
Confusion matrix:

             ok    sad   Error   Rate
    ok      798    732  0.4784   (732.0/1530.0)
    sad     211   4381  0.0459   (211.0/4592.0)
    Total  1009   5113  0.1540   (943.0/6122.0)
    >>> predictions.head(10)
       predict        ok       sad
    0      sad  0.215206  0.784794
    1      sad  0.211073  0.788927
    2      sad  0.211073  0.788927
    3       ok  0.236190  0.763810
    4       ok  0.241641  0.758359
    5       ok  0.241641  0.758359
    6       ok  0.236099  0.763901
    7      sad  0.162072  0.837928
    8      sad  0.162072  0.837928
    9      sad  0.206146  0.793854
Surely there must be a way to programmatically set which label is the positive class and which is the negative class? How can I do this?