Is there a way in Keras or TensorFlow to give samples an extra weight only when they are incorrectly classified? I.e., a combination of class weight and sample weight, but with the sample weight applied only to one of the outcomes in a binary classification problem?
Answer 1:
Yes, it's possible. Below is an example of how to add additional weight to true positives, false positives, true negatives, and false negatives:
from tensorflow.keras import backend as K

def reweight(y_true, y_pred, tp_weight=0.2, tn_weight=0.2, fp_weight=1.2, fn_weight=1.2):
    # Turn probabilities into hard class predictions
    y_pred_classes = K.greater_equal(y_pred, 0.5)
    y_pred_classes_float = K.cast(y_pred_classes, K.floatx())
    # Mask of misclassified examples
    wrongly_classified = K.not_equal(y_true, y_pred_classes_float)
    wrongly_classified_float = K.cast(wrongly_classified, K.floatx())
    # Mask of correctly classified examples
    correctly_classified = K.equal(y_true, y_pred_classes_float)
    correctly_classified_float = K.cast(correctly_classified, K.floatx())
    # Split the batch into tp, tn, fp, fn masks
    tp = correctly_classified_float * y_true
    tn = correctly_classified_float * (1 - y_true)
    fp = wrongly_classified_float * (1 - y_true)
    fn = wrongly_classified_float * y_true
    # Build a per-sample weight tensor and apply it to the standard loss
    weight_tensor = tp_weight * tp + fp_weight * fp + tn_weight * tn + fn_weight * fn
    loss = K.binary_crossentropy(y_true, y_pred)
    weighted_loss = loss * weight_tensor
    return weighted_loss
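With the default arguments, misclassified samples (false positives and false negatives, weight 1.2) contribute six times more to the loss than correctly classified ones (weight 0.2). To penalize only one outcome, e.g. only false negatives, leave the other three weights at 1.

A minimal sketch of wiring this up, assuming a single sigmoid output unit; the model architecture below is purely illustrative and not part of the original answer. Keras accepts any callable with the signature (y_true, y_pred) as a loss, so the function can be passed to model.compile directly:

from tensorflow import keras
from tensorflow.keras import layers

# Toy binary classifier; the layer sizes here are illustrative only
model = keras.Sequential([
    layers.Dense(16, activation="relu", input_shape=(10,)),
    layers.Dense(1, activation="sigmoid"),
])

# Pass the weighted loss like any other custom loss function;
# Keras averages the returned per-sample values over the batch
model.compile(optimizer="adam", loss=reweight, metrics=["accuracy"])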