I'm doing binary classification with Keras (TensorFlow backend) and I'm getting about 76% accuracy and 70% recall. Now I'd like to experiment with the decision threshold. As far as I know, Keras uses a decision threshold of 0.5. Is there a way in Keras to use a custom threshold to improve precision and recall?
Thanks for your time!
Best answer
Create custom metrics like this (edited by @Marcin: make a function that takes threshold_value as a parameter and returns the desired metric):
from keras import backend as K

def precision_threshold(threshold=0.5):
    def precision(y_true, y_pred):
        """Precision metric.

        Computes the precision over the whole batch using threshold_value.
        """
        threshold_value = threshold
        # Adaptation of the "round()" used before to get the predictions.
        # Clipping makes sure the raw predicted values are between 0 and 1.
        y_pred = K.cast(K.greater(K.clip(y_pred, 0, 1), threshold_value), K.floatx())
        # Compute the number of true positives. Rounding as a safeguard
        # so we end up with an integer count.
        true_positives = K.round(K.sum(K.clip(y_true * y_pred, 0, 1)))
        # Count the predicted positives.
        predicted_positives = K.sum(y_pred)
        # Get the precision ratio.
        precision_ratio = true_positives / (predicted_positives + K.epsilon())
        return precision_ratio
    return precision

def recall_threshold(threshold=0.5):
    def recall(y_true, y_pred):
        """Recall metric.

        Computes the recall over the whole batch using threshold_value.
        """
        threshold_value = threshold
        # Binarize the predictions at the threshold, clipping first so the
        # raw predicted values are between 0 and 1.
        y_pred = K.cast(K.greater(K.clip(y_pred, 0, 1), threshold_value), K.floatx())
        # Compute the number of true positives.
        true_positives = K.round(K.sum(K.clip(y_true * y_pred, 0, 1)))
        # Compute the number of positive targets.
        possible_positives = K.sum(K.clip(y_true, 0, 1))
        recall_ratio = true_positives / (possible_positives + K.epsilon())
        return recall_ratio
    return recall
Now you can use them in model.compile:

model.compile(..., metrics=[precision_threshold(0.1), precision_threshold(0.2), precision_threshold(0.8), recall_threshold(0.2), ...])
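To see how the threshold trades precision against recall before wiring the metrics into a model, here is a plain NumPy re-implementation of the same logic on a toy batch. The helper names `precision_at` and `recall_at` are mine, not from the answer:

```python
import numpy as np

def precision_at(y_true, y_prob, threshold):
    # Binarize the probabilities at the threshold, as the Keras metric does.
    y_pred = (np.clip(y_prob, 0, 1) > threshold).astype(float)
    true_positives = np.sum(y_true * y_pred)
    return true_positives / (np.sum(y_pred) + 1e-7)

def recall_at(y_true, y_prob, threshold):
    y_pred = (np.clip(y_prob, 0, 1) > threshold).astype(float)
    true_positives = np.sum(y_true * y_pred)
    return true_positives / (np.sum(y_true) + 1e-7)

y_true = np.array([1., 1., 1., 0., 0., 0.])
y_prob = np.array([0.9, 0.6, 0.3, 0.4, 0.2, 0.1])

# Lowering the threshold catches more positives (recall up, precision down);
# raising it makes predictions stricter (precision up, recall down).
print(recall_at(y_true, y_prob, 0.2))     # ≈ 1.0  (all 3 positives caught)
print(precision_at(y_true, y_prob, 0.2))  # ≈ 0.75 (3 of 4 predictions correct)
print(recall_at(y_true, y_prob, 0.5))     # ≈ 0.667
print(precision_at(y_true, y_prob, 0.5))  # ≈ 1.0
```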
Hope this helps :)
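One caveat worth adding: metrics like these only report precision and recall during training; they don't change how the model classifies. To actually make decisions at a custom threshold, apply it to the predicted probabilities yourself. A minimal sketch, with a hard-coded array standing in for the output of model.predict(x_test):

```python
import numpy as np

# Stand-in for the sigmoid outputs of model.predict(x_test).
probs = np.array([0.71, 0.48, 0.52, 0.30])

# The implicit default corresponds to thresholding at 0.5 ...
default_labels = (probs > 0.5).astype(int)   # [1, 0, 1, 0]
# ... but any threshold can be applied to the raw probabilities.
custom_labels = (probs > 0.3).astype(int)    # [1, 1, 1, 0]
```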
For python - Keras custom decision threshold for precision and recall, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42606207/