Employ Decision Values for Soft-Classifier Evaluation with Crispy References

L Zhu, T Ban, T Takahashi, D Inoue. Neural Information Processing: 25th International Conference, ICONIP 2018, … 13-16, 2018, Proceedings, Part IV. Springer, 2018.
Abstract
Evaluation of classification performance has been comprehensively studied for both crispy and fuzzy classification tasks. In this paper, we address the hybrid case: evaluating fuzzy prediction results against crispy references. The proposal is motivated by the following facts: (1) most datasets in practice are produced with crispy labels due to the excessive cost of fuzzy labelling; and (2) many state-of-the-art classifiers can yield fuzzy decision values even if they are trained from data with crispy labels. We derive our fuzzy-crispy evaluation criterion based on a widely adopted fuzzy-set-based evaluation method. By exploiting the distribution of decision values, the proposed criterion bears more comprehensive information than conventional crispy classification evaluation criteria. The advantages of the proposed criterion are demonstrated in artificial and real-world classification case studies.
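The abstract does not spell out the derived criterion, so the following is only a minimal sketch of the fuzzy-vs-crisp evaluation setting it describes: soft decision values (membership degrees) from a classifier are scored against crisp reference labels encoded as one-hot membership vectors, here aggregated with a generic fuzzy Jaccard similarity. The function names and the choice of similarity are illustrative assumptions, not the paper's criterion.

```python
import numpy as np

def crisp_to_membership(labels, n_classes):
    """Encode crisp class labels as one-hot (crisp) membership vectors."""
    m = np.zeros((len(labels), n_classes))
    m[np.arange(len(labels)), labels] = 1.0
    return m

def fuzzy_crisp_score(decision_values, crisp_labels):
    """Score soft predictions against crisp references.

    decision_values: (n_samples, n_classes) membership degrees in [0, 1],
                     e.g. normalized classifier decision values.
    crisp_labels:    (n_samples,) integer class labels.
    Returns the mean per-sample fuzzy Jaccard similarity (sum of element-wise
    minima over sum of element-wise maxima), one common fuzzy-set-based measure.
    """
    ref = crisp_to_membership(crisp_labels, decision_values.shape[1])
    inter = np.minimum(decision_values, ref).sum(axis=1)
    union = np.maximum(decision_values, ref).sum(axis=1)
    return float(np.mean(inter / union))

# Toy example: three samples, two classes.
decision_values = np.array([[0.9, 0.1],
                            [0.4, 0.6],
                            [0.2, 0.8]])
crisp_labels = np.array([0, 0, 1])
print(fuzzy_crisp_score(decision_values, crisp_labels))  # higher = closer agreement
```

Unlike a crisp accuracy computed after thresholding, a score of this kind changes with the confidence of each prediction, which is the sense in which decision-value-based evaluation can carry more information than conventional crisp criteria.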