The Misclassification Likelihood Matrix: Some Classes Are More Likely To Be Misclassified Than Others

Abstract
This study introduces the Misclassification Likelihood Matrix (MLM) as a novel tool for quantifying the reliability of neural network predictions under distribution shifts. The MLM is obtained by leveraging softmax outputs and clustering techniques to measure the distances between the predictions of a trained neural network and the class centroids. By analyzing these distances, the MLM provides a comprehensive view of the model's misclassification tendencies, enabling decision-makers to identify the most common and critical sources of error. The MLM allows model improvements to be prioritized and decision thresholds to be set according to acceptable risk levels. The approach is evaluated on the MNIST dataset using a Convolutional Neural Network (CNN) and a perturbed version of the dataset that simulates distribution shifts. The results demonstrate the effectiveness of the MLM in assessing the reliability of predictions and highlight its potential to enhance the interpretability and risk-mitigation capabilities of neural networks. The implications of this work extend beyond image classification, with ongoing applications in autonomous systems, such as self-driving cars, to improve the safety and reliability of decision-making in complex, real-world environments.
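To make the construction concrete, the following minimal sketch (Python with NumPy) computes per-class centroids in softmax space and the matrix of mean distances from each class's predictions to every centroid. The function names, the choice of Euclidean distance, and the plain averaging are illustrative assumptions for exposition, not the authors' reference implementation:

import numpy as np

def class_centroids(probs, labels, n_classes):
    # Mean softmax vector of the samples belonging to each true class.
    return np.stack([probs[labels == c].mean(axis=0) for c in range(n_classes)])

def misclassification_likelihood_matrix(probs, labels, n_classes):
    # Entry (i, j) is the mean Euclidean distance from class-i samples'
    # softmax outputs to the centroid of class j: small off-diagonal
    # values flag class pairs the model is prone to confuse.
    centroids = class_centroids(probs, labels, n_classes)
    mlm = np.zeros((n_classes, n_classes))
    for i in range(n_classes):
        d = np.linalg.norm(
            probs[labels == i][:, None, :] - centroids[None, :, :], axis=2
        )
        mlm[i] = d.mean(axis=0)
    return mlm

# Hypothetical usage: probs holds softmax outputs of shape (N, 10) for
# MNIST test images, y_test the true labels.
# mlm = misclassification_likelihood_matrix(probs, y_test, 10)

Under this sketch, thresholding such distances to accept or reject predictions is one way to realize the decision thresholds the abstract describes; the exact mapping from distances to a misclassification likelihood is a detail of the paper.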
CCS Concepts: Computing methodologies → Machine learning; Computer vision

Citation

@inproceedings{10.2312:cgvc.20241239,
  booktitle = {Computer Graphics and Visual Computing (CGVC)},
  editor    = {Hunter, David and Slingsby, Aidan},
  title     = {{The Misclassification Likelihood Matrix: Some Classes Are More Likely To Be Misclassified Than Others}},
  author    = {Sikar, Daniel and Garcez, Artur d'Avila and Bloomfield, Robin and Weyde, Tillman and Peeroo, Kaleem and Singh, Naman and Hutchinson, Maeve and Laksono, Dany and Reljan-Delaney, Mirela},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-249-3},
  DOI       = {10.2312/cgvc.20241239}
}