
Noofa Hammad

  • BSc (Effat University, 2020)

Notice of the Final Oral Examination for the Degree of Master of Applied Science

Topic

An Evidential Deep Learning Classifier with an Integrated Capability for Uncertainty Quantification

Department of Electrical and Computer Engineering

Date & location

  • Tuesday, April 9, 2024

  • 10:00 A.M.

  • Virtual Defence

Reviewers

Supervisory Committee

  • Dr. Homayoun Najjaran, Department of Electrical and Computer Engineering, University of Victoria (Supervisor)

  • Dr. David Capron, Department of Electrical and Computer Engineering, UVic (Member) 

External Examiner

  • Dr. Keivan Ahmadi, Department of Mechanical Engineering, University of Washington

Chair of Oral Examination

  • Dr. Bob Kowalewski, Department of Physics and Astronomy, UVic


Abstract


While deep neural networks (DNNs) have demonstrated remarkable proficiency in diverse tasks across many domains, the reliability of their predictions remains a subject of ongoing research. In classification problems, the probabilities produced by DNNs are commonly mistaken for the models' confidence in their assigned classes. The softmax layer at the end of the network forces the activations into probability values between 0 and 1, irrespective of the magnitudes of the underlying activations. When the activations are insufficient for accurate decision-making, raising uncertainty about the correct classification, it is preferable for a model to quantify its uncertainty about the true class of the input rather than make an uncertain decision.

In this light, this study proposes a distance-based evidential deep learning (d-EDL) classifier with an integrated capability for uncertainty quantification (UQ). The d-EDL classifier comprises two key components: the first uses convolutional neural network (CNN) layers for feature extraction, while the second consists of purpose-designed layers for decision-making. In the second component, the first layer calculates basic probability assignments (BPAs) from the extracted feature vectors using a distance metric that measures the proximity between an input pattern and selected data representatives. A clustering algorithm forms the representatives for each data label; closeness to a label's representative reflects the potential membership of the input in that label. The second and third layers apply combination rules to merge the BPAs, leveraging probability theory and Dempster-Shafer (D-S) theory. The output of the d-EDL network is a probability distribution extended to include uncertainty as an additional class.
An end-to-end training method is provided to train the proposed classifier, enabling joint learning and updating of all network parameters. Five variants of the d-EDL classifier, each with a different number of data representatives, are trained on an image dataset, and their uncertainty quantification ability is assessed in three scenarios, each involving a common factor that leads to misclassification: noise, image rotation, and out-of-distribution (OOD) data. The results demonstrate the strong ability of the d-EDL models, especially those with 20 and 40 data representatives, to quantify uncertainty rather than misclassify when faced with unfamiliar data.
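The abstract describes two steps that can be sketched in isolation: turning distances to class representatives into basic probability assignments, and fusing those assignments with Dempster's rule. The sketch below is an illustration only, not the thesis implementation: the Gaussian kernel, the decay parameter `gamma`, a single representative per class, and the `"Theta"` label for the full frame of discernment (the mass reported as uncertainty) are all assumptions made for this example.

```python
import numpy as np

def bpa_from_distances(x, representatives, gamma=1.0):
    """Build one simple BPA per class representative.

    Each representative contributes evidence for its own class with mass
    exp(-gamma * d^2), where d is the distance from input x; the remaining
    mass is assigned to "Theta", the full frame (ignorance / uncertainty).
    """
    masses = []
    for label, rep in representatives.items():
        d = np.linalg.norm(x - rep)
        support = np.exp(-gamma * d ** 2)          # in (0, 1]; large when x is near rep
        masses.append({label: support, "Theta": 1.0 - support})
    return masses

def dempster_combine(m1, m2):
    """Dempster's rule of combination for simple BPAs over singletons plus Theta."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            if a == "Theta":
                key = b                            # Theta intersected with b is b
            elif b == "Theta":
                key = a
            elif a == b:
                key = a
            else:
                conflict += ma * mb                # disjoint singletons: conflicting mass
                continue
            combined[key] = combined.get(key, 0.0) + ma * mb
    norm = 1.0 - conflict                          # renormalize away the conflict
    return {k: v / norm for k, v in combined.items()}
```

A quick usage example: with representatives for two hypothetical classes, an input near the "cat" representative yields a fused BPA whose mass concentrates on "cat" while a small residual mass remains on "Theta" — the per-input uncertainty the abstract refers to.

```python
reps = {"cat": np.array([0.0, 0.0]), "dog": np.array([3.0, 3.0])}
m_cat, m_dog = bpa_from_distances(np.array([0.1, 0.1]), reps)
fused = dempster_combine(m_cat, m_dog)   # masses over {"cat", "dog", "Theta"}, summing to 1
```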