Kernel Density Graphs

Team: KDG

Project Description:

Classical machine learning (ML) algorithms yield overconfident predictions on out-of-distribution (OOD) data. Neural networks and random forests partition the feature space into polytopes that extend infinitely beyond the training data and learn affine transforms over them, which leads to overconfident predictions in OOD regions. We rectify this by learning Gaussian kernels over the polytopes learned by these algorithms, and we then estimate confidence from the resulting class-conditional posterior estimates.

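To make this concrete, the following is a minimal Python sketch of the idea, not the team's actual implementation. It assumes scikit-learn and SciPy, treats the leaves of a random forest as the polytopes, fits a Gaussian kernel to the training points inside each leaf, and forms class-conditional posteriors from those kernels; the predict_proba helper and all parameter choices below are illustrative assumptions.

import numpy as np
from scipy.stats import multivariate_normal
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy two-class, two-feature data set.
X, y = make_classification(n_samples=500, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# The forest's leaves stand in for the polytopes; a minimum leaf size keeps
# each polytope populated enough for a covariance estimate.
forest = RandomForestClassifier(n_estimators=10, min_samples_leaf=10,
                                random_state=0).fit(X, y)

# Fit one Gaussian kernel (mean, covariance) per leaf, per tree, and record
# how many training points of each class fall inside it.
kernels = []
leaf_ids = forest.apply(X)                       # shape (n_samples, n_trees)
for t in range(leaf_ids.shape[1]):
    for leaf in np.unique(leaf_ids[:, t]):
        inside = leaf_ids[:, t] == leaf
        if inside.sum() < 2:                     # skip degenerate leaves
            continue
        mean = X[inside].mean(axis=0)
        cov = np.cov(X[inside], rowvar=False) + 1e-3 * np.eye(X.shape[1])
        counts = np.bincount(y[inside], minlength=2)
        kernels.append((mean, cov, counts))

def predict_proba(Z):
    """Class-conditional posteriors built from the per-polytope Gaussian kernels."""
    likelihood = np.zeros((len(Z), 2))
    for mean, cov, counts in kernels:
        density = multivariate_normal(mean, cov).pdf(Z)
        likelihood += np.outer(density, counts / counts.sum())
    total = likelihood.sum(axis=1, keepdims=True)
    # Far from the training data every kernel density vanishes, so the
    # posterior falls back to a uniform, maximally unconfident prediction.
    return np.where(total > 1e-300, likelihood / np.maximum(total, 1e-300), 0.5)

print(predict_proba(X[:3]))                        # confident in distribution
print(predict_proba(np.array([[100.0, 100.0]])))   # roughly [0.5, 0.5] far OOD

The small ridge added to each covariance simply keeps the per-leaf Gaussians well conditioned in this toy example; the point of the sketch is only that confidence now decays with distance from the training data instead of staying high.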

Student Team Members

Course Faculty

Project Mentors, Sponsors, and Partners