Progressively Balanced Multi-class Neural Trees

Mentor: Dr. Prithwijit Guha (Dept. of EEE, IIT Guwahati)

Collaborator: Spoorthy Bhat

This project was my Bachelor’s thesis at IIT Guwahati.

The basic idea is to allow each node to learn oblique partitions of the input space, as opposed to the axis-aligned partitions learned by a standard C4.5 tree. This required the development of a differentiable splitting criterion.
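To make the contrast concrete, here is a minimal sketch (not the thesis implementation) of an axis-aligned split versus an oblique split. Routing through a sigmoid of a learned hyperplane is one common way to make the split differentiable; the function names, weights, and temperature parameter below are illustrative assumptions.

```python
import numpy as np

def axis_aligned_split(x, feature, threshold):
    """Hard C4.5-style split: route on a single feature vs. a threshold."""
    return x[feature] <= threshold  # True -> left child, False -> right child

def oblique_split(x, w, b, temperature=1.0):
    """Soft oblique split: a learned hyperplane w.x + b routes the sample.

    The sigmoid makes the routing probability differentiable in (w, b),
    so the split can be trained by gradient descent.
    """
    logit = np.dot(w, x) + b
    p_left = 1.0 / (1.0 + np.exp(-logit / temperature))
    return p_left  # probability of routing to the left child

x = np.array([0.4, -1.2, 0.7])
w = np.array([1.0, 0.5, -2.0])  # hypothetical learned weights
p = oblique_split(x, w, b=0.1)  # a value in (0, 1)
```

An axis-aligned split is the special case where `w` has a single nonzero entry, which is why oblique nodes can separate classes that a C4.5 tree would need many stacked thresholds to approximate.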

During the course of the project we realized that this hierarchical partitioning results in data thinning: as we go down the tree, the number of samples reaching a node keeps decreasing. This biases the trained nodes when class distributions become imbalanced. Consequently, we resorted to data balancing, and tested different methods of artificially balancing the class distribution in the data at each node.
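As one concrete example of such a balancing scheme, random oversampling equalizes class counts at a node by resampling minority classes with replacement. The thesis compared several methods, so treat this sketch as an illustrative stand-in rather than the method actually used:

```python
import numpy as np

def oversample_to_balance(X, y, rng=None):
    """Randomly oversample minority classes so that every class at this
    node has as many samples as the largest class."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c in classes:
        members = np.flatnonzero(y == c)
        # sample with replacement up to the majority-class count
        idx.append(rng.choice(members, size=target, replace=True))
    idx = np.concatenate(idx)
    return X[idx], y[idx]

X = np.arange(12).reshape(6, 2)
y = np.array([0, 0, 0, 0, 1, 1])  # imbalanced data reaching a node
Xb, yb = oversample_to_balance(X, y, rng=0)
# both classes now contribute 4 samples each
```

Applied recursively at each node, this counteracts the bias that data thinning would otherwise introduce deeper in the tree.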

Our method achieved accuracy comparable to existing classifiers on 10 benchmark datasets. What is the gain? The learned trees require fewer computations at test time (averaged over the test set) than a multi-layer perceptron (MLP) of comparable accuracy.

We designed the framework for training the tree structure and node parameters, and for testing the classifier, using the TensorFlow package.

Ameya Godbole
Research Fellow in Computer Science

My research interests include artificial intelligence broadly, with a focus on question answering.
