Defense Date
3-30-2015
Graduation Date
Spring 2015
Availability
Immediate Access
Submission Type
thesis
Degree Name
MS
Department
Computational Mathematics
School
McAnulty College and Graduate School of Liberal Arts
Committee Chair
Karl Wimmer
Committee Member
Jeffrey Jackson
Keywords
Boolean function, DNF expression, graph, ID3, monotone, variable influence
Abstract
Since the Probably Approximately Correct learning model was introduced in 1984, much effort has gone into designing computationally efficient algorithms for learning Boolean functions from random examples drawn from a uniform distribution. In this thesis, I apply the ID3 information-gain-first classification algorithm to the task of learning monotone Boolean functions from examples distributed uniformly over {0,1}^n. I limit my scope to the class of monotone Boolean functions that can be represented as read-2, width-2 disjunctive normal form (DNF) expressions. I model these functions as graphs and examine each type of connected component contained in these models, i.e., path graphs and cycle graphs. I determine the influence of the variables in these graph components in order to understand how ID3 behaves when learning these functions. My findings show that ID3 produces an optimal decision tree for this class of Boolean functions.
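The "information-gain-first" behavior mentioned in the abstract refers to ID3's standard splitting rule: at each node, split on the variable whose test most reduces the entropy of the labels. A minimal sketch of that gain computation follows; the function names and the example function are illustrative, not code from the thesis.

```python
import math

def entropy(labels):
    # Shannon entropy (in bits) of a list of 0/1 labels.
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def information_gain(examples, labels, i):
    # Gain from splitting on variable x_i: parent entropy minus the
    # example-weighted entropy of the two resulting child nodes.
    left = [y for x, y in zip(examples, labels) if x[i] == 0]
    right = [y for x, y in zip(examples, labels) if x[i] == 1]
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def best_split(examples, labels):
    # ID3 selects the variable with maximum information gain.
    n_vars = len(examples[0])
    return max(range(n_vars), key=lambda i: information_gain(examples, labels, i))
```

For instance, on all eight uniform examples of the monotone function f(x) = x0 OR (x1 AND x2), the gain of x0 exceeds that of x1 or x2, so ID3 splits on x0 first; the thesis's analysis relates such gain comparisons to variable influence in the path- and cycle-graph components of read-2, width-2 DNFs.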
Language
English
Recommended Citation
Thompson, P. (2015). Efficiently Learning Monotone Decision Trees with ID3 (Master's thesis, Duquesne University). Retrieved from https://dsc.duq.edu/etd/1280