TY - JOUR
T1 - Generalized Dirichlet priors for Naïve Bayesian classifiers with multinomial models in document classification
AU - Wong, Tzu Tsung
N1 - Funding Information: This research was supported by the National Science Council in Taiwan under Grant No. 98-2410-H-006-008.
PY - 2014/1
Y1 - 2014/1
N2 - The generalized Dirichlet distribution has been shown to be a more appropriate prior than the Dirichlet distribution for naïve Bayesian classifiers. When the dimension of a generalized Dirichlet random vector is large, the computational effort for calculating the expected value of a random variable can be high. In document classification, the number of distinct words, which determines the dimension of a prior for naïve Bayesian classifiers, is generally more than ten thousand. Generalized Dirichlet priors may therefore be impractical for document classification from the viewpoint of computational efficiency. In this paper, some properties of the generalized Dirichlet distribution are established to accelerate the calculation of the expected values of random variables. These properties are then used to construct noninformative generalized Dirichlet priors for naïve Bayesian classifiers with multinomial models. Our experimental results on two document sets show that generalized Dirichlet priors can achieve significantly higher prediction accuracy while preserving the computational efficiency of naïve Bayesian classifiers.
AB - The generalized Dirichlet distribution has been shown to be a more appropriate prior than the Dirichlet distribution for naïve Bayesian classifiers. When the dimension of a generalized Dirichlet random vector is large, the computational effort for calculating the expected value of a random variable can be high. In document classification, the number of distinct words, which determines the dimension of a prior for naïve Bayesian classifiers, is generally more than ten thousand. Generalized Dirichlet priors may therefore be impractical for document classification from the viewpoint of computational efficiency. In this paper, some properties of the generalized Dirichlet distribution are established to accelerate the calculation of the expected values of random variables. These properties are then used to construct noninformative generalized Dirichlet priors for naïve Bayesian classifiers with multinomial models. Our experimental results on two document sets show that generalized Dirichlet priors can achieve significantly higher prediction accuracy while preserving the computational efficiency of naïve Bayesian classifiers.
UR - http://www.scopus.com/inward/record.url?scp=84891881106&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84891881106&partnerID=8YFLogxK
U2 - 10.1007/s10618-012-0296-4
DO - 10.1007/s10618-012-0296-4
M3 - Article
AN - SCOPUS:84891881106
SN - 1384-5810
VL - 28
SP - 123
EP - 144
JO - Data Mining and Knowledge Discovery
JF - Data Mining and Knowledge Discovery
IS - 1
ER -