Alternative prior assumptions for improving the performance of naïve Bayesian classifiers

Research output: Article, peer-reviewed

34 citations (Scopus)

Abstract

The prior distribution of an attribute in a naïve Bayesian classifier is typically assumed to be a Dirichlet distribution, and this is called the Dirichlet assumption. The variables in a Dirichlet random vector can never be positively correlated and must have the same confidence level as measured by normalized variance. Both the generalized Dirichlet and the Liouville distributions include the Dirichlet distribution as a special case. These two multivariate distributions, also defined on the unit simplex, are employed to investigate the impact of the Dirichlet assumption in naïve Bayesian classifiers. We propose methods to construct appropriate generalized Dirichlet and Liouville priors for naïve Bayesian classifiers. Our experimental results on 18 data sets reveal that the generalized Dirichlet distribution has the best performance among the three distribution families. Not only is the Dirichlet assumption inappropriate, but also forcing the variables in a prior to be all positively correlated can deteriorate the performance of the naïve Bayesian classifier.
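To make the Dirichlet assumption concrete, here is a minimal sketch (not the paper's proposed generalized Dirichlet or Liouville method) of a categorical naïve Bayesian classifier whose class-conditional attribute distributions carry a symmetric Dirichlet prior; the familiar Laplace smoothing corresponds to the special case alpha = 1. The data set, function names, and the `alpha` parameter are illustrative assumptions, not taken from the paper.

```python
import math
from collections import Counter, defaultdict

def train_nb(X, y, alpha=1.0):
    """Fit a categorical naive Bayes classifier.

    Each attribute's class-conditional distribution gets a symmetric
    Dirichlet(alpha, ..., alpha) prior; predictions use the posterior
    mean of each multinomial parameter (alpha=1 is Laplace smoothing).
    """
    classes = Counter(y)                       # class -> count
    n_attrs = len(X[0])
    # Observed value set per attribute (defines each Dirichlet's dimension).
    values = [set(row[j] for row in X) for j in range(n_attrs)]
    counts = defaultdict(Counter)              # (class, attr) -> value counts
    for row, c in zip(X, y):
        for j, v in enumerate(row):
            counts[(c, j)][v] += 1

    def predict(x):
        best, best_lp = None, -math.inf
        total = sum(classes.values())
        for c, nc in classes.items():
            lp = math.log(nc / total)          # class prior
            for j, v in enumerate(x):
                k = len(values[j])
                # Posterior mean under the symmetric Dirichlet prior:
                lp += math.log((counts[(c, j)][v] + alpha) / (nc + alpha * k))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

    return predict

# Tiny toy data set: the attribute patterns separate the two classes.
X = [("a", "x"), ("a", "x"), ("b", "y"), ("b", "y")]
y = ["pos", "pos", "neg", "neg"]
clf = train_nb(X, y, alpha=1.0)
```

The abstract's point is that a single symmetric Dirichlet like this forces every attribute value to share the same prior confidence and forbids positive correlation among the parameters; the paper replaces it with generalized Dirichlet and Liouville priors that relax exactly those constraints.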

Original language: English
Pages (from–to): 183-213
Number of pages: 31
Journal: Data Mining and Knowledge Discovery
Volume: 18
Issue number: 2
DOIs
Publication status: Published - April 2009

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Computer Networks and Communications
