Kernel mixture model for probability density estimation in Bayesian classifiers

Wenyu Zhang, Zhenjiang Zhang, Han Chieh Chao, Fan Hsun Tseng

Research output: Contribution to journal › Article › peer-review

29 Citations (Scopus)

Abstract

Estimating reliable class-conditional probabilities is a prerequisite for implementing Bayesian classifiers, and estimating probability density functions (PDFs) is likewise a fundamental problem for other probabilistic induction algorithms. The finite mixture model (FMM) can represent arbitrarily complex PDFs as a mixture of multimodal distributions, but it assumes that the mixture components follow a given distribution, which may not hold for real-world data. This paper presents a non-parametric kernel mixture model (KMM) based probability density estimation approach, in which the data sample of a class is assumed to be drawn from several unknown, independent hidden subclasses. Unlike traditional FMM schemes, we simply use the k-means clustering algorithm to partition the data sample into several independent components, and the regional density diversities of the components are combined using Bayes' theorem. On the basis of the proposed kernel mixture model, we present a three-step Bayesian classifier comprising partitioning, structure learning, and PDF estimation. Experimental results show that KMM improves the quality of the PDFs estimated by the conventional kernel density estimation (KDE) method, and that KMM-based Bayesian classifiers outperform existing Gaussian, GMM, and KDE-based Bayesian classifiers.
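The abstract's core idea (partition a class sample with k-means, fit a kernel density estimate per component, then combine the component densities via the law of total probability) can be illustrated with a minimal sketch. This is not the authors' implementation; the function name `kmm_density`, the choice of Gaussian kernels via `scipy.stats.gaussian_kde`, and the 1-D setting are all assumptions made for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.cluster import KMeans

def kmm_density(sample, k=3, random_state=0):
    """Illustrative kernel-mixture density (name and details assumed):
    k-means partitioning followed by per-component KDE, combined as
    p(x) = sum_c P(c) * p(x | c)."""
    # Step 1: partition the sample into k components with k-means.
    labels = KMeans(n_clusters=k, n_init=10, random_state=random_state) \
        .fit_predict(sample.reshape(-1, 1))

    # Step 2: estimate a kernel density and a mixing weight per component.
    components = []
    for c in range(k):
        part = sample[labels == c]
        weight = len(part) / len(sample)   # P(component c)
        components.append((weight, gaussian_kde(part)))  # p(x | c)

    # Step 3: combine regional densities via the law of total probability.
    def pdf(x):
        return sum(w * kde(x) for w, kde in components)
    return pdf

# Usage: a bimodal sample that a single-bandwidth KDE would oversmooth.
rng = np.random.default_rng(0)
sample = np.concatenate([rng.normal(-3, 0.5, 300), rng.normal(2, 1.0, 300)])
pdf = kmm_density(sample, k=2)
x = np.linspace(-6, 6, 200)
density = pdf(x)
```

In a KMM-based Bayesian classifier, one such density would be fit per class, and a test point would be assigned to the class maximizing the class prior times this estimated class-conditional density.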

Original language: English
Pages (from-to): 675-707
Number of pages: 33
Journal: Data Mining and Knowledge Discovery
Volume: 32
Issue number: 3
DOIs
Publication status: Published - 2018 May 1

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Computer Networks and Communications
