Estimation of selected parameters

Jia Chiun Pan, Yufen Huang, J. T. Gene Hwang

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Modern statistical problems often involve selecting populations (or genes, for example) based on the observations. After the populations are selected, it is important to estimate the corresponding parameters, called the selected parameters. Traditional estimators, such as the maximum likelihood (ML) estimator, ignore the selection and can therefore incur a large bias. It is known, however, that the Bayes estimator that ignores the selection still works well under the assumed prior distribution. But when the prior distribution used to derive the Bayes estimator is very different from the “true” prior, the Bayes estimator can fail. This paper aims to construct estimators of the selected parameters that are robust to the prior distribution. A generalization of the multiple-shrinkage Stein-type estimator of George (1986a, 1986b) is proposed and is shown to have a small selection bias for estimating the selected means and an attractively small expected mean squared error. With respect to these two criteria, the proposed estimator is generally better than the ML estimator, the Lindley–James–Stein (LJS) estimator, and the Efron–Tweedie estimator (Efron, 2011).
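
The abstract contrasts the ML estimator, which ignores selection, with shrinkage-type estimators. The Python simulation below is a minimal illustrative sketch, not the paper's proposed multiple-shrinkage estimator: it assumes a normal means model with unit variance, selects the population with the largest observation, and compares the selection bias and mean squared error of the ML estimate with a positive-part James–Stein estimator shrinking toward the grand mean. The number of populations, the distribution of the true means, and the selection rule are all assumptions made for illustration.

    # Illustrative sketch (assumptions only; not the estimators studied in the paper):
    # selection bias of the ML estimate for a selected mean vs. a positive-part
    # James-Stein estimator shrinking toward the grand mean.
    import numpy as np

    rng = np.random.default_rng(0)
    p = 50                                        # number of populations (assumed)
    true_means = rng.normal(0.0, 1.0, size=p)     # "true" parameters theta_i (assumed prior)
    n_rep = 5_000

    ml_errors, js_errors = [], []
    for _ in range(n_rep):
        x = true_means + rng.normal(0.0, 1.0, size=p)   # X_i ~ N(theta_i, 1)

        # Positive-part James-Stein estimator shrinking toward the grand mean.
        xbar = x.mean()
        s2 = np.sum((x - xbar) ** 2)
        shrink = max(0.0, 1.0 - (p - 3) / s2)
        js = xbar + shrink * (x - xbar)

        # Select the population with the largest observation, then estimate
        # its (selected) mean with each estimator.
        i = int(np.argmax(x))
        ml_errors.append(x[i] - true_means[i])    # ML estimate is just X_i
        js_errors.append(js[i] - true_means[i])

    print(f"ML bias ~ {np.mean(ml_errors):+.3f},  MSE ~ {np.mean(np.square(ml_errors)):.3f}")
    print(f"JS bias ~ {np.mean(js_errors):+.3f},  MSE ~ {np.mean(np.square(js_errors)):.3f}")

In this setup the ML estimate of the selected mean is typically biased upward, while the shrinkage estimator reduces both the selection bias and the mean squared error; this is the kind of behaviour the paper quantifies for its proposed estimator.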

Original language: English
Pages (from-to): 45-63
Number of pages: 19
Journal: Computational Statistics and Data Analysis
Volume: 109
DOIs
Publication status: Published - 2017 May 1

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
