TY - JOUR
T1 - Estimation of selected parameters
AU - Pan, Jia Chiun
AU - Huang, Yufen
AU - Hwang, J. T. Gene
N1 - Funding Information:
We would like to thank the referees for their valuable and insightful suggestions that aided our revision. The research work of Pan and Hwang was supported by the National Science Council, Taiwan, Grant No. NSC 100-2118-M-194-004-MY3. Hwang's research was also partially supported by a research grant from the Foundation for the Advancement of Outstanding Scholarship, Taiwan. Huang's research was supported by the National Science Council, Taiwan, Grant No. NSC 102-2118-M-194-003-MY2. The authors thank Ms. Rebecca Brody for her careful editing of the numerous earlier versions, which led to the current improved version.
Publisher Copyright:
© 2016 Elsevier B.V.
PY - 2017/5/1
Y1 - 2017/5/1
N2 - Modern statistical problems often involve the selection of populations (or genes, for example) based on the observations. After the populations are selected, it is important to estimate the corresponding parameters, called the selected parameters. Traditional estimators, such as the maximum likelihood (ML) estimator, ignore the selection and can therefore incur a large bias. It is known, however, that the Bayes estimator that ignores the selection still works well under the assumed prior distribution; but when the prior used to derive the Bayes estimator is very different from the “true” prior, the Bayes estimator can fail. This paper aims to construct estimators of the selected parameters that are robust to the prior distribution. A generalization of the multiple-shrinkage Stein-type estimator of George (1986a, 1986b) is proposed and shown to have a small selection bias when estimating the selected means, together with an attractively small expected mean squared error. With respect to these two criteria, the proposed estimator is generally better than the ML estimator, the Lindley–James–Stein (LJS) estimator, and the Efron–Tweedie (Efron, 2011) estimator.
AB - Modern statistical problems often involve the selection of populations (or genes, for example) based on the observations. After the populations are selected, it is important to estimate the corresponding parameters, called the selected parameters. Traditional estimators, such as the maximum likelihood (ML) estimator, ignore the selection and can therefore incur a large bias. It is known, however, that the Bayes estimator that ignores the selection still works well under the assumed prior distribution; but when the prior used to derive the Bayes estimator is very different from the “true” prior, the Bayes estimator can fail. This paper aims to construct estimators of the selected parameters that are robust to the prior distribution. A generalization of the multiple-shrinkage Stein-type estimator of George (1986a, 1986b) is proposed and shown to have a small selection bias when estimating the selected means, together with an attractively small expected mean squared error. With respect to these two criteria, the proposed estimator is generally better than the ML estimator, the Lindley–James–Stein (LJS) estimator, and the Efron–Tweedie (Efron, 2011) estimator.
UR - http://www.scopus.com/inward/record.url?scp=85007107078&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85007107078&partnerID=8YFLogxK
U2 - 10.1016/j.csda.2016.11.001
DO - 10.1016/j.csda.2016.11.001
M3 - Article
AN - SCOPUS:85007107078
SN - 0167-9473
VL - 109
SP - 45
EP - 63
JO - Computational Statistics and Data Analysis
JF - Computational Statistics and Data Analysis
ER -
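
Editor's note (illustration, not part of the bibliographic record): the abstract describes how the ML estimate of a selected mean (e.g., the mean of the population with the largest observation) is biased and how shrinkage estimators mitigate this. The following minimal Monte Carlo sketch illustrates that phenomenon under assumed conditions (K normal populations with unit variance, one observation each, means drawn from a standard normal prior). The James-Stein-type shrinkage toward the grand mean used here is only a toy stand-in for the multiple-shrinkage estimator studied in the paper; all parameter values are illustrative.

    # Illustrative sketch: selection bias of the naive ML estimate for the
    # selected (largest-observation) mean vs. a James-Stein-type shrinkage
    # estimate. Assumptions: unit variance, one observation per population.
    import numpy as np

    rng = np.random.default_rng(0)
    K = 50                               # number of populations (illustrative)
    mu = rng.normal(0.0, 1.0, size=K)    # "true" means from a N(0,1) prior
    n_rep = 20_000

    bias_ml, bias_js = [], []
    for _ in range(n_rep):
        x = rng.normal(mu, 1.0)          # one N(mu_i, 1) observation per population
        i = np.argmax(x)                 # select the population with the largest observation
        ml_est = x[i]                    # naive ML estimate: the selected observation itself
        # Positive-part James-Stein shrinkage toward the grand mean
        # (a toy stand-in for the paper's multiple-shrinkage estimator).
        xbar = x.mean()
        s2 = np.sum((x - xbar) ** 2)
        shrink = max(0.0, 1.0 - (K - 3) / s2)
        js_est = xbar + shrink * (x[i] - xbar)
        bias_ml.append(ml_est - mu[i])
        bias_js.append(js_est - mu[i])

    print(f"selection bias, ML estimator : {np.mean(bias_ml):+.3f}")
    print(f"selection bias, JS shrinkage : {np.mean(bias_js):+.3f}")

Running the sketch shows a clearly positive average bias for the ML estimate of the selected mean and a noticeably smaller one for the shrinkage estimate, which is the qualitative behavior the abstract refers to.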