Statistics and Data Science Seminar
Prof. Malgorzata Bogdan
Wroclaw University, Poland
Asymptotic Bayes optimality under sparsity of multiple testing and model selection rules
Abstract: Bogdan et al. (Ann. Statist., 2011) proposed an asymptotic framework for analyzing the Bayes risk of multiple testing procedures under sparsity. Within this framework, a rule is called Asymptotically Bayes Optimal under Sparsity (ABOS) if the ratio of its risk to the risk of the Bayes oracle converges to 1 as the number of tests $m$ diverges to infinity and the proportion $p$ of alternatives among all tests converges to zero. Bogdan et al. (2011) and Neuvial and Roquain (Ann. Statist., to appear) provide conditions under which the popular Benjamini-Hochberg and Bonferroni procedures are ABOS. We will discuss these results and extend them to the situation where the sample size $n$ used to compute each test statistic goes to infinity together with the number of tests $m$. We show that, under mild restrictions on the loss function and on the distribution of the magnitudes of true signals, nontrivial asymptotic inference is possible only if $n$ increases to infinity at least at the rate of $\log m$. Under this assumption, precise conditions are given under which the Bonferroni correction at nominal Family-Wise Error Rate (FWER) level $\alpha$ and the Benjamini-Hochberg (BH) procedure at FDR level $\alpha$ are asymptotically optimal.
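For readers unfamiliar with the two procedures compared in the abstract, here is a minimal sketch of the Benjamini-Hochberg step-up rule at FDR level $\alpha$ (the Bonferroni correction simply rejects when a p-value is below $\alpha/m$). The function name and interface are illustrative, not from the talk or the cited papers.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure at FDR level alpha.

    Returns a boolean array marking which hypotheses are rejected.
    """
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)                     # indices sorting p-values ascending
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds            # step-up comparison p_(k) <= alpha*k/m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])      # largest k with p_(k) below its threshold
        reject[order[: k + 1]] = True         # reject the k smallest p-values
    return reject

# Example: with p-values (0.01, 0.02, 0.9) and alpha = 0.05, BH rejects the
# first two hypotheses, while Bonferroni (threshold 0.05/3) rejects only the first.
print(benjamini_hochberg([0.01, 0.02, 0.9], alpha=0.05))
```

Because BH compares the $k$-th smallest p-value against the growing threshold $\alpha k/m$ rather than the fixed Bonferroni cutoff $\alpha/m$, it is less conservative, which is relevant to the different sparsity regimes in which each procedure is ABOS.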
In the second part of the talk, these optimality results are carried over to model selection in the context of multiple regression with orthogonal regressors. Several modifications of the Bayesian Information Criterion are considered, controlling either FWER or FDR, and conditions are provided under which these selection criteria are ABOS. Finally, the performance of the multiple testing rules and the model selection criteria is examined in a brief simulation study.
Wednesday November 28, 2012 at 4:00 PM in SEO 636