View Submission - CMStatistics
Title: Criteria for Bayesian hypothesis testing in two-sample problems
Authors: Victor Pena - Baruch College, City University of New York (United States) [presenting]
Abstract: Two criteria for prior choice in two-sample testing are proposed, which share a common starting point: a hypothetical situation in which perfect knowledge about one of the groups is attained, while the data for the other group are held fixed. In such a scenario, the Bayes decision for the two-sample problem should arguably converge to the Bayes decision of a one-sample test in which the distribution of the group for which we obtain perfect information is known. One criterion is based on a limiting argument in which the sample size of one of the groups grows to infinity while the sample size of the other group stays fixed, whereas the second criterion is based on conditioning on the true value of the parameters for one of the groups. In the context of testing whether two normal means are equal, we find priors for which the limiting argument and conditioning give rise to equivalent Bayes decisions under perfect knowledge, as well as cases where they give rise to different Bayes decisions. We also show that, under some prior specifications, the limiting Bayes decisions are not compatible with any prior specification for the one-sample problem in which one of the distributions is known.
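The limiting criterion described above can be sketched in the normal-means setting mentioned in the abstract. The notation below is illustrative only (the submission does not fix notation); it assumes a common known variance for concreteness.

```latex
% Illustrative two-sample setup (notation assumed, not from the abstract):
% equality of normal means with a common known variance \sigma^2.
\begin{align*}
x_{i1} \mid \mu_1 &\sim N(\mu_1, \sigma^2), \quad i = 1, \dots, n_1, \\
x_{i2} \mid \mu_2 &\sim N(\mu_2, \sigma^2), \quad i = 1, \dots, n_2, \\
H_0 &: \mu_1 = \mu_2 \quad \text{vs.} \quad H_1 : \mu_1 \neq \mu_2.
\end{align*}
% Limiting criterion (sketch): hold n_1 fixed and let n_2 \to \infty,
% so that \mu_2 is learned exactly, say \mu_2 = \mu_2^{(0)}. The Bayes
% decision of the two-sample test should then converge to the Bayes
% decision of the one-sample test
%   H_0 : \mu_1 = \mu_2^{(0)}  vs.  H_1 : \mu_1 \neq \mu_2^{(0)},
% where \mu_2^{(0)} is a known constant. The conditioning criterion
% instead conditions on the true value of \mu_2 directly.
```

Under this sketch, a prior on $(\mu_1, \mu_2)$ satisfies the criterion when the two decisions agree in the limit; the abstract's results concern when the two routes (limiting versus conditioning) do and do not coincide.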