B1161
Title: Statistical inference with conditionally identically distributed observations
Authors: Pier Giovanni Bissiri (Italy) [presenting]
Stephen Walker - University of Texas at Austin (United States)
Abstract: In Bayesian statistics, the most common choice is to assign an exchangeable distribution to the sequence of observations \(X_1,X_2,\dotsc\). This is generally done, via de Finetti's Theorem, by assessing a prior distribution, which in turn yields a posterior distribution. Exchangeability can be relaxed by considering the weaker condition of conditionally identically distributed (c.i.d.) observations. Moreover, the usual prior-posterior approach can be replaced by a predictive approach, in which the distribution of the observations is assessed directly. The c.i.d. condition still ensures the existence of a random probability measure \(\mu\) which is the almost sure weak limit of both the empirical measure \(\frac{1}{n}\sum_{i=1}^n \delta_{X_i}\) and the predictive distribution \(P(X_{n+1}\in \cdot \mid X_1,\dotsc,X_n)\). This random measure represents the population from which the observations come and is the object of inference; it would be known if it were possible to observe the entire infinite sequence \(X_1,X_2,\dotsc\). In the c.i.d. setting, it is convenient to assess the distribution of the observations through bivariate copulas. The role that the maximum likelihood estimator can play in this setting is investigated.
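The following is a minimal simulation sketch of the convergence statement above: for a c.i.d. sequence, the empirical measure and the predictive distribution share the same almost sure weak limit \(\mu\). It uses a Blackwell-MacQueen (Pólya urn) predictive rule with a standard normal base measure and concentration parameter \(\alpha = 2\), which gives an exchangeable, hence c.i.d., sequence; this scheme and its parameters are illustrative assumptions and are not the copula-based construction discussed in the abstract.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
alpha = 2.0   # concentration parameter (illustrative assumption)
n = 5000      # number of observations

# Blackwell-MacQueen (Polya urn) predictive rule with a N(0,1) base measure:
# X_1 ~ N(0,1); given X_1,...,X_i, draw a fresh N(0,1) value with probability
# alpha/(alpha+i), otherwise repeat one of the past values chosen uniformly.
x = np.empty(n)
x[0] = rng.normal()
for i in range(1, n):
    if rng.random() < alpha / (alpha + i):
        x[i] = rng.normal()
    else:
        x[i] = x[rng.integers(i)]

# Compare the empirical c.d.f. (1/n) sum_i 1{X_i <= t} with the predictive
# c.d.f. P(X_{n+1} <= t | X_1,...,X_n) = (alpha*Phi(t) + sum_i 1{X_i <= t})/(alpha+n).
# Both converge weakly, almost surely, to the same random measure mu.
grid = np.linspace(-3.0, 3.0, 13)
counts = np.array([(x <= t).sum() for t in grid])
empirical_cdf = counts / n
predictive_cdf = (alpha * norm.cdf(grid) + counts) / (alpha + n)

print("max |empirical - predictive| on grid:",
      np.max(np.abs(empirical_cdf - predictive_cdf)))
```

In this example the pointwise gap between the two distribution functions is of order \(\alpha/(\alpha+n)\), consistent with both converging to the same random limit as \(n\) grows.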