Seminar | Institute of Mathematical Sciences
Time: Monday, July 28th, 2025, 14:30-15:30
Location: IMS, RS408
Speaker: Yu Luo, King’s College London
Abstract: In the usual Bayesian setting, a full probabilistic model is required to link the data and parameters, and the form of this model and the inference and prediction mechanisms are specified via de Finetti's representation. In general, such a formulation is not robust to misspecification of its component parts. An alternative approach draws inference from loss functions: the quantity of interest is defined as a minimizer of some expected loss, and posterior distributions are constructed from this loss-based formulation; this strategy underpins the Gibbs posterior. We develop a Bayesian non-parametric approach; specifically, we generalize the Bayesian bootstrap and specify a Dirichlet process model for the distribution of the observables. We implement this both via direct prior-to-posterior calculation and via predictive sampling. The two updating frameworks yield the same posterior distribution under the exchangeability assumption and guarantee consistent estimation under mild conditions. We also study the assessment of posterior validity for non-standard Bayesian calculations. The methodology is demonstrated via the partially linear model.
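To give a flavour of the kind of computation the abstract describes, the sketch below shows a generic Bayesian-bootstrap approximation to a loss-based posterior: Dirichlet weights are drawn over the observations, and each draw returns the minimizer of the corresponding weighted loss. This is a minimal illustration under assumed choices (simulated data, a squared-error loss, and a single regression coefficient as the quantity of interest), not the speaker's implementation or the partially linear model from the talk.

```python
# Minimal sketch: Bayesian-bootstrap draws from a loss-based posterior.
# The data-generating process, loss, and variable names here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated observables (hypothetical): heavy-tailed noise, so no Gaussian likelihood is assumed.
n = 200
x = rng.normal(size=n)
y = 1.5 * x + rng.standard_t(df=3, size=n)
X = np.column_stack([np.ones(n), x])  # design matrix with intercept

def weighted_loss_minimiser(weights):
    """Minimise the weighted squared-error loss sum_i w_i * (y_i - x_i' theta)^2 in closed form."""
    W = np.diag(weights)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Each posterior draw re-weights the data with Dirichlet(1, ..., 1) weights,
# mimicking a Dirichlet-process-type posterior on the distribution of the observables.
B = 2000
draws = np.empty((B, 2))
for b in range(B):
    w = rng.dirichlet(np.ones(n))
    draws[b] = weighted_loss_minimiser(w)

slope = draws[:, 1]
print("posterior mean of slope:", slope.mean())
print("95% credible interval:", np.quantile(slope, [0.025, 0.975]))
```

The collection of minimizers plays the role of posterior draws for the loss-defined quantity of interest, without committing to a parametric likelihood for the data.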