School of Mathematics, University of Edinburgh
Thursday 14th March 2024
Aula Consiglio, 7th floor
Cross-validation is a well-known method for estimating hyperparameters in complex regression models. It comes in many varieties, but some of the more interesting flavours require multiple model fits, with consequently high cost. This talk shows how that high cost can be side-stepped, with rather low approximation error, for a wide range of models estimated using a quadratically penalized smooth loss. Once the computational cost has the same leading order as a single model fit, it becomes feasible to efficiently optimize the chosen cross-validation criterion with respect to multiple smoothing/precision parameters. Interesting applications include cross-validating smooth additive quantile regression models, and the use of leave-out-neighbourhood cross-validation for dealing with nuisance short-range autocorrelation. The link between cross-validation and the jackknife can be exploited to obtain reasonably well-calibrated uncertainty quantification in these cases.
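As a flavour of the kind of single-fit shortcut alluded to above (not the speaker's general method, which covers a much wider model class), here is a minimal sketch for the special case of ridge regression, a quadratically penalized least-squares model: the exact leave-one-out residuals can be recovered from one fit via the diagonal of the influence (hat) matrix, avoiding n refits. All names and data here are illustrative.

```python
import numpy as np

# Illustrative simulated data
rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

lam = 1.0  # fixed quadratic penalty (smoothing) parameter

# Single fit: influence matrix A, so that fitted values are A @ y
A = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
resid = y - A @ y

# Shortcut: exact LOO residuals from the one fit, e_i / (1 - A_ii)
loo_shortcut = resid / (1 - np.diag(A))

# Brute force for comparison: refit n times, leaving one point out
loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Xi, yi = X[mask], y[mask]
    beta = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
    loo_brute[i] = y[i] - X[i] @ beta

assert np.allclose(loo_shortcut, loo_brute)
```

The cross-validation criterion (e.g. the mean of the squared LOO residuals) can then be minimized over `lam` at the cost of repeated single fits rather than repeated n-fold refitting.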
Simon Wood is Professor of Statistical Computing at the University of Edinburgh (and previously Bristol, Bath, Glasgow and St Andrews). After training in physics he worked in mathematical biology, and moved into statistics as a result of an increasing conviction that biological dynamic models should be calibrated. This move resulted in a shift towards smoothing and modern regression models, the production of the quite widely used mgcv package in R for generalized additive modelling, and an associated textbook, alongside a number of journal articles on smoothing, smoothing parameter estimation, and GAM computation and inference. The Covid pandemic re-awakened his interest in the calibration of biological dynamic models, resulting in some work that proved oddly controversial.