Nonparametric prediction of stationary time series
Friday, June 11, Módulo 17 (formerly C-XV), room 520, at 11:30.
Abstract: We present simple procedures for the prediction of a real-valued time series with side information. For squared loss (the regression problem), we survey the basic principles of universally consistent nonparametric regression function estimates. The prediction algorithms are based on a machine aggregation of several simple predictors. We show that if the sequence is a realization of a stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We give an analogous result for the prediction of stationary Gaussian processes. These prediction strategies also have consequences for the $0$-$1$ loss (pattern recognition problem).
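A common way to realize the aggregation of simple predictors mentioned in the abstract is exponential weighting of partition-based experts, as in the universal prediction literature. The sketch below is illustrative only: the expert construction (predicting the average of past values that followed a matching context) and the parameters `k`, `h`, and `eta` are assumptions for exposition, not the speakers' exact algorithm.

```python
import numpy as np

def partition_expert(history, k, h):
    """Hypothetical expert (k, h): predict the average of past values that
    followed a length-k context matching the current one at resolution h."""
    n = len(history)
    if n <= k:
        return 0.0
    ctx = np.round(np.array(history[n - k:]) / h)
    matches = []
    for t in range(k, n):
        past = np.round(np.array(history[t - k:t]) / h)
        if k == 0 or np.array_equal(past, ctx):
            matches.append(history[t])
    return float(np.mean(matches)) if matches else 0.0

def aggregate_predict(series, experts, eta=0.5):
    """Combine the experts' one-step predictions by exponential weighting
    of their cumulative squared losses, sequentially over the series."""
    losses = np.zeros(len(experts))
    preds = []
    for n in range(len(series)):
        history = series[:n]
        p = np.array([partition_expert(history, k, h) for k, h in experts])
        w = np.exp(-eta * losses)
        w /= w.sum()
        preds.append(float(w @ p))          # aggregated prediction of series[n]
        losses += (p - series[n]) ** 2      # update each expert's squared loss
    return preds
```

For a stationary ergodic sequence, results of this type show that the time-averaged squared error of such an aggregate converges almost surely to the Bayes error; the toy experts above would in practice be drawn from a growing family of context lengths and resolutions.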