Speaker
喻达磊
Abstract
We propose a unified time-varying model averaging approach that accommodates general loss functions, including lin-lin loss and asymmetric squared error loss, to improve prediction performance under structural change. This flexibility enables averaging across diverse candidate models, such as time-varying coefficient quantile regression models. We develop a local forward-validation criterion to determine time-varying combination weights without the standard constraint that they sum to one, and we establish theoretical justifications previously unexplored in the literature. First, when all candidate models are misspecified, the proposed averaging prediction is asymptotically optimal in the sense of achieving the lowest possible prediction risk, with an explicit convergence rate. Second, we establish a novel convergence rate for time-varying weight consistency that does not depend on the extent of misspecification among the candidate models. Furthermore, we develop a time-varying sparsity-oriented importance learning procedure that consistently identifies the true predictor set. Monte Carlo simulations and empirical applications demonstrate superior finite-sample performance relative to existing model selection and averaging methods.
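A minimal sketch of how such a local forward-validation weight criterion might look in Python, assuming a kernel-localized lin-lin loss minimized over box-constrained weights. The function names, the Epanechnikov kernel, the [0, 1] bounds, and the optimizer choice are illustrative assumptions, not the authors' implementation:

import numpy as np
from scipy.optimize import minimize

def linlin_loss(u, tau):
    # Lin-lin (check) loss: rho_tau(u) = u * (tau - 1{u < 0})
    u = np.asarray(u)
    return u * (tau - (u < 0).astype(float))

def local_fv_weights(y, preds, t, tau=0.5, h=0.1):
    """Illustrative local forward-validation weights at time t.

    y: (T,) realized outcomes; preds: (T, M) one-step-ahead forecasts
    from M candidate models; h: kernel bandwidth as a fraction of T.
    Weights lie in [0, 1]^M with no sum-to-one constraint, matching
    the unconstrained-sum setup described in the abstract.
    """
    T, M = preds.shape
    # Epanechnikov kernel localizing the criterion around time t,
    # restricted to past observations (forward validation uses no future data)
    u = (np.arange(T) - t) / (h * T)
    k = np.where((np.abs(u) <= 1) & (np.arange(T) < t),
                 0.75 * (1.0 - u ** 2), 0.0)

    def criterion(w):
        resid = y - preds @ w
        return np.sum(k * linlin_loss(resid, tau))

    # Lin-lin loss is nonsmooth, so a smoothed or linear-programming
    # formulation would be preferable in practice; L-BFGS-B is used
    # here only to keep the sketch short.
    res = minimize(criterion, x0=np.full(M, 1.0 / M),
                   bounds=[(0.0, 1.0)] * M, method="L-BFGS-B")
    return res.x

# Example: combine M = 3 candidate forecasts at time t = 150 (synthetic data)
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = y[:, None] + rng.normal(scale=0.5, size=(200, 3))
w_t = local_fv_weights(y, preds, t=150, tau=0.5, h=0.15)

Because the criterion is re-minimized at each t with kernel weights centered there, the fitted weights vary over time, which is what allows the combination to adapt under structural change.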