Speaker | 汤家豪
Abstract | Models with unnormalized probability density functions are ubiquitous in statistics, artificial intelligence, and many other fields. However, they pose significant challenges for model selection when the normalizing constants are intractable. Existing methods to address this issue often incur high computational costs, either from numerically approximating the normalizing constants or from evaluating bias corrections in information criteria. In this talk, we begin by introducing a new entropy that underpins the Fisher divergence. We then propose a novel and fast selection criterion, T-GIC, for nested models, allowing the data to be sampled from a possibly unnormalized probability density function. T-GIC yields consistent selection under mild regularity conditions and is computationally efficient, benefiting from a multiplying factor that depends only on the sample size and the model complexity. Extensive simulation studies and real-data applications demonstrate the efficacy of T-GIC for selecting models with unnormalized probability density functions.
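Note: as general background (not the speaker's specific T-GIC construction), the Fisher divergence mentioned in the abstract is commonly written as below; it depends on the model only through the score function, so an intractable normalizing constant cancels, which is why it is attractive for unnormalized models.

\[
  D_{\mathrm{F}}(p \,\|\, q)
  = \frac{1}{2} \int p(x)\,
    \bigl\lVert \nabla_x \log p(x) - \nabla_x \log q(x) \bigr\rVert^{2} \, dx ,
\]

where $p$ is the data density and $q = \tilde{q}/Z$ is the model density; since $\nabla_x \log q(x) = \nabla_x \log \tilde{q}(x)$, the constant $Z$ never needs to be computed.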