AIC: Towards More Robust Model Selection in Ridge Regression


Oct 24, 2017 - 11:30am to 12:30pm


Rettaliata Engineering Center, Room 032


Matthew Dixon
Stuart School of Business, Illinois Institute of Technology


The problem of estimating the dimensionality of a model occurs in various forms in applied statistics, machine learning and econometrics. Popular approaches include Akaike's Information Criterion (AIC), cross-validation and Minimum Description Length (MDL), among others. This talk begins by discussing Akaike's 1973 extension of the maximum likelihood principle for model selection. In particular, we revisit the rather restrictive conditions under which minimizing the AIC is equivalent to minimizing the expected Kullback-Leibler divergence. We then use a Wilks test to establish asymptotic convergence of the divergence to a chi-squared distribution and characterize the importance of the Fisher information matrix in AIC regularization. We conclude by proposing a new methodology for L2 regularization in ridge regression. This is joint work with Tyler Ward (Google).
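The abstract does not spell out the proposed methodology, but the standard setup it revisits can be illustrated. The sketch below (an assumption on my part, not the speakers' method) computes an AIC-style score for ridge regression with Gaussian errors, using the effective degrees of freedom tr(H) of the ridge hat matrix in place of the raw parameter count, and selects the penalty with the lowest score. The function name `ridge_aic` and the example data are hypothetical.

```python
import numpy as np

def ridge_aic(X, y, lam):
    """AIC-style score for ridge regression with Gaussian errors.

    Illustrative convention only: replaces the parameter count k in
    AIC = 2k - 2 log L with the effective degrees of freedom
    tr(H), where H = X (X'X + lam I)^{-1} X' is the ridge hat matrix.
    """
    n, p = X.shape
    A = X.T @ X + lam * np.eye(p)
    beta = np.linalg.solve(A, X.T @ y)        # ridge coefficients
    resid = y - X @ beta
    sigma2 = resid @ resid / n                # MLE of the noise variance
    df = np.trace(X @ np.linalg.solve(A, X.T))  # effective degrees of freedom
    # Gaussian log-likelihood evaluated at the fit
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * df - 2 * loglik

# Toy example: 5 predictors, only the first two are informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, 0.0]) + 0.3 * rng.standard_normal(100)

lams = [0.01, 0.1, 1.0, 10.0]
aics = {lam: ridge_aic(X, y, lam) for lam in lams}
best = min(aics, key=aics.get)  # penalty with the lowest AIC-style score
```

One point the talk raises is that equivalences of this kind hold only under restrictive conditions, which is precisely why a more robust criterion is of interest.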

Event Type: 

Department of Applied Mathematics - Mathematical Finance and Stochastic Analysis Seminar