The Akaike Information Criterion (AIC) gauges the relative quality of statistical models for a given dataset. It estimates the information lost when a particular model is used to represent the process that generated the data. A lower AIC value suggests a better model, balancing goodness of fit against model complexity: given two models fit to the same dataset, the one with the lower AIC is preferred. Calculating the AIC requires the model's maximized likelihood and the number of estimated parameters. The formula is AIC = 2k − 2 ln(L), where k is the number of parameters and L is the maximized value of the likelihood function.
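The formula above can be sketched in code. The following is a minimal illustration, not taken from any particular library: it fits a Gaussian model to toy data by maximum likelihood (sample mean and biased sample standard deviation) and applies AIC = 2k − 2 ln(L) with k = 2, since both the mean and the variance are estimated.

```python
import math

def gaussian_log_likelihood(data, mu, sigma):
    """Log-likelihood of the data under a Normal(mu, sigma) model."""
    n = len(data)
    return (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2))

def aic(log_likelihood, k):
    """AIC = 2k - 2 ln(L), where log_likelihood is ln(L) at the MLE."""
    return 2 * k - 2 * log_likelihood

# Toy data (hypothetical values chosen for illustration).
data = [2.1, 2.5, 1.9, 2.3, 2.8, 2.0]

# For a Gaussian, the MLEs are the sample mean and the (biased,
# divide-by-n) sample standard deviation.
mu_hat = sum(data) / len(data)
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / len(data))

ll = gaussian_log_likelihood(data, mu_hat, sigma_hat)
print(aic(ll, k=2))  # two estimated parameters: mu and sigma
```

The AIC value on its own carries no absolute meaning; it only becomes useful when compared against the AIC of another model fit to the same data.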
This metric is valuable in model selection, providing a rigorous, objective means to compare different models. By penalizing models with more parameters, it helps avoid overfitting, thus promoting models that generalize well to new data. Introduced by Hirotugu Akaike in 1973, it has become a cornerstone of statistical modeling and is widely used across disciplines, including ecology, economics, and engineering, for tasks ranging from variable selection to time series analysis. Its application allows researchers to identify models that explain the data effectively without unnecessary complexity.
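The trade-off described above can be made concrete with a small model-selection example. This is an illustrative sketch with made-up data, comparing a constant-mean Gaussian model (k = 2: mean and variance) against a least-squares line with Gaussian errors (k = 3: intercept, slope, and variance). For Gaussian errors with the MLE variance RSS/n, the maximized log-likelihood reduces to −(n/2)(ln(2π·RSS/n) + 1).

```python
import math

def gaussian_aic(rss, n, k):
    """AIC for a model with Gaussian errors, using the MLE variance RSS/n.
    k counts all estimated parameters, including the noise variance."""
    log_l = -0.5 * n * (math.log(2 * math.pi * rss / n) + 1)
    return 2 * k - 2 * log_l

# Hypothetical data with a clear upward trend.
xs = [0, 1, 2, 3, 4, 5]
ys = [1.0, 1.9, 3.2, 3.9, 5.1, 6.0]
n = len(ys)

# Model A: constant mean (k = 2).
mean_y = sum(ys) / n
rss_a = sum((y - mean_y) ** 2 for y in ys)

# Model B: least-squares line y = a + b*x (k = 3).
mean_x = sum(xs) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = mean_y - b * mean_x
rss_b = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

aic_a = gaussian_aic(rss_a, n, k=2)
aic_b = gaussian_aic(rss_b, n, k=3)
print(aic_a, aic_b)
```

Here the linear model earns a lower AIC despite its extra parameter, because its far better fit outweighs the 2k penalty; with trendless data, the penalty would instead favor the simpler constant-mean model.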