Akaike Information Criterion (AIC)

Analyttica Datalab
1 min read · Jan 7, 2019


The Akaike Information Criterion (AIC) is a relative measure of the quality of a model for a given set of data, and it helps in selecting among a finite set of candidate models. It uses the maximized likelihood and the number of estimated parameters to estimate the information lost by a model. AIC thus trades off model accuracy against model complexity; in other words, it discourages overfitting.

Here is the formula to define AIC:

AIC = -2(log-likelihood) + 2K

Where:
K is the number of estimated parameters.
Log-likelihood is a measure of model fit, evaluated at the maximum-likelihood estimates.
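As a minimal sketch of the formula, the snippet below fits a Gaussian to a small made-up sample by maximum likelihood and computes its AIC. The data values and the choice of a Gaussian model are purely illustrative assumptions, not from the original post.

```python
import numpy as np

def aic(log_likelihood, k):
    """AIC = -2(log-likelihood) + 2K."""
    return -2.0 * log_likelihood + 2.0 * k

# Hypothetical sample; a Gaussian fit by MLE has mu = mean, sigma = std.
data = np.array([2.1, 2.5, 1.9, 2.3, 2.8, 2.0])
mu, sigma = data.mean(), data.std()

# Gaussian log-likelihood evaluated at the MLE.
ll = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
            - (data - mu) ** 2 / (2 * sigma**2))

# Two estimated parameters (mu and sigma), so K = 2.
print(aic(ll, k=2))
```

Note that K counts every estimated parameter, including the noise variance when it is fit from the data.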

Application & Interpretation:

AIC values are used to compare candidate models fit to the same data: a lower AIC value indicates less information lost, and hence a better model. The absolute AIC value of a single model is not meaningful on its own; only the differences between models matter.

Read also: Bayesian Information Criterion (BIC).

Though the two measures are derived from different perspectives, they are closely related. The practical difference is the penalty term: BIC's penalty grows with the number of observations (K log n), while AIC's does not (2K). As a result, BIC is usually higher than AIC (whenever n ≥ 8, since log n then exceeds 2), but for both measures, the lower the value, the better the model.

Practice dataset:

You can log on to https://learn.analyttica.com/.

Other quick reads:

Concordance Check, Support Vector Machine Regression, Bayesian Network Classifier.



Analyttica Datalab (www.analyttica.com) is a contextual Data Science (DS) & Machine Learning (ML) Platform Company.