Bagging (bootstrap aggregating) is a meta-learning technique that builds an ensemble of models, each trained on a random training set created from the original training set by sampling with replacement.

The final model is a simple average of the individual models in the ensemble (for regression; for classification, a majority vote over the models is typically used instead).

In other words, bagging involves:

1. drawing several bootstrap samples from the original training set by sampling with replacement;
2. fitting one base model on each bootstrap sample;
3. averaging the predictions of the individual models to form the final prediction.
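The procedure above can be sketched in a few lines of numpy. This is a minimal illustration, not a production implementation: the toy data, the number of models, and the choice of a straight line (via `np.polyfit`) as the base model are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration): y is roughly 2*x plus noise.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)

n_models = 25
preds = []
for _ in range(n_models):
    # 1. Draw a bootstrap sample: indices sampled with replacement.
    idx = rng.integers(0, x.size, size=x.size)
    # 2. Fit a base model (here, a line) on the bootstrap sample.
    slope, intercept = np.polyfit(x[idx], y[idx], deg=1)
    # 3. Record this model's predictions on the full input grid.
    preds.append(slope * x + intercept)

preds = np.array(preds)
# Final bagged prediction: simple average over the ensemble.
y_bagged = preds.mean(axis=0)
```

Each base model sees a slightly different training set, so averaging their predictions reduces the variance of the combined model relative to any single fit.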

Bagging achieves two important goals beyond prediction: validation and assessment of predictive uncertainty, that is:

- Validation: each model is trained on a bootstrap sample, so the training points it never saw (the out-of-bag points) form a natural validation set for that model, without holding out separate data.
- Predictive uncertainty: the spread of predictions across the ensemble at a given input indicates how uncertain the combined model is there.