This section demonstrates how to construct bagging models, random forests, and boosted ensembles for classification and regression. Each of these relies on the decision tree constructions from the last chapter. Bagging and random forests are straightforward combinations of decision trees with only minor adjustments to the tree-growing procedure, so the code for those sections is brief. The section on boosting involves more original code.
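To preview the kind of combination involved, here is a minimal sketch of bagging: fit several decision trees, each on a bootstrap resample of the data, and aggregate their predictions by majority vote. This is an illustrative sketch, not the chapter's own code; it assumes scikit-learn's `DecisionTreeClassifier` stands in for the trees built in the last chapter, and uses a synthetic dataset.

```python
# Minimal bagging sketch: bootstrap resamples + majority vote.
# Assumes scikit-learn is available; the chapter's own tree code
# could be substituted for DecisionTreeClassifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, random_state=0)

trees = []
for _ in range(25):
    # Draw a bootstrap sample (sampling rows with replacement)
    idx = rng.integers(0, len(X), size=len(X))
    trees.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))

# Aggregate: majority vote across the 25 trees
votes = np.stack([t.predict(X) for t in trees])
pred = (votes.mean(axis=0) > 0.5).astype(int)
acc = (pred == y).mean()
print(acc)
```

A random forest adds one further adjustment on top of this: each split within a tree considers only a random subset of the features, which decorrelates the trees and typically improves the averaged prediction.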