
Ensemble Learning – Brief Overview

Published on 11/18/2016 | Technology


Paul Pallath


Overview

Before making major life decisions, we naturally like to consult others first. Whether the issue is personal or professional, we seek advice from family, friends, and experts, not to mention social media and other Internet sites. Including different perspectives in our decision-making process increases the chances of success. Ensemble Learning in Predictive Modelling follows a similar holistic path to achieve better predictive performance.


The Ensemble Learning process combines predictions from several models, built from different parts of the input data space, which may be based on the same algorithm or on different ones.


Popular Ensemble Learning techniques include Bagging, Boosting, and Random Forest. They combine a large number of weak learners (less accurate classifiers/regressors) into one strong learner (a highly accurate predictive classifier/regressor).


To turn the weak learners' predictions into a single strong prediction, Ensemble Learning either takes the average of the weak learners' predicted values (for regression) or takes the value predicted by the majority of the weak learners (for classification).
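
As a toy illustration (the predictions below are made up, not from any real model), majority voting and averaging look like this:

```python
from collections import Counter

# Hypothetical predictions from five weak learners for one record.
class_votes = ["spam", "ham", "spam", "spam", "ham"]   # classification
value_preds = [4.1, 3.8, 4.4, 4.0, 3.9]                # regression

# Classification: the ensemble predicts the class reported by most weak learners.
print(Counter(class_votes).most_common(1)[0][0])       # -> spam

# Regression: the ensemble predicts the average of the weak learners' values.
print(sum(value_preds) / len(value_preds))             # -> 4.04
```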


Three example techniques are explained in detail below.

Bagging (or Bootstrap Aggregating)

• The least complex of the three techniques

• A large number of training datasets is created from the full training dataset by taking random samples from it (with replacement). This process of sampling with replacement is known as bootstrapping, and the samples are referred to as bootstrap samples.

• Each training dataset is used to train a classifier / regressor, using the same underlying base algorithm.

• The predictions from the individual classifiers / regressors are combined to generate the ensemble model's prediction. For classification, the individual predictions are combined by a simple majority vote or a weighted vote; for regression, the average of the predictions is taken.

• The Bagging algorithm takes the number of trees to be generated as an input parameter; see the sketch after this list.
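
A minimal sketch of the Bagging procedure, assuming NumPy and scikit-learn are available and using decision trees as the base algorithm (the function name and parameter values are illustrative, not from the original article):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_predict(X_train, y_train, X_test, n_trees=25, seed=0):
    """Train n_trees classifiers on bootstrap samples; combine by majority vote.
    Assumes y_train contains non-negative integer class labels."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    predictions = []
    for _ in range(n_trees):
        # Bootstrap sample: draw n rows with replacement.
        idx = rng.integers(0, n, size=n)
        tree = DecisionTreeClassifier().fit(X_train[idx], y_train[idx])
        predictions.append(tree.predict(X_test))
    votes = np.array(predictions)                  # shape: (n_trees, n_test)
    # Simple majority vote across trees for each test record.
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```

scikit-learn packages the same idea as sklearn.ensemble.BaggingClassifier, where n_estimators corresponds to the tree-count parameter described above.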

Boosting

• Similar to Bagging, but with one difference: each classifier / regressor is trained on the entire dataset rather than on a bootstrap sample.

• After each model is created, the instances misclassified by the previous models are given more weight, so subsequent models pay more attention to those records during training.

• The results of the individual classifiers / regressors are combined in the same manner as in Bagging.
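
AdaBoost is the classic boosting algorithm that follows this reweighting scheme. A minimal sketch using scikit-learn (the dataset and parameter values are illustrative; by default the weak learner is a depth-1 decision tree, a "stump"):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Synthetic data purely for illustration.
X, y = make_classification(n_samples=500, random_state=42)

# 50 weak learners trained sequentially; misclassified instances are
# reweighted after each round so later learners focus on them.
booster = AdaBoostClassifier(n_estimators=50, random_state=42)
booster.fit(X, y)
print(booster.score(X, y))   # training accuracy
```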

Random Forest

• Shares striking similarities with Bagging.

• The number of trees to be created in the Random Forest is an input parameter, as in Bagging.

• Each classifier / regressor is built on a bootstrapped sample; however, at each split only a random sample of m features is considered for splitting.

• This parameter m is usually M/3 for regression and √M for classification, where M is the total number of features in the dataset.

• The predictions from the individual classifiers are combined using the same technique as in Bagging.
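
A minimal sketch with scikit-learn, whose Random Forest estimators expose both the tree count (n_estimators) and the per-split feature sample (max_features); the values below simply mirror the m ≈ √M and m ≈ M/3 rules of thumb:

```python
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

# Classification: consider sqrt(M) features at each split.
clf = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)

# Regression: consider M/3 features at each split
# (a float is interpreted as a fraction of the total feature count).
reg = RandomForestRegressor(n_estimators=100, max_features=1/3, random_state=0)
```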

SAP Predictive Analytics provides bagged CNR, boosted CNR and Random Forest as algorithms that an expert user can use for complex modelling.


