Idea 1. AdaBoost combines many weak learners to make its predictions, and these weak learners are almost always stumps (one-level decision trees).

Example dataset with inputs Chest Pain, Blocked Arteries, and Patient Weight, and output Heart Disease.
A stump split on Chest Pain predicts: if there is Chest Pain → Heart Disease, otherwise Not.
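The chest-pain stump above can be sketched in a few lines of Python. The patient rows below are made-up toy data, not the article's actual table:

```python
# A decision stump is a one-level decision tree: it splits on a single
# feature and predicts one class per branch. Hypothetical toy data:
patients = [
    # (chest_pain, blocked_arteries, weight_kg, heart_disease)
    (True,  True,  205, True),
    (False, True,  180, True),
    (True,  False, 210, True),
    (True,  True,  167, True),
    (False, True,  156, False),
    (False, True,  125, False),
    (True,  False, 168, False),
    (True,  True,  172, False),
]

def chest_pain_stump(patient):
    """Stump from the text: Chest Pain -> Heart Disease, otherwise Not."""
    chest_pain, _, _, _ = patient
    return chest_pain

correct = sum(chest_pain_stump(p) == p[3] for p in patients)
print(f"Chest-pain stump: {correct}/{len(patients)} correct")
```

A single stump like this is a weak learner: better than guessing, but far from accurate on its own.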

Idea 2. Some stumps get more say/vote in the final prediction than others, which depends on how much error each stump made. This is unlike Random Forest, in which every tree gets an equal say/vote in the final prediction.
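The standard AdaBoost formula for a stump's amount of say is ½ · ln((1 − total error) / total error), where total error is the sum of the sample weights of the samples the stump misclassified. A minimal sketch:

```python
import math

def amount_of_say(total_error, eps=1e-10):
    # Clamp to avoid division by zero when the stump is perfect
    # (error = 0) or perfectly wrong (error = 1).
    total_error = min(max(total_error, eps), 1 - eps)
    return 0.5 * math.log((1 - total_error) / total_error)

# A stump with low error gets a large positive say; error = 0.5
# (a coin flip) gets zero say; error > 0.5 gets a negative say,
# meaning its vote is flipped.
print(amount_of_say(0.1))
print(amount_of_say(0.5))
print(amount_of_say(0.9))
```

So the final prediction is a weighted vote: each stump's vote counts in proportion to its amount of say.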

Idea 3. Each stump is made by taking the previous stump's mistakes into account. This is unlike Random Forest, in which each tree is made independently of the others.
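"Taking the previous stump's mistakes into account" works through the sample weights: after each stump, the weights of misclassified samples are multiplied by e^(amount of say), correctly classified ones by e^(−amount of say), and the weights are renormalized to sum to 1. A sketch of that update, using a hypothetical stump that misclassified one of eight equally weighted samples:

```python
import math

def update_weights(weights, misclassified, say):
    """Reweight samples after a stump: boost the ones it got wrong,
    shrink the ones it got right, then renormalize to sum to 1."""
    new = [w * math.exp(say if wrong else -say)
           for w, wrong in zip(weights, misclassified)]
    total = sum(new)
    return [w / total for w in new]

weights = [1 / 8] * 8                      # initial equal sample weights
wrong = [False] * 7 + [True]               # the stump missed one sample
say = 0.5 * math.log((1 - 1 / 8) / (1 / 8))  # amount of say, error = 1/8
weights = update_weights(weights, wrong, say)
print(weights)  # the misclassified sample now carries weight 0.5
```

After the update, the one misclassified sample carries half the total weight, so the next stump is strongly pushed to get it right.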

The original article walks through one round of AdaBoost with the following figures:

- Dataset with an added Sample Weight column
- Gini index for the numerical feature Patient Weight (used to pick the best stump)
- The Amount of Say formula
- The formula for increasing a misclassified sample's weight
- Plot of e^x, with x = Amount of Say (scales up the weights of misclassified samples)
- Plot of e^-x, with x = Amount of Say (scales down the weights of correctly classified samples)
- Selecting samples based on a random number 's'
- The randomly selected (resampled) dataset
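The last two steps, selecting samples based on a random number 's', can be sketched as weighted resampling over the cumulative sample weights. The four samples and their weights below are made up for illustration:

```python
import random

def resample(samples, weights, rng=None):
    """Draw a new dataset of the same size: for each draw, pick a
    random number s in [0, 1) and walk the cumulative sample weights
    until s falls inside a sample's slice. Heavily weighted
    (previously misclassified) samples are picked more often, so the
    next stump focuses on them."""
    rng = rng or random.Random(0)
    cumulative, total = [], 0.0
    for w in weights:
        total += w
        cumulative.append(total)
    new_samples = []
    for _ in range(len(samples)):
        s = rng.random()
        for sample, edge in zip(samples, cumulative):
            if s < edge:
                new_samples.append(sample)
                break
    return new_samples

samples = ["a", "b", "c", "d"]
weights = [0.05, 0.05, 0.8, 0.1]   # "c" was misclassified earlier
print(resample(samples, weights))  # "c" should dominate the new set
```

The resampled dataset then gets fresh equal sample weights, and the whole procedure (build stump, compute say, reweight, resample) repeats for the next stump.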
