Random Forest — Mesmerising Classroom

Grow many tiny decision trees, add friendly randomness, then let them vote 🎉. Play with sliders, watch the decision boundary dance, and learn AI/ML by seeing it.

1) Build Your Data 🌈

Tip: Increase Noise to make the task harder and watch the forest adapt.

2) Random Forest Settings ⚙️

Step 1 — Grow Tiny Trees

  • Each tree learns from a bootstrap sample (a random draw with replacement).
  • Each split is chosen to reduce Gini impurity.
  • Growth stops when a node is pure or too small.
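The impurity measure behind those splits can be sketched in a few lines. This `gini` helper is our own illustrative function, not the playground's code:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: the chance two random draws from the node disagree."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

# A pure node scores 0; a 50/50 node scores 0.5 (the worst case for 2 classes).
print(gini(["A", "A", "A", "A"]))  # 0.0
print(gini(["A", "A", "B", "B"]))  # 0.5
```

A split is good when the weighted impurity of the child nodes drops well below the parent's.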

Step 2 — Add Randomness

  • At each split, consider only a random subset of features.
  • Random subsets decorrelate the trees → diverse opinions.
  • Diversity makes the forest strong.
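Here is a minimal sketch of that per-split feature subsampling, using the common √(n_features) heuristic. The function name is hypothetical:

```python
import math
import random

def candidate_features(n_features, rng):
    """Pick a random subset of feature indices to consider at one split.

    A common default size is sqrt(n_features), rounded down (at least 1).
    """
    k = max(1, int(math.sqrt(n_features)))
    return rng.sample(range(n_features), k)

rng = random.Random(42)
# Two trees splitting the same node may examine different features,
# so their decisions end up decorrelated.
print(candidate_features(9, rng))
print(candidate_features(9, rng))
```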

Step 3 — Vote!

  • Average probabilities across trees.
  • Majority vote decides the class.
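The soft-vote step can be shown with a tiny NumPy sketch. The per-tree probabilities below are made-up numbers for illustration:

```python
import numpy as np

# Hypothetical per-tree class probabilities: 3 trees, 3 points, 2 classes.
tree_probs = np.array([
    [[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]],  # tree 1
    [[0.7, 0.3], [0.6, 0.4], [0.1, 0.9]],  # tree 2
    [[0.8, 0.2], [0.3, 0.7], [0.3, 0.7]],  # tree 3
])

forest_probs = tree_probs.mean(axis=0)    # soft vote: average over trees
prediction = forest_probs.argmax(axis=1)  # winning class per point
print(forest_probs)
print(prediction)  # [0 1 1]
```

Averaging probabilities (rather than counting hard votes) keeps the boundary smooth, which is what makes it "dance" gradually as you move the sliders.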

Step 4 — Check OOB

  • Out-of-bag (OOB) points, the ones a tree never trained on, ≈ a free test set.
  • OOB Accuracy updates live.
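Where the OOB points come from can be sketched directly. Bootstrap sampling draws with replacement, so each tree misses roughly 37% of the data, and those leftovers are its free test points (the helper name is our own):

```python
import random

def bootstrap_split(n_samples, rng):
    """Draw a bootstrap sample; the untouched points are 'out-of-bag'."""
    in_bag = [rng.randrange(n_samples) for _ in range(n_samples)]
    oob = sorted(set(range(n_samples)) - set(in_bag))
    return in_bag, oob

rng = random.Random(0)
in_bag, oob = bootstrap_split(10, rng)
print(in_bag)  # indices the tree trains on (repeats allowed)
print(oob)     # free test points for this tree (~37% of data on average)
```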

Decision Boundary & Data ✨

Colors show which class the forest prefers at each location. Dots are training points. Try the sliders!
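The picture itself is just the forest queried on a grid. Here is a sketch of that idea using scikit-learn and toy two-moons data; the playground's own implementation is presumably different:

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier

# Toy two-moons data standing in for the playground's points.
X, y = make_moons(n_samples=200, noise=0.25, random_state=0)
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Ask the forest for its preferred class at every cell of a 100x100 grid;
# coloring those answers produces the decision-boundary picture.
xs = np.linspace(X[:, 0].min() - 0.5, X[:, 0].max() + 0.5, 100)
ys = np.linspace(X[:, 1].min() - 0.5, X[:, 1].max() + 0.5, 100)
gx, gy = np.meshgrid(xs, ys)
grid = np.c_[gx.ravel(), gy.ravel()]
classes = forest.predict(grid).reshape(gx.shape)
print(classes.shape)  # (100, 100) grid of class labels
```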

Feature Importance 🌟

Estimated via mean impurity decrease across all splits. Bigger bars ⇒ more helpful feature.
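In scikit-learn this estimate is exposed as `feature_importances_`; a minimal sketch, assuming synthetic data with only two truly informative features:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# 5 features, but only 2 carry real signal.
X, y = make_classification(n_samples=400, n_features=5, n_informative=2,
                           n_redundant=0, random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Mean impurity decrease per feature, normalized to sum to 1.
for i, imp in enumerate(forest.feature_importances_):
    print(f"feature {i}: {imp:.3f}")
```

The informative features should collect most of the importance mass, mirroring the bigger bars in the chart.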

OOB Accuracy vs Trees 📈
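The curve in this chart can be approximated offline. One way, assuming scikit-learn: `warm_start=True` lets the same forest grow more trees, and `oob_score_` can be re-read after each growth step:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=6, random_state=0)

# warm_start=True keeps existing trees and adds new ones on each fit,
# so we can track OOB accuracy as the forest grows.
forest = RandomForestClassifier(n_estimators=25, oob_score=True,
                                warm_start=True, random_state=0)
scores = {}
for n in (25, 50, 100):
    forest.set_params(n_estimators=n)
    forest.fit(X, y)
    scores[n] = forest.oob_score_

for n, s in scores.items():
    print(f"{n:>3} trees -> OOB accuracy {s:.3f}")
```

Typically the curve climbs quickly and then flattens: beyond some point, extra trees add stability but little accuracy.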