Support Vector Machines (SVM) — Cute & Mesmerising Lesson 🌈

SVMs are used for classification and regression. They find an optimal hyperplane that separates classes with the widest margin (or an ε-tube for regression). Poke the sliders, try Auto Maximize, and spin the 3D kernel magic!

🧠

Step 1 Hyperplane & Margin

In 2D, many lines can split the classes. SVM picks the one with the largest margin, i.e., the greatest distance to the closest points of either class.
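If you want the same idea in code, here is a minimal sketch (assuming scikit-learn and a made-up blobs dataset): for a linear SVM the margin width is 2 / ||w||.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two well-separated blobs: a toy stand-in for the 2D demo above.
X, y = make_blobs(n_samples=60, centers=2, cluster_std=0.8, random_state=0)

clf = SVC(kernel="linear", C=1e3)  # large C ~ near-hard margin on separable data
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
margin_width = 2 / np.linalg.norm(w)  # distance between the two margin lines
print("hyperplane w, b:", w, b)
print("margin width:", round(margin_width, 3))
print("support vectors:\n", clf.support_vectors_)
```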
🎚️

Step 2 Soft Margin (C)

Real data is messy, so some points end up on the wrong side. A higher C punishes those mistakes more heavily (tighter margin); a lower C allows more slack (wider margin).
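A rough sketch of that trade-off (assuming scikit-learn and overlapping toy blobs): compare a small and a large C and watch how the number of support vectors and training errors changes.

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Overlapping blobs: "messy" data where a few mistakes are unavoidable.
X, y = make_blobs(n_samples=100, centers=2, cluster_std=2.5, random_state=1)

for C in (0.1, 100):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    errors = (clf.predict(X) != y).sum()
    print(f"C={C:<5}  support vectors={len(clf.support_)}  training errors={errors}")
# Typically: small C -> wider margin, more support vectors, a few more errors;
# large C -> tighter margin, fewer support vectors, fewer training errors.
```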

Step 3 Kernel Trick

Lift the data into a higher-dimensional space (e.g., add z = x² + y² as a new axis). A flat separating plane there becomes a curvy boundary back in 2D.
Tip: Switch to Kernel 3D Magic and spin the scene. Switch datasets (Blobs/Moons) above.
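The lift can even be written out by hand. A sketch (assuming a made-up ring-shaped dataset): add z = x² + y² as a third feature, and a linear SVM in 3D separates what was a circle in 2D.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy "circles" data: inner cluster = class 0, outer ring = class 1.
angles = rng.uniform(0, 2 * np.pi, 200)
radii = np.where(rng.random(200) < 0.5, 0.5, 2.0) + rng.normal(0, 0.1, 200)
X = np.c_[radii * np.cos(angles), radii * np.sin(angles)]
y = (radii > 1.0).astype(int)

# Explicit lift: z = x^2 + y^2 becomes a third coordinate.
X_lifted = np.c_[X, X[:, 0] ** 2 + X[:, 1] ** 2]

flat = SVC(kernel="linear").fit(X, y)           # straight line in 2D
lifted = SVC(kernel="linear").fit(X_lifted, y)  # flat plane in 3D
print("2D accuracy:", flat.score(X, y))                     # struggles on the ring
print("3D (lifted) accuracy:", lifted.score(X_lifted, y))   # near perfect
```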

Controls

The points nearest to the boundary glow: these are the support vectors.
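In library terms (a sketch assuming scikit-learn), those glowing points are what a fitted model exposes as support_vectors_:

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=50, centers=2, random_state=2)
clf = SVC(kernel="linear", C=1.0).fit(X, y)

print("indices of the glowing points:", clf.support_)
print("their coordinates:\n", clf.support_vectors_)
print("how many per class:", clf.n_support_)
```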

📈 C vs Margin (intuition)

As C increases, the model tries harder to avoid errors and the margin tends to shrink (teaching sketch, not exact math).
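To turn that sketch into numbers (again assuming scikit-learn and an overlapping toy dataset), sweep C and print the margin width 2 / ||w||:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=80, centers=2, cluster_std=1.8, random_state=3)

for C in (0.01, 0.1, 1, 10, 100):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    width = 2 / np.linalg.norm(clf.coef_[0])
    print(f"C={C:<6} margin width ≈ {width:.3f}")
# The width generally shrinks as C grows, matching the sketch above.
```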

🐱🐶 Example 1: Cats vs Dogs

Features: height & weight. A linear SVM works if big dogs lie on one side of a line and small cats on the other. The closest cat+dog are the support vectors.

🌙 Example 2: Interlocking Moons

A straight line fails. A kernel SVM lifts the points into a higher dimension where a flat plane separates them; mapped back to 2D, that plane becomes a curvy boundary.
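A minimal sketch with scikit-learn's make_moons and an RBF kernel (one common way to get that curvy boundary):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)

straight = SVC(kernel="linear").fit(X, y)        # a single straight line
curvy = SVC(kernel="rbf", gamma=2.0).fit(X, y)   # kernel trick: implicit lift

print("linear SVM accuracy:", straight.score(X, y))  # noticeably worse
print("RBF SVM accuracy:", curvy.score(X, y))         # close to 1.0
```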

💰 Example 3: Predict a Price (SVR)

Fit a line surrounded by an ε-tube: errors smaller than ε are simply ignored. Points outside the tube (errors > ε) become the support vectors that pull the fit.
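A sketch of the ε-tube idea with scikit-learn's SVR on made-up price-like data: only points on or outside the tube become support vectors.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Made-up data: price grows roughly linearly with size, plus noise.
size = rng.uniform(20, 120, 60).reshape(-1, 1)
price = 3.0 * size.ravel() + 50 + rng.normal(0, 15, 60)

svr = SVR(kernel="linear", C=10.0, epsilon=20.0).fit(size, price)

inside_tube = np.abs(svr.predict(size) - price) <= 20.0
print("points inside the tube:", inside_tube.sum())
print("support vectors (on/outside the tube):", len(svr.support_))
print("predicted price for size 80:", round(svr.predict([[80]])[0], 1))
```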