Tanh
Imagine a squishy line that takes any number — big or small — and squeezes it so it always stays between -1 and +1.
- If you give it a big positive number, it goes up close to +1.
  → Input: +10 → Output: 0.999 (almost +1)
- If you give it a big negative number, it goes down close to -1.
  → Input: -10 → Output: -0.999 (almost -1)
- If you give it 0, it gives you 0 (right in the middle).
  → Input: 0 → Output: 0
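If you like to see things run, here is a minimal Python sketch using the standard library's math.tanh to check those three examples:

```python
import math

# Tanh squeezes any input so the output always stays between -1 and +1.
for x in (10, -10, 0):
    print(f"tanh({x}) = {math.tanh(x):.9f}")

# tanh(10) = 0.999999996    (almost +1)
# tanh(-10) = -0.999999996  (almost -1)
# tanh(0) = 0.000000000     (right in the middle)
```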
Leaky ReLU
Imagine you have a magic box like the ReLU one: it passes positive numbers through and turns negative numbers into zero.
But sometimes, that can be a problem 😕 — because if everything becomes zero, the network can’t learn!
So, Leaky ReLU fixes that by letting a tiny bit of the negative numbers “leak” through instead of blocking them completely.
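To see the problem in code, here is a tiny sketch of plain ReLU in Python (a direct translation of the "magic box" above, not any particular library's version):

```python
def relu(x: float) -> float:
    """Plain ReLU: positive numbers pass through, negatives become exactly 0."""
    return max(0.0, x)

print(relu(5.0))   # 5.0
print(relu(-5.0))  # 0.0  <- the signal is completely blocked, nothing to learn from
```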
📦 How it works:
- If the number is positive, it stays the same.
  → Input: +5 → Output: +5
- If the number is negative, it becomes a small negative number instead of 0.
  → Input: -5 → Output: -0.05 (just a small leak!)
🧮 In math form:
f(x) = x          if x > 0
f(x) = 0.01 × x   if x ≤ 0
(The 0.01 is the “leakiness”: how much of the negative value gets through.)
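And the same formula as a short Python sketch (the 0.01 leak factor here matches the examples above):

```python
def leaky_relu(x: float, leak: float = 0.01) -> float:
    """Leaky ReLU: positives pass through, negatives are scaled down, not blocked."""
    return x if x > 0 else leak * x

print(leaky_relu(5.0))   # 5.0
print(leaky_relu(-5.0))  # -0.05  <- just a small leak instead of a hard zero
```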
💡 Why it’s useful:
Leaky ReLU keeps the neurons alive even when the inputs are negative — kind of like a door that doesn’t shut completely, letting a little light shine through 🌤️
So the network keeps learning instead of getting “stuck in the dark”!