Imagine you have a magic box that takes a number and gives you a result.
If the number is positive (like 5, 10, or 0.3), the box just gives it back to you.
→ Input: 5 → Output: 5
But if the number is negative (like -2 or -7), the box gives you zero instead.
→ Input: -2 → Output: 0
That’s all! 🎉
In math, it looks like this:
ReLU(x) = max(0, x)
So the ReLU function helps a neural network ignore negative signals (turns them off) while keeping positive ones (lets them pass through).
It’s simple but super powerful — kind of like a light switch that only turns on when there’s enough electricity (a positive signal)! ⚡
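If you like seeing things in code, here's a tiny sketch of that "magic box" in Python (using NumPy here just as a convenient choice, not because it's required):

```python
import numpy as np

def relu(x):
    """Give back positive inputs unchanged; turn negative inputs into 0."""
    return np.maximum(0, x)

# The same examples from above
print(relu(5))    # 5
print(relu(-2))   # 0

# It also works on a whole array of signals at once
print(relu(np.array([5, 10, 0.3, -2, -7])))  # [ 5.  10.   0.3  0.   0. ]
```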