Sunday, 9 November 2025

Deep Learning 03: ReLU (Rectified Linear Unit)

 Imagine you have a magic box that takes a number and gives you a result.

  • If the number is positive (like 5, 10, or 0.3), the box just gives it back to you.
    → Input: 5 → Output: 5

  • But if the number is negative (like -2 or -7), the box gives you zero instead.
    → Input: -2 → Output: 0

That’s all! 🎉
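Here is what that box looks like as a tiny Python function (a minimal sketch; the name relu and the sample numbers are just for illustration):

    def relu(x):
        # Positive numbers pass straight through; negatives become 0.
        return max(0, x)

    print(relu(5))    # 5
    print(relu(0.3))  # 0.3
    print(relu(-2))   # 0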


In math, it looks like this:

ReLU(x) = max(0, x)

So the ReLU function helps a neural network ignore negative signals (turns them off) while keeping positive ones (lets them pass through).
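In a real network, ReLU is applied to many numbers at once. Here is a quick NumPy sketch (assuming NumPy is available; the input values are made up for illustration):

    import numpy as np

    signals = np.array([-7.0, -2.0, 0.0, 0.3, 5.0])
    activated = np.maximum(0.0, signals)  # element-wise max(0, x)
    print(activated)  # [0.  0.  0.  0.3 5. ]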

It’s simple but super powerful — kind of like a light switch that only turns on when there’s enough electricity (a positive signal)! ⚡
