How to choose suitable activation functions for deep learning?

Hi Team,
I'd like to know how to select suitable activation functions for a model.
Also, please share a cheat sheet, articles, or videos if possible that will guide us.
@PrajwalPrashanth

Thank you for the awesome lecture series @aakashns Sir!!!


@nikhildarade

I haven’t seen a cheat sheet for this. Here are my rules of thumb.

  • Use ReLU for hidden-layer activations. It’s the most popular and widely used across all kinds of applications, and you won’t go far wrong with it. There are variations like SELU that you can try if you want. There’s also a newer one called Swish, which has improved performance for some models but is slightly slower.
  • Use Softmax when you want to pick exactly one class out of many (ideal for the last layer in multi-class classification).
  • Use Sigmoid when a few out of many can apply at once (ideal for the last layer in multi-label classification).
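To make the three choices above concrete, here is a minimal NumPy sketch of each activation (function names are mine for illustration; in practice you’d use your framework’s built-ins, e.g. `torch.relu`, `torch.sigmoid`, `torch.softmax`):

```python
import numpy as np

def relu(x):
    # Hidden-layer default: zero out negatives, pass positives through
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes each value independently into (0, 1) -> one probability
    # per label, so several labels can be "on" at once (multi-label)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Converts a vector of scores into probabilities that sum to 1,
    # so exactly one class "wins" (multi-class)
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

logits = np.array([2.0, -1.0, 0.5])
print(relu(logits))     # negatives become 0
print(sigmoid(logits))  # each entry in (0, 1), independent
print(softmax(logits))  # entries sum to 1
```

Note the key contrast for the last layer: sigmoid outputs are independent per class, while softmax outputs compete and must sum to 1.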

Start with these, keep learning as new ones come along, and experiment.
Courtesy: @PrajwalPrashanth
