Which activation function is most commonly used in the hidden layers of deep learning models?

A) Sigmoid
B) Rectified Linear Unit (ReLU)
C) Softmax
D) Hyperbolic tangent (tanh)

Answer: B) Rectified Linear Unit (ReLU). ReLU is the default choice for hidden layers because it is cheap to compute and, unlike the saturating sigmoid and tanh functions, it does not squash large inputs, which mitigates the vanishing-gradient problem in deep networks. Softmax is typically reserved for the output layer of multi-class classifiers.
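As a minimal illustrative sketch (using NumPy, not any specific deep learning framework), ReLU can be written in one line; the comparison values for sigmoid show how that function saturates for large inputs while ReLU stays linear on the positive side:

```python
import numpy as np

def relu(x):
    # ReLU passes positive values through unchanged and zeroes out
    # negatives, keeping gradients alive on the positive side.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid squashes all inputs into (0, 1); for large |x| its
    # gradient approaches zero, which can stall deep networks.
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-3.0, -0.5, 0.0, 2.0, 10.0])
print(relu(x))     # negatives -> 0, positives unchanged
print(sigmoid(x))  # all values squashed into (0, 1)
```

Note how `sigmoid(10.0)` is already nearly 1, so its derivative there is close to zero, whereas `relu` keeps a constant slope of 1 for any positive input.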
