From course:

Intro to AI 2

Question:

Activation functions

Author: Christian N



Answer:

(a) is a step function (threshold function).
(b) is a rectified linear function: ReLU(x) = max(0, x).
The smooth (everywhere-differentiable) version of ReLU is called softplus: softplus(x) = log(1 + e^x).
Changing the bias weight w0,i moves the threshold location.
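
A minimal sketch of the three activations named in the answer, written in Python with NumPy (the function names and the NumPy dependency are assumptions for illustration, not part of the original card):

import numpy as np

def step(x, threshold=0.0):
    # Step / threshold function: 1 where the input reaches the threshold, else 0.
    # Shifting the threshold corresponds to changing the bias weight w0,i.
    return np.where(x >= threshold, 1.0, 0.0)

def relu(x):
    # Rectified linear unit: max(0, x)
    return np.maximum(0.0, x)

def softplus(x):
    # Smooth, everywhere-differentiable version of ReLU: log(1 + e^x)
    return np.log1p(np.exp(x))

x = np.linspace(-3, 3, 7)
print(step(x))      # [0. 0. 0. 1. 1. 1. 1.]
print(relu(x))      # [0. 0. 0. 0. 1. 2. 3.]
print(softplus(x))  # smooth curve that approaches relu(x) for large |x|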

