f(x) = 1 / (1 + exp(-alpha * x)), where alpha is a "slope parameter"

The logistic sigmoid function is commonly found in neural network applications - it's smooth and nonlinear, and acts as a sort of saturating switch. Looks sorta like this:

 1            ---------
             /
            /
           /
          /
 0 ------/----------------- ( this is the x axis )


Uh, but curvier - the curve flattens toward 0 on the left and toward 1 on the right, crossing 0.5 at x = 0. Alpha > 1 makes the switch sharper; alpha < 1 makes it smushier.
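Here's a minimal sketch of that, assuming Python with NumPy (the writeup itself names no language, and the alpha values below are just illustrative):

    import numpy as np

    def logistic(x, alpha=1.0):
        # f(x) = 1 / (1 + exp(-alpha * x))
        return 1.0 / (1.0 + np.exp(-alpha * x))

    xs = np.linspace(-6.0, 6.0, 7)
    for alpha in (0.5, 1.0, 4.0):   # smushy, standard, sharp
        print(alpha, np.round(logistic(xs, alpha), 3))

Larger alpha pushes the printed values toward 0 and 1 faster as x moves away from zero; smaller alpha keeps them closer to 0.5.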

The derivative can be expressed as alpha * f(x) * (1 - f(x)): differentiating f(x) = 1 / (1 + exp(-alpha * x)) gives alpha * exp(-alpha * x) / (1 + exp(-alpha * x))^2, which is exactly alpha * f(x) * (1 - f(x)). That's handy because it's written in terms of f(x), which you're already gonna know.
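A quick numerical check of that identity - again a sketch assuming Python/NumPy, with an arbitrary test point and tolerance:

    import numpy as np

    def logistic(x, alpha=1.0):
        return 1.0 / (1.0 + np.exp(-alpha * x))

    def logistic_deriv(x, alpha=1.0):
        fx = logistic(x, alpha)
        return alpha * fx * (1.0 - fx)   # derivative written in terms of f(x)

    # compare against a centered finite difference
    x, alpha, h = 0.7, 2.0, 1e-6
    numeric = (logistic(x + h, alpha) - logistic(x - h, alpha)) / (2.0 * h)
    print(abs(numeric - logistic_deriv(x, alpha)) < 1e-8)   # should print True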
