
Deep Neural Networks Need Nonlinear Activation Functions

  • Writer: Arturo Devesa
  • Mar 2, 2021
  • 1 min read

Almost any process imaginable can be represented as a functional computation in a neural network, provided that the activation function is non-linear.
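To make this concrete, here is a minimal sketch of two common nonlinear activations, sigmoid and ReLU, implemented with NumPy (the function names here are illustrative, not from any particular framework):

```python
import numpy as np

def sigmoid(x):
    """Squash inputs into (0, 1) -- a classic smooth nonlinearity."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified Linear Unit: zero for negative inputs, identity otherwise."""
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values squashed into (0, 1)
print(relu(x))     # negatives clipped to 0
```

Both curves bend, which is the property that matters: no straight line can reproduce their input-output mapping.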



Non-linear functions address the problems of a linear activation function:

  1. They allow backpropagation to learn useful weights, because their derivative varies with the input, so the gradients flowing backward carry information about how each input influenced the output.

  2. They allow “stacking” of multiple layers of neurons to create a deep neural network. Multiple hidden layers of neurons are needed to learn complex data sets with high levels of accuracy.
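The second point can be verified directly: stacking linear layers without a nonlinearity collapses into a single equivalent linear layer, while inserting a ReLU between them breaks that collapse. A small NumPy sketch (layer sizes chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)

# Two stacked linear layers, no activation in between.
W1, b1 = rng.normal(size=(5, 4)), rng.normal(size=5)
W2, b2 = rng.normal(size=(3, 5)), rng.normal(size=3)
linear_stack = W2 @ (W1 @ x + b1) + b2

# The stack collapses into ONE linear layer: W = W2 W1, b = W2 b1 + b2,
# so depth alone adds no expressive power without a nonlinearity.
W, b = W2 @ W1, W2 @ b1 + b2
assert np.allclose(linear_stack, W @ x + b)

# Inserting a ReLU between the layers makes the composition nonlinear,
# which is what lets additional hidden layers model more complex data.
relu_stack = W2 @ np.maximum(0.0, W1 @ x + b1) + b2
```

No single weight matrix can reproduce `relu_stack` for all inputs, which is exactly why deep networks interleave linear layers with nonlinear activations.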


Source: https://missinglink.ai/guides/neural-network-concepts/7-types-neural-network-activation-functions-right/

 
 
 




©2020 by Arturo Devesa.
