An activation function is a mathematical function applied in artificial neural networks to compute a neuron's output from its weighted inputs. It introduces non-linearity into the network, allowing it to learn functions that a purely linear model cannot represent.
For example, one of the most widely used activation functions is the Rectified Linear Unit (ReLU), defined as ReLU(x) = max(0, x): it returns 0 for any negative input and passes positive inputs through unchanged. Stacking layers of such non-linear units lets the network model complex decision boundaries, such as those needed to distinguish between different classes of data.
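The ReLU behavior described above can be sketched in a few lines. This is a minimal illustration, assuming NumPy is available; the function name `relu` is chosen here for clarity and is not from the original text.

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): negative inputs map to 0,
    # positive inputs pass through unchanged.
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative entries become 0.0; positive entries are kept
```

Because the gradient of ReLU is 1 for positive inputs, it avoids the vanishing-gradient problem that affects saturating activations such as the sigmoid, which is one reason for its popularity in deep networks.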