According to the UAT, what must the activation function be to grant a network universal approximation power?

Answer

Non-polynomial

Per the Leshno et al. (1993) version of the Universal Approximation Theorem, the hidden-layer activation function must be non-polynomial (e.g., sigmoid, tanh, or ReLU). The condition is both necessary and sufficient: if the activation is a polynomial of degree d, every output of a single-hidden-layer network is itself a polynomial of degree at most d, so it cannot approximate arbitrary continuous functions; any non-polynomial activation supplies enough non-linearity for the network to approximate any continuous function on a compact set to arbitrary accuracy, given enough hidden units.
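A minimal numerical sketch of this contrast (an illustration, not a proof: it fits only the output layer by least squares over random hidden features, rather than training the whole network by gradient descent):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 400)[:, None]  # inputs on a compact interval
y = np.sin(x).ravel()                     # target function to approximate

# Random hidden layer: 200 units with random weights and biases.
W = rng.normal(size=(1, 200))
b = rng.normal(size=200)

# Non-polynomial activation (ReLU): fit output weights by least squares.
H_relu = np.maximum(x @ W + b, 0.0)
c_relu, *_ = np.linalg.lstsq(H_relu, y, rcond=None)
err_relu = np.max(np.abs(H_relu @ c_relu - y))

# Polynomial activation (square): every unit is a quadratic in x, so the
# network can only represent quadratics, no matter how many units it has.
H_poly = (x @ W + b) ** 2
c_poly, *_ = np.linalg.lstsq(H_poly, y, rcond=None)
err_poly = np.max(np.abs(H_poly @ c_poly - y))

print(err_relu < err_poly)  # ReLU features fit sin(x) far better
```

With 200 ReLU units the maximum error on sin(x) is tiny, while the squared activation stalls at the best quadratic fit, matching the theorem's dichotomy.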
# Videos

Why Neural Networks Can Learn Any Function - YouTube

Tags: function, algorithm, neural network, approximation