Jul 22, 2020
Upgrade for ReLU: The smooth, periodic sin(x) activation function is an alternative to ReLU.
The ReLU activation function builds complex nonlinear functions across the layers of a neural network. But because ReLU itself is piecewise linear, the functions it builds are also piecewise linear, tracing out flat faces and sharp edges. How much of the world breaks down into perfect polyhedra?
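The contrast is easy to see in code. Below is a minimal sketch (assuming PyTorch; the layer sizes, target signal, and training loop are illustrative, not any paper's setup): the same two-hidden-layer MLP is fit to a smooth 1-D curve twice, once with ReLU and once with sin as the elementwise activation. The ReLU network can only produce a piecewise-linear approximation, while the sin network's output is smooth everywhere.

```python
# Minimal sketch (illustrative): fit a smooth 1-D signal with the same MLP
# using two different elementwise activations, ReLU vs. sin.
import torch
import torch.nn as nn


def make_mlp(activation):
    # Two hidden layers; `activation` is any elementwise callable.
    class MLP(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc1 = nn.Linear(1, 64)
            self.fc2 = nn.Linear(64, 64)
            self.fc3 = nn.Linear(64, 1)

        def forward(self, x):
            x = activation(self.fc1(x))
            x = activation(self.fc2(x))
            return self.fc3(x)

    return MLP()


torch.manual_seed(0)
x = torch.linspace(-3, 3, 256).unsqueeze(1)
y = torch.sin(2 * x) * torch.exp(-0.1 * x ** 2)  # a smooth target curve

for name, act in [("relu", torch.relu), ("sin", torch.sin)]:
    model = make_mlp(act)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        loss = ((model(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
    # The ReLU model's fit is piecewise linear; the sin model's is smooth.
    print(f"{name}: final MSE = {loss.item():.6f}")
```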