PReLU: Yet Another Single-Layer Solution to the XOR Problem
arXiv:2409.10821
Published on Sep 17, 2024
Abstract
AI-generated summary: A single-layer neural network with PReLU activation can solve the XOR problem, achieving a higher success rate and efficiency than MLP and GCU-based alternatives.
This paper demonstrates that a single-layer neural network using Parametric Rectified Linear Unit (PReLU) activation can solve the XOR problem, a simple fact that has been overlooked so far. We compare this solution to the multi-layer perceptron (MLP) and the Growing Cosine Unit (GCU) activation function and explain why PReLU enables this capability. Our results show that the single-layer PReLU network can achieve 100% success rate in a wider range of learning rates while using only three learnable parameters.
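Not from the paper itself, but a rough sketch of the kind of setup the abstract describes: a single linear unit feeding a PReLU with a learnable slope, trained on XOR in PyTorch. It assumes the three learnable parameters are the two input weights and the PReLU negative-slope coefficient (no bias term); under that assumption one exact solution is weights (1, -1) with slope -1, since the unit then outputs |x1 - x2|, which equals XOR on binary inputs.

import torch
import torch.nn as nn

# XOR inputs and targets
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

# Single-layer network: one linear unit without bias followed by a PReLU,
# so the learnable parameters are the two weights and the PReLU slope (three total).
model = nn.Sequential(
    nn.Linear(2, 1, bias=False),
    nn.PReLU(num_parameters=1, init=0.25),
)

optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# On a successful run, the outputs round to [0., 1., 1., 0.]
print(model(X).detach().round().squeeze())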