TY - JOUR
T1 - Hardware-Friendly Higher-Order Neural Network Training Using Distributed Evolutionary Algorithms
JF - Applied Soft Computing
Y1 - 2010
A1 - M. G. Epitropakis
A1 - V. P. Plagianakos
A1 - M. N. Vrahatis
KW - Higher-Order Neural Networks
AB - In this paper, we study the class of Higher-Order Neural Networks, and especially the Pi-Sigma Networks. The performance of Pi-Sigma Networks is evaluated through several well-known Neural Network training benchmarks. In the experiments reported here, Distributed Evolutionary Algorithms are implemented for Pi-Sigma Network training. More specifically, distributed versions of the Differential Evolution and the Particle Swarm Optimization algorithms have been employed. To this end, each processor is assigned a subpopulation of potential solutions. The subpopulations are evolved independently in parallel, and occasional migration is employed to allow cooperation between them. The proposed approach is applied to train Pi-Sigma Networks using threshold activation functions. Moreover, the weights and biases were confined to a narrow band of integers, constrained in the range [-32, 32]. Thus, the trained Pi-Sigma neural networks can be represented using 6 bits. Such networks are better suited for hardware implementation than real-weight ones and, to some extent, are immune to low-amplitude noise that may contaminate the training data. Experimental results suggest that the proposed training process is fast, stable, and reliable, and that the distributed trained Pi-Sigma Networks exhibit good generalization capabilities.
VL - 10
ER -
TY - CONF
T1 - Higher-Order Neural Networks Training Using Differential Evolution
T2 - International Conference of Numerical Analysis and Applied Mathematics
Y1 - 2006
A1 - M. G. Epitropakis
A1 - V. P. Plagianakos
A1 - M. N. Vrahatis
JF - International Conference of Numerical Analysis and Applied Mathematics
PB - Wiley-VCH
CY - Hersonissos, Crete, Greece
ER -