An Adaptive Sigmoidal Activation Function for Training Feed Forward Neural Network Equalizer

Authors

  • Zohra Zerdoumi
  • Fadila Benmeddour
  • Latifa Abdou
  • Djamel Benatia

DOI:

https://doi.org/10.55549/epstem.1050144

Keywords:

Nonlinear equalization, Feed-forward neural networks (FFNN), Digital communication channels, Adaptive sigmoidal activation function

Abstract

Feed-forward neural networks (FFNN) have attracted great attention in the digital communication area. In particular, they are investigated as nonlinear equalizers at the receiver to mitigate channel distortions and additive noise. The major drawback of the FFNN is its extensive training. We present a new approach that enhances training efficiency by adapting the activation function. Adapting the activation function considerably increases the flexibility and the nonlinear approximation capability of the FFNN. Consequently, the learning process achieves better performance, offers more flexibility, and keeps the final state away from undesired saturation regions. The effectiveness of the proposed method is demonstrated on several challenging channel models; it performs well even for severe nonlinear channels that are hard to equalize. Performance is measured through convergence properties and the minimum bit error rate achieved. The proposed algorithm is found to converge rapidly and to reach the minimum steady-state value. All simulations show that the proposed method significantly improves the training efficiency of the FFNN-based equalizer compared with standard training.
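As a rough illustration of the idea described in the abstract, the sketch below trains a small feed-forward equalizer whose hidden-layer sigmoid has a trainable slope parameter updated by gradient descent jointly with the weights. The channel model, network size, learning rate, decision delay, and the particular sigmoid form are assumptions made for this example only; they are not taken from the paper and do not reproduce the authors' exact algorithm.

```python
import numpy as np

# Toy demonstration of an adaptive-slope sigmoid in a small FFNN equalizer.
# All parameter choices below are illustrative assumptions.
rng = np.random.default_rng(0)

# BPSK symbols through a simple dispersive channel plus additive noise.
N = 20000
s = rng.choice([-1.0, 1.0], size=N)
x = 0.5 * s + np.concatenate(([0.0], s[:-1])) + 0.1 * rng.standard_normal(N)

order, delay = 4, 1                      # equalizer input taps and decision delay
idx = np.arange(order - 1, N)
X = np.stack([x[i - order + 1:i + 1][::-1] for i in idx])   # sliding input windows
d = s[idx - delay]                                          # desired (delayed) symbols

hidden = 8
W1 = 0.1 * rng.standard_normal((hidden, order))
b1 = np.zeros(hidden)
a  = np.ones(hidden)                     # adaptive slope, one per hidden neuron
w2 = 0.1 * rng.standard_normal(hidden)
b2 = 0.0
lr = 0.05

def act(z, a):
    """Adaptive sigmoid mapped to (-1, 1): f(z) = 2/(1 + exp(-a*z)) - 1."""
    return 2.0 / (1.0 + np.exp(-a * z)) - 1.0

for xi, di in zip(X, d):                 # online (sample-by-sample) training
    z = W1 @ xi + b1
    h = act(z, a)
    y = w2 @ h + b2
    e = di - y                           # output error

    dfdz = 0.5 * a * (1.0 - h ** 2)      # derivative of f w.r.t. its input
    dfda = 0.5 * z * (1.0 - h ** 2)      # derivative of f w.r.t. the slope a

    delta = e * w2                       # error back-propagated to the hidden layer
    w2 += lr * e * h
    b2 += lr * e
    W1 += lr * np.outer(delta * dfdz, xi)
    b1 += lr * delta * dfdz
    a  += lr * delta * dfda              # slope adapted jointly with the weights

# Hard-decision bit error rate on the training sequence (illustrative only).
y_all = act(X @ W1.T + b1, a) @ w2 + b2
ber = np.mean(np.sign(y_all) != d)
print(f"BER after adaptation: {ber:.4f}")
```

In this sketch the slope parameter simply receives its own gradient-descent update alongside the weights, which is one common way to make a sigmoidal activation adaptive; the paper's specific update rule may differ.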


Published

2021-12-31

Issue

Section

Articles

How to Cite

An Adaptive Sigmoidal Activation Function for Training Feed Forward Neural Network Equalizer. (2021). The Eurasia Proceedings of Science, Technology, Engineering and Mathematics, 14, 1-7. https://doi.org/10.55549/epstem.1050144