Spread Factor Hyper-Parameter Tuning on Radial Basis Function and Generalized Regression Neural Networks of Probabilistic Neural Network Model for Optimal Prediction of Signal Power Loss in Micro-Cell Built-Up Urban Area

Virginia C. Ebhota, Thokozani Shongwe

Research output: Contribution to journal › Article › peer-review

Abstract

This work proposes and applies two artificial neural network models to address the problems associated with conventional models in the prediction of signal power loss. The Generalized Regression Neural Network (GRNN) and the Radial Basis Function (RBF) neural network were employed to improve the performance and accuracy of signal power loss prediction, and the predictive abilities of the two models were compared under tuning of the spread factor hyperparameter. The results demonstrate the importance of correctly tuning the spread factor for excellent performance of both models. For optimal training of the RBF neural network, the spread factor must be large enough for the hidden-layer neurons to respond to the overlapping regions of the input space, but not so large that the neurons all begin to respond in essentially the same manner. Training with the GRNN, by contrast, shows prediction performance increasing as the spread factor grows. The GRNN can approximate virtually any function given a large dataset and requires only a small fraction of the dataset for training to achieve the desired result. When the spread factor is made large, the estimated density is forced to be smooth and approaches a multivariate Gaussian with full covariance, and the network tends to respond with the target vector associated with the nearest input vector in the design set. A small spread factor in the GRNN produces a very steep radial basis function, so that the neuron whose weight vector is closest to the input yields a much larger output than the other neurons; as the spread increases, the slope of the radial basis function becomes smoother and many neurons respond to the input vector.
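The spread-factor behaviour described in the abstract can be illustrated with a minimal GRNN sketch (not the authors' implementation; the function name, the one-dimensional distance-vs-path-loss data, and the specific spread values are illustrative assumptions). Each training sample acts as a Gaussian kernel centre, and the spread controls how sharply each kernel responds: a small spread makes the network echo the target of the nearest training input, while a large spread smooths the response toward a blend of many targets.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, spread):
    """Minimal GRNN (Nadaraya-Watson) sketch: kernel-weighted average
    of training targets, with Gaussian width set by `spread`."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    preds = []
    for x in np.asarray(X_query, dtype=float):
        d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances to centres
        w = np.exp(-d2 / (2.0 * spread ** 2))          # Gaussian kernel weights
        preds.append(np.sum(w * y_train) / np.sum(w))  # weighted average of targets
    return np.array(preds)

# Hypothetical data: transmitter-receiver distance (m) vs. path loss (dB).
X = [[100.0], [200.0], [300.0]]
y = [-60.0, -80.0, -100.0]

small = grnn_predict(X, y, [[100.0]], spread=10.0)     # steep kernel: nearest target dominates
large = grnn_predict(X, y, [[100.0]], spread=10000.0)  # flat kernel: all neurons respond alike
```

With the small spread, the prediction at 100 m is essentially the target of that training point (-60 dB); with the very large spread, all three kernels respond almost equally and the prediction collapses toward the mean of the targets (about -80 dB), which is the "neurons begin to respond in the same manner" regime the abstract warns about for the RBF network.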

Original language: English
Pages (from-to): 358-372
Number of pages: 15
Journal: International Review on Modelling and Simulations
Volume: 17
Issue number: 5
DOIs
Publication status: Published - 2024

Keywords

  • Generalized Regression Neural Network
  • Hyper-Parameter Tuning
  • Loss Function
  • Radial Basis Neural Network
  • Signal Power Loss Prediction
  • Spread Factor Tuning

ASJC Scopus subject areas

  • Modeling and Simulation
  • General Chemical Engineering
  • Mechanical Engineering
  • Logic
  • Discrete Mathematics and Combinatorics
  • Electrical and Electronic Engineering
  • Applied Mathematics
