
USING GA FOR EVOLVING WEIGHTS IN NEURAL NETWORKS

This article studies how different types of crossover operators affect the performance of a Genetic Algorithm (GA). We also examine the effects of the parameters that control the algorithm: crossover probability (pc), mutation probability (pm), population size (pop-size), and number of generations (NG). The study collects most of the known crossover operator types and applies them to the problem of evolving the weights of a neural network. The role of crossover in GAs is investigated for this problem through a comparative study of the results obtained when the parameter values (crossover probability, mutation rate, population size, and number of generations) are varied. According to the experimental results, the best parameter values for the evolving-weights XOR-NN problem are NG = 1000, pop-size = 50, pm = 0.001, and pc = 0.5, and the best operator is Line Recombination crossover.
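
The following sketch (not taken from the paper) illustrates the kind of setup the abstract describes: a minimal GA in Python that evolves the nine weights of a 2-2-1 XOR network using Line Recombination crossover and the reported parameter values (pop-size = 50, NG = 1000, pc = 0.5, pm = 0.001). The tournament selection, elitism, Gaussian mutation step, MSE-based fitness, and network layout are assumptions made for illustration only; the abstract does not specify them.

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0, 1, 1, 0], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, x):
    # Weight layout (assumed): W1 (2x2), b1 (2), W2 (2), b2 (1) -> 9 genes
    W1, b1, W2, b2 = w[0:4].reshape(2, 2), w[4:6], w[6:8], w[8]
    h = sigmoid(x @ W1 + b1)
    return sigmoid(h @ W2 + b2)

def fitness(w):
    # Assumed fitness: negative mean squared error on the XOR patterns
    return -np.mean((forward(w, X) - T) ** 2)

def tournament(pop, fit, rng, k=3):
    # Assumed selection scheme: k-way tournament
    idx = rng.choice(len(pop), k, replace=False)
    return pop[idx[np.argmax(fit[idx])]].copy()

def line_recombination(p1, p2, rng, d=0.25):
    # Line Recombination: one alpha shared by all genes, so the child
    # lies on the line through the two parents (extended by d on each side)
    a = rng.uniform(-d, 1.0 + d)
    return p1 + a * (p2 - p1)

def mutate(w, pm, rng, sigma=0.5):
    # Gaussian perturbation applied per gene with probability pm (assumed)
    mask = rng.random(w.shape) < pm
    return w + mask * rng.normal(0.0, sigma, w.shape)

def evolve(pop_size=50, ng=1000, pc=0.5, pm=0.001, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, (pop_size, 9))
    for _ in range(ng):
        fit = np.array([fitness(w) for w in pop])
        new_pop = [pop[np.argmax(fit)].copy()]  # elitism (assumed)
        while len(new_pop) < pop_size:
            p1 = tournament(pop, fit, rng)
            p2 = tournament(pop, fit, rng)
            child = line_recombination(p1, p2, rng) if rng.random() < pc else p1
            new_pop.append(mutate(child, pm, rng))
        pop = np.array(new_pop)
    best = pop[np.argmax([fitness(w) for w in pop])]
    return best, -fitness(best)

best, mse = evolve()
print("best MSE:", mse)
print("network outputs for XOR inputs:", np.round(forward(best, X), 3))

With these settings the GA typically drives the MSE close to zero, i.e. the evolved weights reproduce the XOR truth table; changing pc, pm, pop-size, or NG (as the paper's comparative study does) changes how reliably and how quickly this happens.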
Hameed, W. M., & Kanbar, A. B. (2019). Using GA for evolving weights in neural networks. Applied Computer Science, 15(3), 21-33. doi:10.23743/acs-2019-18