The NAS Research Committee invites you to the fourth talk in the NAS Seminar Series 2023, which will take place on Thursday, 31 August 2023.
Event Details:
Title: Investigating the Effect of Activation Functions and Sample Sizes on the Performance of Artificial Neural Network Models Through Simulation
Speaker: Dr Alexander Boateng, University of the Free State
Date: Thursday, 31 August 2023
Time: 15:30 – 16:30
Venue: Microsoft Teams, link: Click here to join the meeting
Talk Summary:
Artificial Neural Networks (ANNs) have gained immense recognition as a powerful tool for a wide range of applications, including image recognition, natural language processing, and predictive modelling. Their capacity to decipher intricate patterns and relationships within data makes them highly sought after. However, the effectiveness of ANNs depends heavily on the choice of activation function and the size of the training dataset. This study therefore investigated the impact of activation functions and sample sizes on the performance of artificial neural network models through simulation. The Tanh, Logistic, ReLU, and Identity activation functions were compared in terms of accuracy and training time across a range of sample sizes. Notably, the Tanh activation function consistently achieved the highest accuracy, ranging from 50% to 97%, surpassing the other activation functions. However, statistical analysis using ANOVA revealed no significant differences among the mean accuracies of the activation functions, suggesting that the choice of activation function may not have a substantial impact on overall accuracy. Interestingly, the Identity activation function trained markedly faster, averaging 0.27 seconds, compared with the Tanh activation function's 1.66 seconds; here, too, ANOVA indicated no significant differences among the mean training times, implying that the choice of activation function does not significantly affect training time. These findings offer insight into the influence of activation functions and sample sizes on ANN performance and provide guidance for selecting appropriate activation functions and sample sizes in different applications. Future research endeavours may explore the impact of other parameters on neural network performance and investigate the optimization of neural network architectures tailored to specific applications.
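For attendees curious about how such a simulation might be structured, a minimal sketch is given below. The four activation functions named in the abstract happen to match the options of scikit-learn's MLPClassifier ('tanh', 'logistic', 'relu', 'identity'), so the sketch uses that class together with a one-way ANOVA from scipy.stats. This is an illustration of the general approach only, not the speaker's actual code: the synthetic dataset, sample sizes, and network settings are all assumptions.

```python
# Illustrative sketch: compare activation functions across sample sizes,
# recording test accuracy and training time, then run one-way ANOVAs.
# All dataset and model parameters here are assumptions, not from the talk.
import time

from scipy.stats import f_oneway
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

activations = ["tanh", "logistic", "relu", "identity"]
sample_sizes = [100, 500, 1000, 5000]  # illustrative sample sizes

accuracy = {a: [] for a in activations}
train_time = {a: [] for a in activations}

for n in sample_sizes:
    # Synthetic classification data; the study's actual data source is not stated.
    X, y = make_classification(n_samples=n, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    for act in activations:
        model = MLPClassifier(activation=act, max_iter=500, random_state=0)
        start = time.perf_counter()
        model.fit(X_tr, y_tr)
        train_time[act].append(time.perf_counter() - start)
        accuracy[act].append(model.score(X_te, y_te))

# One-way ANOVA across activation functions, mirroring the tests in the abstract.
f_acc, p_acc = f_oneway(*(accuracy[a] for a in activations))
f_time, p_time = f_oneway(*(train_time[a] for a in activations))
print(f"Accuracy ANOVA:      F={f_acc:.3f}, p={p_acc:.3f}")
print(f"Training-time ANOVA: F={f_time:.3f}, p={p_time:.3f}")
```

A non-significant p-value in either ANOVA would correspond to the abstract's finding that the mean accuracies (and mean training times) do not differ significantly across activation functions.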