Current Computer Science

ISSN (Print): 2950-3779
ISSN (Online): 2950-3787

Review Article

Evolutionary Perspectives on Neural Network Generations: A Critical Examination of Models and Design Strategies

In Press (this is not the final "Version of Record"). Available online 05 April 2024
Author(s): Jabar H. Yousif* and Mohammed J. Yousif
Published on: 05 April 2024

Article ID: e050424228693

DOI: 10.2174/0129503779282967240315040931

Abstract

In the last few years, neural networks have become common across many areas owing to their ability to learn intricate patterns and provide precise predictions. Nonetheless, creating an efficient neural network model is a difficult task that demands careful consideration of multiple factors, such as architecture, optimization method, and regularization technique. This paper aims to provide a comprehensive overview of the state-of-the-art generations of artificial neural networks (ANNs) and to highlight key challenges and opportunities in machine learning applications. It offers a critical analysis of current neural network model design methodologies, focusing on the strengths and weaknesses of different approaches. It also explores the use of different deep neural networks (DNNs) in image recognition, natural language processing, and time series analysis. In addition, the paper examines the advantages of selecting optimal values for the various components of an ANN: the number of input/output layers, the number of hidden layers, the type of activation function, the number of epochs, and the model type. Setting these components to their ideal values can enhance the model's overall performance and generalization. Furthermore, the paper identifies some common pitfalls and limitations of existing design methodologies, such as overfitting, lack of interpretability, and computational complexity. Finally, it proposes directions for future research, such as developing more efficient and interpretable neural network architectures, improving the scalability of training algorithms, and exploring the potential of new paradigms such as spiking neural networks, quantum neural networks, and neuromorphic computing.
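As an illustrative sketch only (not drawn from the paper itself), the snippet below shows how the ANN components the abstract enumerates, namely the number and size of hidden layers, the activation function, and the training-epoch budget, can be varied in practice. It uses scikit-learn's MLPClassifier on a synthetic dataset; the dataset, layer sizes, activations, and epoch limit are all hypothetical choices for demonstration, not recommendations from the authors.

```python
# Minimal sketch (assumptions: scikit-learn available; synthetic data
# stands in for a real task). Sweeps the design choices named in the
# abstract: hidden-layer configuration, activation function, and epochs.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical binary-classification data for demonstration purposes.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for hidden in [(32,), (64, 32)]:       # number/size of hidden layers
    for act in ["relu", "tanh"]:       # type of activation function
        clf = MLPClassifier(
            hidden_layer_sizes=hidden,
            activation=act,
            max_iter=300,              # upper bound on training epochs
            random_state=0,
        )
        clf.fit(X_train, y_train)
        print(hidden, act, round(clf.score(X_test, y_test), 3))
```

Comparing held-out accuracy across such settings is one simple way to pick the "ideal values" the abstract refers to; in practice a systematic search (e.g., cross-validated grid or random search) with a held-out test set guards against the overfitting pitfall the paper discusses.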

